The Gaussian Channel with Colored Noise

Beyond White Noise

The white noise assumption — that noise samples are i.i.d. — is a convenient idealization. In practice, noise is often colored: correlated across time or frequency. Interference from neighboring cells, quantization noise in ADCs, and feedback from automatic gain control all produce correlated disturbances. The key insight of this section is that the Karhunen-Loève (KL) expansion diagonalizes the noise covariance, converting the colored-noise channel into a set of parallel Gaussian sub-channels with unequal noise variances — and water-filling once again provides the optimal power allocation.

Definition:

Gaussian Channel with Colored Noise

Consider the vector channel

$$\mathbf{Y} = \mathbf{X} + \mathbf{Z}, \quad \mathbf{Z} \sim \mathcal{N}(\mathbf{0}, \boldsymbol{\Sigma}_{Z}),$$

where $\boldsymbol{\Sigma}_{Z}$ is a positive definite noise covariance matrix and the power constraint is $\frac{1}{n}\mathbb{E}[\|\mathbf{X}\|^2] \leq P$.

Since $\boldsymbol{\Sigma}_{Z}$ is positive definite, it admits an eigendecomposition $\boldsymbol{\Sigma}_{Z} = \mathbf{U} \boldsymbol{\Lambda} \mathbf{U}^T$, where $\boldsymbol{\Lambda} = \text{diag}(\lambda_1, \ldots, \lambda_n)$. In the rotated coordinate system $\tilde{\mathbf{X}} = \mathbf{U}^T\mathbf{X}$, $\tilde{\mathbf{Y}} = \mathbf{U}^T\mathbf{Y}$, the channel decomposes into $n$ independent sub-channels with noise variances $\lambda_k$.
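
This decomposition is easy to check numerically. The sketch below (an illustration with an assumed $2 \times 2$ covariance, not a prescribed implementation) draws correlated Gaussian noise, rotates it by $\mathbf{U}^T$, and confirms that the empirical covariance of the rotated samples is approximately $\text{diag}(\lambda_1, \lambda_2)$:

```python
import numpy as np

rng = np.random.default_rng(0)
Sigma_Z = np.array([[2.0, 1.0], [1.0, 2.0]])   # assumed example covariance
lam, U = np.linalg.eigh(Sigma_Z)               # Sigma_Z = U diag(lam) U^T

# Draw correlated noise and rotate into the eigenbasis: Z_tilde = U^T Z.
Z = rng.multivariate_normal(np.zeros(2), Sigma_Z, size=200_000)
Z_tilde = Z @ U                                # each row is U^T z
cov = np.cov(Z_tilde.T)                        # empirical covariance of rotated noise

print(lam)              # eigenvalues: [1. 3.]
print(cov.round(2))     # approximately diag(1, 3): off-diagonals near zero
```

The off-diagonal entries of `cov` vanish up to sampling error, which is exactly the statement that the rotated sub-channels have independent noise.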

Theorem: Capacity with Colored Gaussian Noise

The capacity of the Gaussian channel with colored noise $\mathbf{Z} \sim \mathcal{N}(\mathbf{0}, \boldsymbol{\Sigma}_{Z})$ and per-symbol power constraint $P$ is

$$C = \frac{1}{n}\sum_{k=1}^n \frac{1}{2}\log\!\left(1 + \frac{P_k^*}{\lambda_k}\right),$$

where $\lambda_1, \ldots, \lambda_n$ are the eigenvalues of $\boldsymbol{\Sigma}_{Z}$ and the power allocation is the water-filling solution:

$$P_k^* = \left[\nu - \lambda_k\right]_+,$$

with $\nu$ chosen so that $\frac{1}{n}\sum_k P_k^* = P$.

Colored noise has "easy directions" (eigenvectors with small eigenvalues) and "hard directions" (large eigenvalues). Water-filling exploits this structure by sending more power along the easy directions. This is the same principle as MIMO precoding: align the transmitted signal with the favorable directions of the channel.

Example: Water-Filling with Colored Noise

A 2-dimensional channel has noise covariance $\boldsymbol{\Sigma}_{Z} = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ and power constraint $P = 3$ per dimension. Find the capacity.
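
A numerical sketch of this example (the bisection routine and its name are ours): the eigenvalues of $\boldsymbol{\Sigma}_{Z}$ are $\lambda = 1, 3$, the total budget is $nP = 6$, so the water level is $\nu = 5$, giving $P^* = (4, 2)$ and $C = \frac{1}{4}\log_2\frac{25}{3} \approx 0.765$ bits per dimension.

```python
import numpy as np

def water_filling(noise_eigs, total_power, iters=100):
    """Water-filling powers P_k = [nu - lambda_k]_+ with sum P_k = total_power."""
    lam = np.asarray(noise_eigs, dtype=float)
    lo, hi = lam.min(), lam.max() + total_power   # bracket the water level
    for _ in range(iters):                        # bisection on nu
        nu = 0.5 * (lo + hi)
        if np.maximum(nu - lam, 0.0).sum() > total_power:
            hi = nu
        else:
            lo = nu
    return np.maximum(nu - lam, 0.0)

Sigma_Z = np.array([[2.0, 1.0], [1.0, 2.0]])
lam = np.linalg.eigvalsh(Sigma_Z)                 # eigenvalues [1, 3]
P_k = water_filling(lam, total_power=2 * 3.0)     # water level nu = 5
C = np.mean(0.5 * np.log2(1.0 + P_k / lam))       # bits per dimension
print(P_k, C)                                     # [4. 2.]  ~0.765
```

Bisection works because the amount of "water" $\sum_k [\nu - \lambda_k]_+$ is continuous and nondecreasing in $\nu$, so the budget constraint pins down a unique level.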

The Karhunen-Loève Perspective

For stationary noise processes in the continuous-time setting, the Karhunen-Loève expansion plays the role of the eigendecomposition. A stationary Gaussian process with power spectral density $S_Z(f)$ is diagonalized in the frequency domain by the Fourier transform. The "eigenvalues" become the spectral density $S_Z(f)$, and water-filling over the noise spectrum determines the optimal transmitted power spectral density.

This connects the algebraic (matrix eigendecomposition) and analytic (spectral decomposition) views of the same principle: always transmit along the eigenmodes of the channel, allocating power by water-filling over the eigenvalues.
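
The same computation carries over to the spectral picture by discretizing the frequency axis. A minimal sketch, assuming an illustrative bowl-shaped PSD $S_Z(f) = 1 + 0.8\cos(2\pi f)$ on $f \in [-\tfrac{1}{2}, \tfrac{1}{2})$ and a unit average power budget (both choices ours, not from the text):

```python
import numpy as np

# Discretize a colored-noise spectrum and water-fill over frequency.
f = np.linspace(-0.5, 0.5, 512, endpoint=False)
S_Z = 1.0 + 0.8 * np.cos(2 * np.pi * f)        # noise PSD, bowl-shaped
P_total = 1.0                                   # average transmit power

lo, hi = S_Z.min(), S_Z.max() + P_total
for _ in range(100):                            # bisection on the water level nu
    nu = 0.5 * (lo + hi)
    if np.maximum(nu - S_Z, 0.0).mean() > P_total:
        hi = nu
    else:
        lo = nu

S_X = np.maximum(nu - S_Z, 0.0)                 # optimal transmit PSD
C = np.mean(0.5 * np.log2(1.0 + S_X / S_Z))     # bits per dimension
```

With this budget the water level comes out above the spectral peak ($\nu = 2$), so every frequency receives some power; shrinking `P_total` drains the noisiest band first.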

Water-Filling over Colored Noise Spectrum

Visualize water-filling over a continuous noise power spectral density. The bowl shape is $S_Z(f)$ (the noise spectrum), and water is poured to level $\nu$. Adjust the total power and the noise spectral shape to explore the allocation.


Common Mistake: Do Not Whiten Then Ignore the Transformation

Mistake:

Whitening the noise by multiplying by $\boldsymbol{\Sigma}_{Z}^{-1/2}$ but forgetting that this also transforms the signal and changes the effective power constraint.

Correction:

The whitening filter $\boldsymbol{\Sigma}_{Z}^{-1/2}$ converts $\mathbf{Y} = \mathbf{X} + \mathbf{Z}$ to $\tilde{\mathbf{Y}} = \boldsymbol{\Sigma}_{Z}^{-1/2}\mathbf{X} + \tilde{\mathbf{Z}}$ where $\tilde{\mathbf{Z}} \sim \mathcal{N}(\mathbf{0}, \mathbf{I})$. But the effective channel is now $\boldsymbol{\Sigma}_{Z}^{-1/2}$ (not identity), and the power constraint becomes $\mathbb{E}[\|\boldsymbol{\Sigma}_{Z}^{-1/2}\mathbf{X}\|^2]$, which differs from $\mathbb{E}[\|\mathbf{X}\|^2]$ when $\boldsymbol{\Sigma}_{Z} \neq \sigma^2\mathbf{I}$. The correct approach is the eigendecomposition method.
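
A quick numerical illustration of the pitfall (the $2 \times 2$ covariance is an assumed example): whitening does make the noise white, but it does not preserve the signal's norm, so a budget on $\mathbb{E}[\|\mathbf{X}\|^2]$ is not the same budget on $\boldsymbol{\Sigma}_{Z}^{-1/2}\mathbf{X}$.

```python
import numpy as np

Sigma_Z = np.array([[2.0, 1.0], [1.0, 2.0]])   # assumed example covariance
lam, U = np.linalg.eigh(Sigma_Z)
W = U @ np.diag(lam ** -0.5) @ U.T             # whitening filter Sigma_Z^{-1/2}

# The noise really is whitened: W Sigma_Z W^T = I.
print(np.allclose(W @ Sigma_Z @ W.T, np.eye(2)))   # True

# But signal power is not preserved: ||W x||^2 != ||x||^2 in general.
x = np.array([1.0, 0.0])
print(x @ x)                # 1.0
print((W @ x) @ (W @ x))    # ~0.667 = x^T Sigma_Z^{-1} x
```

The whitened power $\|\boldsymbol{\Sigma}_{Z}^{-1/2}\mathbf{x}\|^2 = \mathbf{x}^T \boldsymbol{\Sigma}_{Z}^{-1} \mathbf{x}$ depends on the direction of $\mathbf{x}$, which is precisely why the constraint cannot simply be carried over.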

Quick Check

For a channel with colored noise, when does equal power allocation achieve capacity?

When all noise eigenvalues are equal (i.e., the noise is actually white)

When the total power exceeds a certain threshold

Never — water-filling is always strictly better

When the noise covariance matrix is diagonal