Exercises
ex-ch08-01
Easy: Let $\mathbf{X} = (X_1, X_2)^\top \sim \mathcal{N}(\boldsymbol{\mu}, \boldsymbol{\Sigma})$ with a given $2 \times 2$ covariance matrix $\boldsymbol{\Sigma}$. Compute the correlation coefficient between $X_1$ and $X_2$.
The correlation coefficient depends only on the entries of $\boldsymbol{\Sigma}$, not on $\boldsymbol{\mu}$.
Apply the formula
$\rho_{12} = \dfrac{\Sigma_{12}}{\sqrt{\Sigma_{11}\,\Sigma_{22}}}$.
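The formula is easy to check numerically; the covariance matrix below is a hypothetical stand-in for the exercise's data.

```python
import numpy as np

# Hypothetical 2x2 covariance (the exercise's original matrix is not shown here)
Sigma = np.array([[4.0, 1.2],
                  [1.2, 1.0]])

# rho_12 = Sigma_12 / sqrt(Sigma_11 * Sigma_22)
rho = Sigma[0, 1] / np.sqrt(Sigma[0, 0] * Sigma[1, 1])
print(rho)  # 0.6
```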
ex-ch08-02
Easy: If $\mathbf{X} \sim \mathcal{N}(\boldsymbol{\mu}, \boldsymbol{\Sigma})$, what is the distribution of $\mathbf{Y} = 3\mathbf{X} + \mathbf{1}$ (where $\mathbf{1}$ is the all-ones vector)?
Use the affine transformation theorem.
Apply $\mathbf{A} = 3\mathbf{I}$, $\mathbf{b} = \mathbf{1}$
$\mathbf{Y} \sim \mathcal{N}(3\boldsymbol{\mu} + \mathbf{1},\; 9\boldsymbol{\Sigma})$.
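A quick numeric sketch of the affine rule $\mathbf{Y} = \mathbf{A}\mathbf{X} + \mathbf{b} \Rightarrow \mathbf{Y} \sim \mathcal{N}(\mathbf{A}\boldsymbol{\mu} + \mathbf{b}, \mathbf{A}\boldsymbol{\Sigma}\mathbf{A}^\top)$, with hypothetical $\boldsymbol{\mu}$ and $\boldsymbol{\Sigma}$:

```python
import numpy as np

# Hypothetical parameters; Y = 3X + 1 has mean 3*mu + 1 and covariance 9*Sigma
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
A = 3.0 * np.eye(2)
b = np.ones(2)

mu_Y = A @ mu + b          # affine transformation of the mean: 3*mu + 1
Sigma_Y = A @ Sigma @ A.T  # A Sigma A^T = 9 Sigma
```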
ex-ch08-03
Easy: Let $\mathbf{X} \sim \mathcal{N}(\boldsymbol{\mu}, \boldsymbol{\Sigma})$ and let $\mathbf{a}$ be a fixed vector. Find the distribution of $Y = \mathbf{a}^\top \mathbf{X}$.
Write $Y = \mathbf{a}^\top \mathbf{X}$ with $\mathbb{E}[Y] = \mathbf{a}^\top \boldsymbol{\mu}$.
Compute the variance
$\operatorname{Var}(Y) = \mathbf{a}^\top \boldsymbol{\Sigma}\, \mathbf{a}$, so $Y \sim \mathcal{N}(\mathbf{a}^\top \boldsymbol{\mu},\; \mathbf{a}^\top \boldsymbol{\Sigma}\, \mathbf{a})$.
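A seeded Monte Carlo sanity check that $\mathbf{a}^\top\mathbf{X}$ has mean $\mathbf{a}^\top\boldsymbol{\mu}$ and variance $\mathbf{a}^\top\boldsymbol{\Sigma}\mathbf{a}$; all parameter values here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters: any linear combination a^T X is univariate Gaussian
mu = np.array([1.0, 2.0])
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])
a = np.array([1.0, -1.0])

X = rng.multivariate_normal(mu, Sigma, size=200_000)
Y = X @ a

mean_theory = a @ mu        # a^T mu = -1.0
var_theory = a @ Sigma @ a  # a^T Sigma a = 2 - 1.2 + 1 = 1.8
```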
ex-ch08-04
Easy: For a scalar $Z \sim \mathcal{CN}(0, \sigma^2)$, compute $\mathbb{E}[Z]$, $\mathbb{E}[|Z|^2]$, and the distribution of $|Z|$.
For $Z = Z_r + jZ_i$, the real and imaginary parts are independent zero-mean Gaussians, and each has variance $\sigma^2/2$.
Compute
$\mathbb{E}[Z] = 0$ and $\mathbb{E}[|Z|^2] = \sigma^2$. $|Z|$ is Rayleigh with scale parameter $\sigma/\sqrt{2}$: $\mathbb{E}[|Z|] = \sigma\sqrt{\pi}/2$, $\operatorname{Var}(|Z|) = \sigma^2(1 - \pi/4)$.
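A seeded simulation of the Rayleigh envelope; the value of $\sigma^2$ below is a hypothetical choice.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma2 = 4.0  # hypothetical value of E|Z|^2

# Z = X + jY with X, Y i.i.d. N(0, sigma2/2): circularly symmetric complex Gaussian
n = 500_000
X = rng.normal(0.0, np.sqrt(sigma2 / 2), n)
Y = rng.normal(0.0, np.sqrt(sigma2 / 2), n)
R = np.abs(X + 1j * Y)

mean_R_theory = np.sqrt(sigma2) * np.sqrt(np.pi) / 2  # E|Z| = sigma*sqrt(pi)/2
power_theory = sigma2                                 # E|Z|^2 = sigma^2
```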
ex-ch08-05
Easy: Let $X_1, \dots, X_n$ be i.i.d. $\mathcal{N}(0, 1)$. What is the mean and variance of $Q = \sum_{i=1}^n X_i^2$?
$Q \sim \chi^2_n$.
Apply chi-squared properties
$\mathbb{E}[Q] = n$ and $\operatorname{Var}(Q) = 2n$.
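The chi-squared moments can be read off directly from SciPy; $n = 7$ below is a hypothetical choice.

```python
from scipy.stats import chi2

n = 7  # hypothetical number of squared standard normals
Q = chi2(df=n)
mean, var = Q.mean(), Q.var()  # should be n and 2n
```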
ex-ch08-06
Medium: Let $\mathbf{X} \sim \mathcal{N}(\boldsymbol{\mu}, \boldsymbol{\Sigma})$ with a given partitioned mean and covariance.
Find the conditional distribution of $\mathbf{X}_1$ given $\mathbf{X}_2 = \mathbf{x}_2$.
Partition $\boldsymbol{\mu}$ into $\boldsymbol{\mu}_1$ and $\boldsymbol{\mu}_2$, and $\boldsymbol{\Sigma}$ into blocks.
Compute $\boldsymbol{\Sigma}_{12}\boldsymbol{\Sigma}_{22}^{-1}$ (a cross-covariance block times a block inverse).
Identify blocks
$\boldsymbol{\Sigma}_{11}$, $\boldsymbol{\Sigma}_{12} = \boldsymbol{\Sigma}_{21}^\top$, $\boldsymbol{\Sigma}_{22}$.
Compute $\boldsymbol{\Sigma}_{22}^{-1}$
Invert the $\boldsymbol{\Sigma}_{22}$ block, so $\boldsymbol{\Sigma}_{12}\boldsymbol{\Sigma}_{22}^{-1}$ is well defined.
Conditional mean
$\boldsymbol{\mu}_{1|2} = \boldsymbol{\mu}_1 + \boldsymbol{\Sigma}_{12}\boldsymbol{\Sigma}_{22}^{-1}(\mathbf{x}_2 - \boldsymbol{\mu}_2)$.
Conditional variance
$\boldsymbol{\Sigma}_{1|2} = \boldsymbol{\Sigma}_{11} - \boldsymbol{\Sigma}_{12}\boldsymbol{\Sigma}_{22}^{-1}\boldsymbol{\Sigma}_{21}$.
So $\mathbf{X}_1 \mid \mathbf{X}_2 = \mathbf{x}_2 \sim \mathcal{N}(\boldsymbol{\mu}_{1|2}, \boldsymbol{\Sigma}_{1|2})$.
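A minimal 2D instance of the conditioning formulas, with hypothetical parameter values standing in for the exercise's data.

```python
import numpy as np

# Hypothetical parameters (2D case: condition X1 on X2 = x2)
mu = np.array([1.0, 2.0])
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])
x2 = 3.0

mu1, mu2 = mu[0], mu[1]
S11, S12, S22 = Sigma[0, 0], Sigma[0, 1], Sigma[1, 1]

# mu_{1|2} = mu1 + S12 * S22^{-1} * (x2 - mu2) = 1 + 0.8*1 = 1.8
mu_cond = mu1 + S12 / S22 * (x2 - mu2)
# S_{1|2} = S11 - S12 * S22^{-1} * S21 = 2 - 0.64 = 1.36
var_cond = S11 - S12 * S12 / S22
```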
ex-ch08-07
Medium: Prove that if $\mathbf{X} \sim \mathcal{N}(\boldsymbol{\mu}, \boldsymbol{\Sigma})$ and $\mathbf{Q}$ is orthogonal ($\mathbf{Q}^\top\mathbf{Q} = \mathbf{I}$), then $\mathbf{Y} = \mathbf{Q}\mathbf{X}$ is also Gaussian with the same eigenvalues in its covariance.
Use the affine transformation theorem.
Compare the eigenvalues of $\mathbf{Q}\boldsymbol{\Sigma}\mathbf{Q}^\top$ with those of $\boldsymbol{\Sigma}$.
Apply the theorem
$\mathbf{Y} \sim \mathcal{N}(\mathbf{Q}\boldsymbol{\mu},\; \mathbf{Q}\boldsymbol{\Sigma}\mathbf{Q}^\top)$.
Eigenvalues are preserved
$\mathbf{Q}\boldsymbol{\Sigma}\mathbf{Q}^\top$ and $\boldsymbol{\Sigma}$ are similar (since $\mathbf{Q}^\top = \mathbf{Q}^{-1}$), so they share the same eigenvalues. An orthogonal transformation rotates the Gaussian cloud without changing the principal axis lengths.
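The similarity argument can be verified numerically with a random orthogonal matrix; the covariance below is a hypothetical example.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical covariance; Q is a random orthogonal matrix (QR of a Gaussian matrix)
Sigma = np.array([[3.0, 1.0, 0.0],
                  [1.0, 2.0, 0.5],
                  [0.0, 0.5, 1.0]])
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))

Sigma_Y = Q @ Sigma @ Q.T  # covariance of Y = QX

eig_X = np.sort(np.linalg.eigvalsh(Sigma))
eig_Y = np.sort(np.linalg.eigvalsh(Sigma_Y))
```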
ex-ch08-08
Medium: Let $\mathbf{X} \sim \mathcal{N}(\boldsymbol{\mu}, \boldsymbol{\Sigma})$ with a given positive definite $\boldsymbol{\Sigma}$. Find the whitening matrix using the Cholesky factorization and verify the result.
Find $\mathbf{L}$ such that $\boldsymbol{\Sigma} = \mathbf{L}\mathbf{L}^\top$.
The whitening matrix is $\mathbf{W} = \mathbf{L}^{-1}$.
Cholesky factor
Compute the lower-triangular $\mathbf{L}$ with $\mathbf{L}\mathbf{L}^\top = \boldsymbol{\Sigma}$.
Whitening matrix
$\mathbf{W} = \mathbf{L}^{-1}$, applied as $\mathbf{Z} = \mathbf{W}(\mathbf{X} - \boldsymbol{\mu})$.
Verify
$\operatorname{Cov}(\mathbf{Z}) = \mathbf{W}\boldsymbol{\Sigma}\mathbf{W}^\top = \mathbf{L}^{-1}\mathbf{L}\mathbf{L}^\top\mathbf{L}^{-\top} = \mathbf{I}$.
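The whitening construction, end to end, on a hypothetical SPD covariance:

```python
import numpy as np

# Hypothetical SPD covariance
Sigma = np.array([[4.0, 2.0],
                  [2.0, 3.0]])

L = np.linalg.cholesky(Sigma)  # Sigma = L L^T, L lower-triangular
W = np.linalg.inv(L)           # whitening matrix W = L^{-1}

# Cov(WX) = W Sigma W^T = L^{-1} L L^T L^{-T} = I
I_check = W @ Sigma @ W.T
```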
ex-ch08-09
Medium: Show that the multivariate Gaussian characteristic function satisfies $|\varphi_{\mathbf{X}}(\mathbf{t})| \le 1$ with equality iff $\mathbf{t}^\top\boldsymbol{\Sigma}\,\mathbf{t} = 0$.
Compute $|\varphi_{\mathbf{X}}(\mathbf{t})| = \left|\exp\!\left(j\mathbf{t}^\top\boldsymbol{\mu} - \tfrac12\mathbf{t}^\top\boldsymbol{\Sigma}\,\mathbf{t}\right)\right|$.
Use positive semidefiniteness of $\boldsymbol{\Sigma}$.
Compute the modulus
$|\varphi_{\mathbf{X}}(\mathbf{t})| = \exp\!\left(-\tfrac12\mathbf{t}^\top\boldsymbol{\Sigma}\,\mathbf{t}\right) \le 1$ since $\mathbf{t}^\top\boldsymbol{\Sigma}\,\mathbf{t} \ge 0$.
Equality condition
$|\varphi_{\mathbf{X}}(\mathbf{t})| = 1$ iff $\mathbf{t}^\top\boldsymbol{\Sigma}\,\mathbf{t} = 0$. If $\boldsymbol{\Sigma} \succ 0$, this holds only for $\mathbf{t} = \mathbf{0}$.
ex-ch08-10
Medium: Let $\mathbf{X} \sim \mathcal{N}(\mathbf{0}, \mathbf{I}_d)$. Compute $\mathbb{E}[\|\mathbf{X}\|^4]$.
$\|\mathbf{X}\|^2 \sim \chi^2_d$.
For $Q \sim \chi^2_d$, $\mathbb{E}[Q] = d$ and $\operatorname{Var}(Q) = 2d$.
Use chi-squared moments
$\mathbb{E}[\|\mathbf{X}\|^4] = \mathbb{E}[Q^2]$ where $Q = \|\mathbf{X}\|^2 \sim \chi^2_d$. $\mathbb{E}[Q^2] = \operatorname{Var}(Q) + (\mathbb{E}[Q])^2 = 2d + d^2 = d(d+2)$.
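A seeded Monte Carlo check of $\mathbb{E}[\|\mathbf{X}\|^4] = d(d+2)$; $d = 5$ is a hypothetical dimension.

```python
import numpy as np

rng = np.random.default_rng(3)
d = 5  # hypothetical dimension

# Monte Carlo estimate of E||X||^4 for X ~ N(0, I_d)
X = rng.standard_normal((400_000, d))
m4 = (np.sum(X**2, axis=1) ** 2).mean()

theory = d * (d + 2)  # 35
```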
ex-ch08-11
Hard: Prove that the entropy of $\mathbf{X} \sim \mathcal{N}(\boldsymbol{\mu}, \boldsymbol{\Sigma})$ is $h(\mathbf{X}) = \tfrac12\ln\!\left((2\pi e)^d \det\boldsymbol{\Sigma}\right)$ (nats).
Compute $h(\mathbf{X}) = -\mathbb{E}[\ln p(\mathbf{X})]$.
Use $\mathbb{E}\!\left[(\mathbf{X}-\boldsymbol{\mu})^\top\boldsymbol{\Sigma}^{-1}(\mathbf{X}-\boldsymbol{\mu})\right] = d$.
Write the log-PDF
$\ln p(\mathbf{x}) = -\tfrac{d}{2}\ln(2\pi) - \tfrac12\ln\det\boldsymbol{\Sigma} - \tfrac12(\mathbf{x}-\boldsymbol{\mu})^\top\boldsymbol{\Sigma}^{-1}(\mathbf{x}-\boldsymbol{\mu})$.
Take the expectation
The quadratic form has expectation $\operatorname{tr}(\boldsymbol{\Sigma}^{-1}\boldsymbol{\Sigma}) = d$. Therefore
$h(\mathbf{X}) = \tfrac{d}{2}\ln(2\pi) + \tfrac12\ln\det\boldsymbol{\Sigma} + \tfrac{d}{2} = \tfrac12\ln\!\left((2\pi e)^d \det\boldsymbol{\Sigma}\right)$.
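The closed form can be cross-checked against SciPy's built-in Gaussian entropy; the covariance below is a hypothetical example.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Hypothetical mean/covariance; entropy should be (1/2) ln((2*pi*e)^d det Sigma)
mu = np.zeros(3)
Sigma = np.array([[2.0, 0.3, 0.0],
                  [0.3, 1.0, 0.2],
                  [0.0, 0.2, 1.5]])
d = 3

h_formula = 0.5 * np.log((2 * np.pi * np.e) ** d * np.linalg.det(Sigma))
h_scipy = multivariate_normal(mean=mu, cov=Sigma).entropy()
```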
ex-ch08-12
Hard: Let $\mathbf{X}_1, \dots, \mathbf{X}_n$ be i.i.d. $\mathcal{N}(\boldsymbol{\mu}, \boldsymbol{\Sigma})$ in $\mathbb{R}^d$. Show that the sample mean $\bar{\mathbf{X}} = \tfrac1n\sum_i \mathbf{X}_i$ and the scatter matrix $\mathbf{S} = \sum_i (\mathbf{X}_i - \bar{\mathbf{X}})(\mathbf{X}_i - \bar{\mathbf{X}})^\top$ are independent.
Show that $\bar{\mathbf{X}}$ and each $\mathbf{X}_i - \bar{\mathbf{X}}$ are uncorrelated.
Use the Gaussian property that jointly Gaussian and uncorrelated implies independent.
Compute the cross-covariance
$\operatorname{Cov}(\bar{\mathbf{X}}, \mathbf{X}_i - \bar{\mathbf{X}}) = \operatorname{Cov}(\bar{\mathbf{X}}, \mathbf{X}_i) - \operatorname{Cov}(\bar{\mathbf{X}}, \bar{\mathbf{X}}) = \tfrac1n\boldsymbol{\Sigma} - \tfrac1n\boldsymbol{\Sigma} = \mathbf{0}$.
Apply Gaussianity
Since $(\bar{\mathbf{X}}, \mathbf{X}_1 - \bar{\mathbf{X}}, \dots, \mathbf{X}_n - \bar{\mathbf{X}})$ is jointly Gaussian (as an affine function of the i.i.d. data) and $\bar{\mathbf{X}}$ is uncorrelated with each centered observation, they are independent. Since $\mathbf{S}$ is a function of the centered observations, it is independent of $\bar{\mathbf{X}}$.
ex-ch08-13
Hard: Prove the block matrix inversion formula: if $\boldsymbol{\Sigma} = \begin{pmatrix}\boldsymbol{\Sigma}_{11} & \boldsymbol{\Sigma}_{12}\\ \boldsymbol{\Sigma}_{21} & \boldsymbol{\Sigma}_{22}\end{pmatrix}$ with $\boldsymbol{\Sigma}$ and $\boldsymbol{\Sigma}_{22}$ invertible, then the $(1,1)$ block of $\boldsymbol{\Sigma}^{-1}$ is $(\boldsymbol{\Sigma}_{11} - \boldsymbol{\Sigma}_{12}\boldsymbol{\Sigma}_{22}^{-1}\boldsymbol{\Sigma}_{21})^{-1}$.
Multiply $\boldsymbol{\Sigma}$ by the proposed inverse and verify the identity.
Eliminate $\boldsymbol{\Sigma}_{21}$ from the second block row using $\boldsymbol{\Sigma}_{22}^{-1}$.
Block LDU factorization
Factor $\boldsymbol{\Sigma} = \begin{pmatrix}\mathbf{I} & \boldsymbol{\Sigma}_{12}\boldsymbol{\Sigma}_{22}^{-1}\\ \mathbf{0} & \mathbf{I}\end{pmatrix}\begin{pmatrix}\mathbf{S} & \mathbf{0}\\ \mathbf{0} & \boldsymbol{\Sigma}_{22}\end{pmatrix}\begin{pmatrix}\mathbf{I} & \mathbf{0}\\ \boldsymbol{\Sigma}_{22}^{-1}\boldsymbol{\Sigma}_{21} & \mathbf{I}\end{pmatrix}$ where $\mathbf{S} = \boldsymbol{\Sigma}_{11} - \boldsymbol{\Sigma}_{12}\boldsymbol{\Sigma}_{22}^{-1}\boldsymbol{\Sigma}_{21}$ is the Schur complement.
Invert each factor
The inverse of the middle factor is $\begin{pmatrix}\mathbf{S}^{-1} & \mathbf{0}\\ \mathbf{0} & \boldsymbol{\Sigma}_{22}^{-1}\end{pmatrix}$, and the outer triangular factors have identity diagonal blocks, so the $(1,1)$ block of $\boldsymbol{\Sigma}^{-1}$ picks up $\mathbf{S}^{-1} = (\boldsymbol{\Sigma}_{11} - \boldsymbol{\Sigma}_{12}\boldsymbol{\Sigma}_{22}^{-1}\boldsymbol{\Sigma}_{21})^{-1}$.
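The Schur-complement identity checked numerically on a hypothetical SPD matrix with $2\times 2$ blocks:

```python
import numpy as np

# Hypothetical SPD matrix partitioned into blocks of sizes 2 and 2
Sigma = np.array([[4.0, 1.0, 0.5, 0.0],
                  [1.0, 3.0, 0.0, 0.2],
                  [0.5, 0.0, 2.0, 0.3],
                  [0.0, 0.2, 0.3, 1.0]])
S11, S12 = Sigma[:2, :2], Sigma[:2, 2:]
S21, S22 = Sigma[2:, :2], Sigma[2:, 2:]

# (Sigma^{-1})_{11} = (S11 - S12 S22^{-1} S21)^{-1}, the inverse Schur complement
schur_inv = np.linalg.inv(S11 - S12 @ np.linalg.inv(S22) @ S21)
full_inv_11 = np.linalg.inv(Sigma)[:2, :2]
```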
ex-ch08-14
Hard: Show that the Gaussian maximizes differential entropy among all distributions with the same mean and covariance. That is, if $\mathbf{X} \sim p$ has mean $\boldsymbol{\mu}$ and covariance $\boldsymbol{\Sigma}$, and $\mathbf{X}_G \sim q = \mathcal{N}(\boldsymbol{\mu}, \boldsymbol{\Sigma})$, then $h(\mathbf{X}) \le h(\mathbf{X}_G)$.
Consider $D_{\mathrm{KL}}(p \,\|\, q) \ge 0$.
Expand the KL divergence and use the matching moments.
Write the KL divergence
$D_{\mathrm{KL}}(p \,\|\, q) = \int p(\mathbf{x})\ln\frac{p(\mathbf{x})}{q(\mathbf{x})}\,d\mathbf{x} = -h(\mathbf{X}) - \mathbb{E}_p[\ln q(\mathbf{X})]$.
Evaluate the cross-entropy
$-\mathbb{E}_p[\ln q(\mathbf{X})] = \tfrac{d}{2}\ln(2\pi) + \tfrac12\ln\det\boldsymbol{\Sigma} + \tfrac12\mathbb{E}_p\!\left[(\mathbf{X}-\boldsymbol{\mu})^\top\boldsymbol{\Sigma}^{-1}(\mathbf{X}-\boldsymbol{\mu})\right]$.
The last term equals $\tfrac{d}{2}$ since $p$ and $q$ share the same mean and covariance. So $-\mathbb{E}_p[\ln q(\mathbf{X})] = h(\mathbf{X}_G)$.
Conclude
$0 \le D_{\mathrm{KL}}(p \,\|\, q) = h(\mathbf{X}_G) - h(\mathbf{X})$, hence $h(\mathbf{X}) \le h(\mathbf{X}_G)$.
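A scalar sanity check of the maximum-entropy property: among distributions with a common variance $\sigma^2$ (a hypothetical value below), the Gaussian entropy $\tfrac12\ln(2\pi e\sigma^2)$ exceeds, for example, the entropy $\ln(b-a)$ of the variance-matched uniform.

```python
import numpy as np

sigma2 = 2.0  # hypothetical common variance
h_gauss = 0.5 * np.log(2 * np.pi * np.e * sigma2)

# Uniform on an interval of width sqrt(12*sigma2) has variance sigma2
width = np.sqrt(12 * sigma2)
h_unif = np.log(width)

# Gap = (1/2) ln(2*pi*e/12) > 0, independent of sigma2
```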
ex-ch08-15
Hard: Let $\mathbf{Z} \sim \mathcal{CN}(\mathbf{0}, \boldsymbol{\Gamma})$ be a proper complex Gaussian vector. Express the distribution of the real-valued representation $\hat{\mathbf{Z}} = (\operatorname{Re}\mathbf{Z}^\top, \operatorname{Im}\mathbf{Z}^\top)^\top$ in terms of $\boldsymbol{\Gamma}$.
Use the properness condition $\mathbb{E}[\mathbf{Z}\mathbf{Z}^\top] = \mathbf{0}$.
The real covariance is $\tfrac12\operatorname{Re}\boldsymbol{\Gamma}$ for the diagonal blocks.
Real and imaginary covariances
Let $\boldsymbol{\Gamma} = \boldsymbol{\Gamma}_r + j\boldsymbol{\Gamma}_i$ with $\boldsymbol{\Gamma}_r = \boldsymbol{\Gamma}_r^\top$ and $\boldsymbol{\Gamma}_i = -\boldsymbol{\Gamma}_i^\top$ (Hermitian symmetry). Properness gives:
$\operatorname{Cov}(\operatorname{Re}\mathbf{Z}) = \operatorname{Cov}(\operatorname{Im}\mathbf{Z}) = \tfrac12\boldsymbol{\Gamma}_r$, $\operatorname{Cov}(\operatorname{Im}\mathbf{Z}, \operatorname{Re}\mathbf{Z}) = \tfrac12\boldsymbol{\Gamma}_i$.
Write the joint covariance
$\boldsymbol{\Sigma}_{\hat{\mathbf{Z}}} = \tfrac12\begin{pmatrix}\boldsymbol{\Gamma}_r & -\boldsymbol{\Gamma}_i\\ \boldsymbol{\Gamma}_i & \boldsymbol{\Gamma}_r\end{pmatrix}$, so $\hat{\mathbf{Z}} \sim \mathcal{N}(\mathbf{0}, \boldsymbol{\Sigma}_{\hat{\mathbf{Z}}})$.
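A seeded simulation check of the block structure: draw proper complex Gaussians with a hypothetical Hermitian $\boldsymbol{\Gamma}$ and compare the sample covariance of the stacked real representation with $\tfrac12\begin{pmatrix}\boldsymbol{\Gamma}_r & -\boldsymbol{\Gamma}_i\\ \boldsymbol{\Gamma}_i & \boldsymbol{\Gamma}_r\end{pmatrix}$.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical Hermitian positive definite Gamma
Gamma = np.array([[2.0, 0.5 + 0.5j],
                  [0.5 - 0.5j, 1.0]])

# Draw proper complex Gaussians: Z = A W with A A^H = Gamma, W_i i.i.d. CN(0,1)
A = np.linalg.cholesky(Gamma)
n = 400_000
W = (rng.standard_normal((2, n)) + 1j * rng.standard_normal((2, n))) / np.sqrt(2)
Z = A @ W

Zhat = np.vstack([Z.real, Z.imag])   # real composite representation
Sigma_hat = (Zhat @ Zhat.T) / n      # sample covariance (zero mean)

Gr, Gi = Gamma.real, Gamma.imag
Sigma_theory = 0.5 * np.block([[Gr, -Gi], [Gi, Gr]])
```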
ex-ch08-16
Challenge: (Stein's Lemma) Let $(X, Y)$ be jointly Gaussian with $X \sim \mathcal{N}(\mu_X, \sigma_X^2)$. Show that for any differentiable function $g$ with $\mathbb{E}|g'(X)| < \infty$,
$\operatorname{Cov}(g(X), Y) = \operatorname{Cov}(X, Y)\,\mathbb{E}[g'(X)]$.
Write $Y = a + bX + \varepsilon$ where $\varepsilon$ is independent of $X$.
Use integration by parts on $\mathbb{E}[g(X)(X - \mu_X)]$.
Decompose Y
By the conditional Gaussian formula, $Y = a + bX + \varepsilon$ where $\varepsilon$ is independent of $X$ and $b = \operatorname{Cov}(X, Y)/\sigma_X^2$.
Compute the covariance
$\operatorname{Cov}(g(X), Y) = b\,\operatorname{Cov}(g(X), X) = b\,\mathbb{E}[g(X)(X - \mu_X)]$ (since $\varepsilon$ is independent of $X$).
Integration by parts
Integrating by parts against the Gaussian density: $\mathbb{E}[g(X)(X - \mu_X)] = \sigma_X^2\,\mathbb{E}[g'(X)]$. Therefore $\operatorname{Cov}(g(X), Y) = b\,\sigma_X^2\,\mathbb{E}[g'(X)] = \operatorname{Cov}(X, Y)\,\mathbb{E}[g'(X)]$.
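A seeded Monte Carlo check of the lemma with the hypothetical choice $g(x) = x^3$ (so $g'(x) = 3x^2$) and standard bivariate marginals with correlation $\rho$:

```python
import numpy as np

rng = np.random.default_rng(5)

# Check Cov(g(X), Y) = Cov(X, Y) * E[g'(X)] with g(x) = x^3
rho = 0.6
cov = np.array([[1.0, rho], [rho, 1.0]])
XY = rng.multivariate_normal([0.0, 0.0], cov, size=1_000_000)
X, Y = XY[:, 0], XY[:, 1]

lhs = np.mean(X**3 * Y) - np.mean(X**3) * np.mean(Y)  # Cov(g(X), Y)
rhs = rho * np.mean(3 * X**2)                         # Cov(X, Y) * E[g'(X)]
```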
ex-ch08-17
Challenge: Let $\mathbf{H} \in \mathbb{C}^{n \times m}$ ($n \ge m$) have i.i.d. entries $\mathcal{CN}(0, 1)$. Show that the eigenvalues of $\mathbf{W} = \mathbf{H}^{\mathsf{H}}\mathbf{H}$ are almost surely distinct.
$\mathbf{W}$ follows a complex Wishart distribution $\mathcal{CW}_m(n, \mathbf{I})$.
The joint eigenvalue density of the complex Wishart includes a Vandermonde determinant factor.
Joint eigenvalue density
The joint density of the ordered eigenvalues $\lambda_1 \ge \dots \ge \lambda_m \ge 0$ of $\mathbf{W}$ is proportional to $\prod_{i<j}(\lambda_i - \lambda_j)^2 \prod_i \lambda_i^{n-m} e^{-\lambda_i}$.
Distinctness
The Vandermonde factor $\prod_{i<j}(\lambda_i - \lambda_j)^2$ vanishes whenever two eigenvalues coincide. Since this factor multiplies the density, the set $\{\lambda_i = \lambda_j \text{ for some } i \ne j\}$ has zero probability measure. Hence the eigenvalues are almost surely distinct.
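A seeded one-draw illustration (hypothetical dimensions $n = 6$, $m = 4$): the eigenvalue gaps of a sampled complex Wishart matrix are strictly positive.

```python
import numpy as np

rng = np.random.default_rng(6)

# Draw H with i.i.d. CN(0,1) entries and form W = H^H H
n, m = 6, 4  # hypothetical dimensions, n >= m
H = (rng.standard_normal((n, m)) + 1j * rng.standard_normal((n, m))) / np.sqrt(2)
W = H.conj().T @ H

eigs = np.sort(np.linalg.eigvalsh(W))  # real eigenvalues of the Hermitian W
min_gap = np.min(np.diff(eigs))        # strictly positive almost surely
```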
ex-ch08-18
Medium: Let $\mathbf{X} \sim \mathcal{N}(\mathbf{0}, \boldsymbol{\Sigma})$ with $\mathbf{X} \in \mathbb{R}^4$. Compute $\mathbb{E}[X_1 X_2 X_3 X_4]$.
Use the Isserlis (Wick) theorem for zero-mean Gaussian moments.
Isserlis: $\mathbb{E}[X_1 X_2 X_3 X_4] = \Sigma_{12}\Sigma_{34} + \Sigma_{13}\Sigma_{24} + \Sigma_{14}\Sigma_{23}$.
Apply Isserlis theorem
With zero mean, sum over the three pairings of $\{1, 2, 3, 4\}$:
$\mathbb{E}[X_1 X_2 X_3 X_4] = \Sigma_{12}\Sigma_{34} + \Sigma_{13}\Sigma_{24} + \Sigma_{14}\Sigma_{23}$.
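A seeded Monte Carlo check of the three-pairing sum; the $4\times 4$ covariance below is a hypothetical example.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical SPD covariance (strictly diagonally dominant)
Sigma = np.array([[1.0, 0.4, 0.2, 0.1],
                  [0.4, 1.0, 0.3, 0.1],
                  [0.2, 0.3, 1.0, 0.2],
                  [0.1, 0.1, 0.2, 1.0]])
X = rng.multivariate_normal(np.zeros(4), Sigma, size=1_000_000)

mc = np.mean(X[:, 0] * X[:, 1] * X[:, 2] * X[:, 3])
theory = (Sigma[0, 1] * Sigma[2, 3]     # pairing (12)(34)
          + Sigma[0, 2] * Sigma[1, 3]   # pairing (13)(24)
          + Sigma[0, 3] * Sigma[1, 2])  # pairing (14)(23) -> 0.08+0.02+0.03 = 0.13
```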
ex-ch08-19
Medium: Let $\mathbf{X} \sim \mathcal{N}(\boldsymbol{\mu}, \boldsymbol{\Sigma})$ with $\boldsymbol{\Sigma} \succ 0$. Show that $\Pr(\mathbf{X} \in \mathcal{E}_r) = \Pr(\chi^2_d \le r^2)$, where $\mathcal{E}_r = \{\mathbf{x} : (\mathbf{x}-\boldsymbol{\mu})^\top\boldsymbol{\Sigma}^{-1}(\mathbf{x}-\boldsymbol{\mu}) \le r^2\}$ is a density level set (ellipsoid).
Whiten: $\mathbf{Z} = \boldsymbol{\Sigma}^{-1/2}(\mathbf{X} - \boldsymbol{\mu}) \sim \mathcal{N}(\mathbf{0}, \mathbf{I}_d)$.
The ellipsoid in $\mathbf{x}$-space becomes a ball in $\mathbf{z}$-space.
Transform
$(\mathbf{X}-\boldsymbol{\mu})^\top\boldsymbol{\Sigma}^{-1}(\mathbf{X}-\boldsymbol{\mu}) = \|\mathbf{Z}\|^2 \sim \chi^2_d$. So $\Pr(\mathbf{X} \in \mathcal{E}_r) = \Pr(\|\mathbf{Z}\|^2 \le r^2) = \Pr(\chi^2_d \le r^2)$.
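A seeded Monte Carlo check that the ellipsoid probability matches the $\chi^2_d$ CDF at $r^2$; all parameter values are hypothetical.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(8)

# Hypothetical parameters; P(X in E_r) should equal the chi^2_d CDF at r^2
d, r = 3, 1.5
mu = np.array([1.0, 0.0, -1.0])
Sigma = np.array([[2.0, 0.5, 0.0],
                  [0.5, 1.0, 0.3],
                  [0.0, 0.3, 1.5]])

X = rng.multivariate_normal(mu, Sigma, size=400_000)
diff = X - mu
# Squared Mahalanobis distance for each sample
mahal2 = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(Sigma), diff)

p_mc = np.mean(mahal2 <= r**2)
p_theory = chi2.cdf(r**2, df=d)
```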
ex-ch08-20
Challenge: (Information geometry) Let $\{\mathcal{N}(\boldsymbol{\mu}, \operatorname{diag}(\sigma_1^2, \dots, \sigma_d^2))\}$ be the family of Gaussian distributions parameterized by $\boldsymbol{\theta} = (\mu_1, \dots, \mu_d, \sigma_1^2, \dots, \sigma_d^2)$. Compute the Fisher information matrix $\mathbf{I}(\boldsymbol{\theta})$.
The Fisher information for a scalar $\mathcal{N}(\mu, \sigma^2)$ with parameters $(\mu, \sigma^2)$ is $\operatorname{diag}\!\left(1/\sigma^2,\; 1/(2\sigma^4)\right)$.
Independence of components makes the FIM block-diagonal.
Score functions
$\partial_{\mu_i}\ln p = \dfrac{x_i - \mu_i}{\sigma_i^2}$, $\partial_{\sigma_i^2}\ln p = -\dfrac{1}{2\sigma_i^2} + \dfrac{(x_i - \mu_i)^2}{2\sigma_i^4}$.
Compute expectations
$\mathbb{E}\!\left[(\partial_{\mu_i}\ln p)^2\right] = 1/\sigma_i^2$ and $\mathbb{E}\!\left[(\partial_{\sigma_i^2}\ln p)^2\right] = 1/(2\sigma_i^4)$; the $\mu_i$–$\sigma_i^2$ cross-term vanishes because odd central moments of a Gaussian are zero.
The off-diagonal blocks are zero by independence. Each component contributes a block $\operatorname{diag}\!\left(1/\sigma_i^2,\; 1/(2\sigma_i^4)\right)$.
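A seeded Monte Carlo check of the scalar block: the covariance of the score vector should equal $\operatorname{diag}(1/\sigma^2, 1/(2\sigma^4))$; the parameter point is a hypothetical choice.

```python
import numpy as np

rng = np.random.default_rng(9)

# Monte Carlo estimate of the scalar Fisher information at (mu, sigma^2)
mu, sigma2 = 0.5, 2.0  # hypothetical parameter point
x = rng.normal(mu, np.sqrt(sigma2), 2_000_000)

s_mu = (x - mu) / sigma2                               # score w.r.t. mu
s_s2 = -0.5 / sigma2 + (x - mu) ** 2 / (2 * sigma2**2)  # score w.r.t. sigma^2

I_mc = np.cov(np.vstack([s_mu, s_s2]))                 # covariance of the score
I_theory = np.diag([1 / sigma2, 1 / (2 * sigma2**2)])  # diag(1/sigma^2, 1/(2 sigma^4))
```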