Exercises
ex-ch05-01
Easy: Let $X_1,\dots,X_n$ be i.i.d. with mean $\mu$ and variance $\sigma^2$. Show that $\bar{X} = \frac{1}{n}\sum_{i=1}^n X_i$ is unbiased for $\mu$ and compute its MSE.
Use linearity of expectation for the bias.
Use independence of the summands for the variance.
Bias
$\mathbb{E}[\bar{X}] = \frac{1}{n}\sum_{i=1}^n \mathbb{E}[X_i] = \mu$, so $\bar{X}$ is unbiased.
Variance and MSE
$\operatorname{Var}(\bar{X}) = \frac{1}{n^2}\sum_{i=1}^n \operatorname{Var}(X_i) = \frac{\sigma^2}{n}$ and $\mathrm{MSE}(\bar{X}) = \operatorname{Var}(\bar{X}) + \mathrm{bias}^2 = \frac{\sigma^2}{n}$.
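A quick Monte Carlo sketch of this result. The population, sample size, and trial count below are illustrative assumptions, not values from the exercise:

```python
import numpy as np

# Monte Carlo sketch (assumed population: Exponential with mean mu = 2,
# hence variance sigma2 = 4; n = 50 samples, 200k trials).
rng = np.random.default_rng(0)
mu, sigma2, n, trials = 2.0, 4.0, 50, 200_000

# Each row is one experiment; the row mean is one draw of the sample mean.
xbar = rng.exponential(scale=mu, size=(trials, n)).mean(axis=1)

bias = xbar.mean() - mu          # ~ 0: the sample mean is unbiased
mse = np.mean((xbar - mu) ** 2)  # ~ sigma2 / n = 0.08
```

Because the estimator is unbiased, the MSE coincides with the variance $\sigma^2/n$, which is what the empirical numbers reproduce.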
ex-ch05-02
Easy: Let $X \sim \mathcal{N}(\mu, \sigma^2)$ with $\sigma^2$ known. Compute $I(\mu)$ from both expressions (squared score and negative expected curvature) and verify they agree.
Write the log-density, take one derivative, then take another.
Both expressions should give the same constant.
Score
$\log p(x;\mu) = -\tfrac{1}{2}\log(2\pi\sigma^2) - \frac{(x-\mu)^2}{2\sigma^2}$. $\frac{\partial}{\partial\mu}\log p(x;\mu) = \frac{x-\mu}{\sigma^2}$.
Variance of the score
$\mathbb{E}\!\left[\left(\frac{X-\mu}{\sigma^2}\right)^{\!2}\right] = \frac{\sigma^2}{\sigma^4} = \frac{1}{\sigma^2}$.
Negative expected curvature
$\frac{\partial^2}{\partial\mu^2}\log p(x;\mu) = -\frac{1}{\sigma^2}$ (constant). So $-\mathbb{E}\!\left[\frac{\partial^2}{\partial\mu^2}\log p(X;\mu)\right] = \frac{1}{\sigma^2}$. The two expressions agree.
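The agreement can be checked numerically. The parameter values below are assumptions for illustration:

```python
import numpy as np

# Numerical check (assumed mu = 1.5, sigma2 = 0.7): the variance of the
# score (x - mu)/sigma2 matches the negative expected curvature 1/sigma2.
rng = np.random.default_rng(1)
mu, sigma2 = 1.5, 0.7
x = rng.normal(mu, np.sqrt(sigma2), size=1_000_000)

fisher_from_score = ((x - mu) / sigma2).var()  # empirical Var of the score
fisher_from_curvature = 1.0 / sigma2           # curvature is the constant -1/sigma2
```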
ex-ch05-03
Easy: For $n$ i.i.d. samples $X_i \sim \mathcal{N}(\mu, \sigma^2)$ with $\sigma^2$ known, state the CRB on the variance of any unbiased estimator of $\mu$.
Fisher information adds over independent samples.
FI scales linearly with $n$
$I_n(\mu) = n\,I_1(\mu) = \frac{n}{\sigma^2}$.
CRB
$\operatorname{Var}(\hat{\mu}) \ge \frac{1}{I_n(\mu)} = \frac{\sigma^2}{n}$. The sample mean attains this bound, so it is efficient.
ex-ch05-04
Medium: For $X \sim \mathrm{Poisson}(\lambda)$, compute $I(\lambda)$ and the CRB on any unbiased estimator of $\lambda$ from a single observation.
The Poisson pmf is $p(x;\lambda) = e^{-\lambda}\lambda^x/x!$.
Recall $\mathbb{E}[X] = \operatorname{Var}(X) = \lambda$.
Log-likelihood and score
$\ell(\lambda) = -\lambda + x\log\lambda - \log x!$, $\frac{\partial\ell}{\partial\lambda} = \frac{x}{\lambda} - 1 = \frac{x-\lambda}{\lambda}$.
Fisher information
$I(\lambda) = \mathbb{E}\!\left[\left(\frac{X-\lambda}{\lambda}\right)^{\!2}\right] = \frac{\operatorname{Var}(X)}{\lambda^2} = \frac{1}{\lambda}$.
CRB
$\operatorname{Var}(\hat{\lambda}) \ge \lambda$. The estimator $\hat{\lambda} = X$ is unbiased and has variance $\lambda$, so it is efficient.
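A short numerical check of $I(\lambda) = 1/\lambda$ via the score variance; the rate below is an assumed example value:

```python
import numpy as np

# Sketch (assumed lam = 3.0): the empirical variance of the Poisson score
# (X - lam)/lam approximates I(lam) = 1/lam.
rng = np.random.default_rng(2)
lam = 3.0
x = rng.poisson(lam, size=1_000_000)

fisher_mc = ((x - lam) / lam).var()  # ~ 1/lam
```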
ex-ch05-05
Medium: Let $X_1,\dots,X_n$ be i.i.d. $\mathcal{N}(\mu, \sigma^2)$ with $\sigma^2$ known. Show that $\bar{X}$ is the MVUE of $\mu$.
Recognize a one-parameter exponential family.
Apply Lehmann--Scheffé.
Exponential family form
$p(x_1,\dots,x_n;\mu) \propto \exp\!\left(\frac{\mu}{\sigma^2}\sum_i x_i - \frac{n\mu^2}{2\sigma^2}\right)$. Natural parameter $\eta = \mu/\sigma^2$, natural sufficient statistic $T = \sum_i X_i$. The natural-parameter image is all of $\mathbb{R}$ (an open set), so $T$ is complete sufficient.
Unbiased function of $T$
$\bar{X} = T/n$ is unbiased and is a function of $T$. Hence, by Lehmann--Scheffé, $\bar{X}$ is the MVUE. Its variance is $\sigma^2/n$, which matches the CRB $1/I_n(\mu) = \sigma^2/n$.
ex-ch05-06
Medium: Observe $x_n = A\cos(2\pi f_0 n + \phi) + w_n$, $n = 0,\dots,N-1$, with $w_n$ i.i.d. $\mathcal{N}(0,\sigma^2)$, known $A$ and $f_0$, unknown $\phi$. Derive the CRB on $\phi$.
Compute $\partial s_n/\partial\phi = -A\sin(2\pi f_0 n + \phi)$.
Use $I(\phi) = \frac{1}{\sigma^2}\sum_n \left(\frac{\partial s_n}{\partial\phi}\right)^{\!2}$.
For large $N$ and $f_0$ not near $0$ or $1/2$, $\sum_n \sin^2(2\pi f_0 n + \phi) \approx N/2$.
Fisher information
$I(\phi) = \frac{A^2}{\sigma^2}\sum_{n=0}^{N-1}\sin^2(2\pi f_0 n + \phi) \approx \frac{NA^2}{2\sigma^2}$ for large $N$ and $f_0$ away from $0$ and $1/2$.
CRB
$\operatorname{Var}(\hat{\phi}) \ge \frac{2\sigma^2}{NA^2}$. This is the standard carrier-phase estimation CRB.
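The quality of the $N/2$ approximation can be checked directly; the signal parameters below are assumptions for illustration:

```python
import numpy as np

# Sketch (assumed A = 1, f0 = 0.11, sigma2 = 0.5, N = 256, phi = 0.3):
# exact Fisher information vs. the large-N approximation N*A^2/(2*sigma2).
A, f0, sigma2, N, phi = 1.0, 0.11, 0.5, 256, 0.3
n = np.arange(N)

fisher_exact = (A**2 / sigma2) * np.sum(np.sin(2 * np.pi * f0 * n + phi) ** 2)
fisher_approx = N * A**2 / (2 * sigma2)  # = 256.0 here
crb = 1.0 / fisher_exact                 # lower bound on Var(phi_hat)
```

For $f_0$ away from $0$ and $1/2$ the oscillatory residual is bounded, so the two expressions differ by well under a percent at this $N$.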
ex-ch05-07
Medium: For the Gaussian amplitude model $x_n = A s_n + w_n$ with $w_n$ i.i.d. $\mathcal{N}(0,\sigma^2)$, show that the matched-filter estimator $\hat{A} = \frac{\sum_n s_n x_n}{\sum_n s_n^2}$ is (i) unbiased, (ii) efficient, and (iii) Gaussian-distributed.
Express $\hat{A}$ as a linear functional of $x$.
Use closure of Gaussians under linear transformations.
Unbiasedness
$\mathbb{E}[\hat{A}] = \frac{\sum_n s_n\,\mathbb{E}[x_n]}{\sum_n s_n^2} = \frac{A\sum_n s_n^2}{\sum_n s_n^2} = A$.
Efficiency
$\operatorname{Var}(\hat{A}) = \frac{\sigma^2\sum_n s_n^2}{\left(\sum_n s_n^2\right)^2} = \frac{\sigma^2}{\sum_n s_n^2} = \frac{1}{I(A)}$, so $\hat{A}$ attains the CRB.
Gaussianity
$\hat{A}$ is a linear combination of the Gaussian $x_n$'s, hence Gaussian: $\hat{A} \sim \mathcal{N}\!\left(A, \frac{\sigma^2}{\sum_n s_n^2}\right)$.
ex-ch05-08
Medium: For $n$ i.i.d. samples $X_i \sim \mathcal{N}(\mu, \sigma^2)$ with both parameters unknown, prove that the Bessel-corrected sample variance $S^2 = \frac{1}{n-1}\sum_i (X_i - \bar{X})^2$ has variance $\frac{2\sigma^4}{n-1}$ and compare with the CRB.
Use that $(n-1)S^2/\sigma^2 \sim \chi^2_{n-1}$.
Recall $\operatorname{Var}(\chi^2_k) = 2k$.
Chi-squared distribution
$(n-1)S^2/\sigma^2 \sim \chi^2_{n-1}$, so $\operatorname{Var}\!\left(\frac{(n-1)S^2}{\sigma^2}\right) = 2(n-1)$.
Scale back
$\operatorname{Var}(S^2) = \frac{\sigma^4}{(n-1)^2}\cdot 2(n-1) = \frac{2\sigma^4}{n-1}$. The CRB is $\frac{2\sigma^4}{n}$ (from the FIM of the worked example "Joint CRB: Mean and Variance of a Gaussian"). The gap factor $\frac{n}{n-1}$ goes to 1 as $n \to \infty$: $S^2$ is asymptotically efficient but not efficient at finite $n$.
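A Monte Carlo sketch of the gap between $\operatorname{Var}(S^2)$ and the CRB; the parameters below are assumed example values:

```python
import numpy as np

# Monte Carlo sketch (assumed mu = 0, sigma2 = 2, n = 10): Var(S^2) should
# be 2*sigma2^2/(n-1), strictly above the CRB 2*sigma2^2/n.
rng = np.random.default_rng(3)
mu, sigma2, n, trials = 0.0, 2.0, 10, 400_000

x = rng.normal(mu, np.sqrt(sigma2), size=(trials, n))
s2 = x.var(axis=1, ddof=1)  # Bessel-corrected sample variance, one per row

var_s2 = s2.var()           # ~ 2*sigma2^2/(n-1)
crb = 2 * sigma2**2 / n     # = 0.8 here
```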
ex-ch05-09
Medium: Prove that if $\hat{\theta}$ is an efficient estimator (attains the scalar CRB), then it is the MVUE.
Any unbiased estimator has variance at least the CRB.
One-liner
By the CRB, every unbiased estimator $\tilde{\theta}$ has variance $\operatorname{Var}(\tilde{\theta}) \ge 1/I(\theta)$. An efficient estimator attains this lower bound. Hence no unbiased estimator can have strictly smaller variance, and $\hat{\theta}$ is the MVUE.
ex-ch05-10
Hard: Let $X_1,\dots,X_n$ be i.i.d. exponential with rate $\lambda$, i.e. $p(x;\lambda) = \lambda e^{-\lambda x}$ for $x \ge 0$. Find (i) a sufficient statistic, (ii) the CRB on $\operatorname{Var}(\hat{\lambda})$, and (iii) the MVUE.
Write the family in canonical exponential form.
The MVUE may differ from $1/\bar{X}$ by a small Bessel-type correction.
Sufficient statistic
$p(x_1,\dots,x_n;\lambda) = \lambda^n e^{-\lambda\sum_i x_i}$, so $T = \sum_i X_i$ is complete sufficient (exponential family with full-dimensional natural-parameter image $(-\infty, 0)$).
CRB
$\ell(\lambda) = n\log\lambda - \lambda T$, so $\frac{\partial^2\ell}{\partial\lambda^2} = -\frac{n}{\lambda^2}$, $I_n(\lambda) = \frac{n}{\lambda^2}$, and the CRB on unbiased estimators of $\lambda$ is $\frac{\lambda^2}{n}$.
MVUE
$T \sim \mathrm{Gamma}(n, \lambda)$ with $\mathbb{E}[1/T] = \frac{\lambda}{n-1}$, so $\hat{\lambda} = \frac{n-1}{T}$ has $\mathbb{E}[\hat{\lambda}] = \lambda$. Therefore $\hat{\lambda}$ is unbiased, a function of $T$, hence by Lehmann--Scheffé the MVUE. Its variance is $\frac{\lambda^2}{n-2}$, strictly exceeding the CRB $\frac{\lambda^2}{n}$ for finite $n$.
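The unbiasedness of $(n-1)/T$ and its variance $\lambda^2/(n-2)$ are easy to confirm by simulation; the rate and sample size below are assumed example values:

```python
import numpy as np

# Monte Carlo sketch (assumed lam = 2.0, n = 8): the MVUE (n-1)/T is
# unbiased with variance lam^2/(n-2), above the CRB lam^2/n.
rng = np.random.default_rng(4)
lam, n, trials = 2.0, 8, 500_000

t = rng.exponential(scale=1 / lam, size=(trials, n)).sum(axis=1)
lam_hat = (n - 1) / t

bias = lam_hat.mean() - lam  # ~ 0
var_hat = lam_hat.var()      # ~ lam^2/(n-2)
crb = lam**2 / n             # = 0.5 here
```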
ex-ch05-11
Hard: Observe $X_1,\dots,X_n$ i.i.d. $\mathcal{N}(\mu, \sigma^2)$, $\sigma^2$ unknown, $\mu$ known. Derive the CRB on $\sigma^2$ and exhibit the MVUE.
Compute the scalar Fisher information via the second derivative.
Think about $\frac{1}{n}\sum_i (X_i - \mu)^2$ as an unbiased estimator.
Fisher information
$\ell(\sigma^2) = -\frac{n}{2}\log(2\pi\sigma^2) - \frac{T}{2\sigma^2}$ with $T = \sum_i (x_i - \mu)^2$. $\frac{\partial\ell}{\partial\sigma^2} = -\frac{n}{2\sigma^2} + \frac{T}{2\sigma^4}$. $\frac{\partial^2\ell}{\partial(\sigma^2)^2} = \frac{n}{2\sigma^4} - \frac{T}{\sigma^6}$. $I_n(\sigma^2) = -\mathbb{E}\!\left[\frac{\partial^2\ell}{\partial(\sigma^2)^2}\right] = \frac{n}{2\sigma^4}$, using $\mathbb{E}[T] = n\sigma^2$.
CRB and MVUE
CRB: $\operatorname{Var}(\widehat{\sigma^2}) \ge \frac{2\sigma^4}{n}$. The estimator $\widehat{\sigma^2} = \frac{1}{n}\sum_i (X_i - \mu)^2$ is unbiased and a function of the complete sufficient statistic $T = \sum_i (X_i - \mu)^2$, hence MVUE. Its variance $\frac{2\sigma^4}{n}$ attains the CRB: efficient.
ex-ch05-12
Hard: Consider $x_n = A s_n + w_n$, $n = 1,\dots,N$, with $w_n$ i.i.d. $\mathcal{N}(0,\sigma^2)$, and $A$, $\sigma^2$ both unknown. Derive the Fisher information matrix and the resulting componentwise CRBs.
The cross-derivative of the log-likelihood involves the noise realization and averages to zero.
You should get a block-diagonal FIM.
Log-likelihood partials
$\ell(A,\sigma^2) = -\frac{N}{2}\log(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_n (x_n - A s_n)^2$. $\frac{\partial\ell}{\partial A} = \frac{1}{\sigma^2}\sum_n (x_n - A s_n)s_n$. $\frac{\partial\ell}{\partial\sigma^2} = -\frac{N}{2\sigma^2} + \frac{1}{2\sigma^4}\sum_n (x_n - A s_n)^2$.
FIM entries
$I_{AA} = \frac{\sum_n s_n^2}{\sigma^2}$, $I_{\sigma^2\sigma^2} = \frac{N}{2\sigma^4}$, $I_{A\sigma^2} = 0$ (the cross-term is linear in the zero-mean noise and averages out). Hence $\mathbf{I}(A,\sigma^2) = \operatorname{diag}\!\left(\frac{\sum_n s_n^2}{\sigma^2},\ \frac{N}{2\sigma^4}\right)$.
Componentwise CRBs
$\operatorname{Var}(\hat{A}) \ge \frac{\sigma^2}{\sum_n s_n^2}$, $\operatorname{Var}(\widehat{\sigma^2}) \ge \frac{2\sigma^4}{N}$. Because the FIM is diagonal, joint estimation does not inflate the individual bounds.
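The block-diagonal structure can be confirmed by estimating the covariance of the score vector by Monte Carlo. The signal and parameter values below are illustrative assumptions:

```python
import numpy as np

# Sketch (assumed s_n = cos(2*pi*0.07*n), N = 64, A = 1.2, sigma2 = 0.8):
# the Monte Carlo covariance of the score vector matches the block-diagonal
# FIM, with a vanishing cross-term.
rng = np.random.default_rng(10)
N, A, sigma2, trials = 64, 1.2, 0.8, 300_000
s = np.cos(2 * np.pi * 0.07 * np.arange(N))

resid = rng.normal(0, np.sqrt(sigma2), size=(trials, N))  # x - A*s is just noise
score_A = resid @ s / sigma2
score_v = -N / (2 * sigma2) + (resid**2).sum(axis=1) / (2 * sigma2**2)

I_AA = score_A.var()                   # ~ (s @ s)/sigma2
I_vv = score_v.var()                   # ~ N/(2*sigma2**2)
I_Av = np.cov(score_A, score_v)[0, 1]  # ~ 0: the FIM is diagonal
```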
ex-ch05-13
Hard: Show that the conditional expectation $\mathbb{E}[\hat{\theta} \mid T]$ in Rao--Blackwell always lies in the closed convex hull of realizations of $\hat{\theta}$.
Use that conditional expectation is a weighted average.
Conditional expectation as average
$\mathbb{E}[\hat{\theta} \mid T = t]$ is a convex combination (with unit-mass probability weights) of the realizations of $\hat{\theta}$ consistent with $T = t$, hence it belongs to the closed convex hull of the range of $\hat{\theta}$.
ex-ch05-14
Hard: Let $X_1,\dots,X_n$ be i.i.d. $\mathcal{N}(\theta, \sigma^2)$, $\sigma^2$ known, $\theta$ unknown. Starting from the naive unbiased estimator $\hat{\theta} = X_1$ (assuming $n \ge 2$), verify by direct Gaussian conditioning that the Rao--Blackwellized estimator $\mathbb{E}[X_1 \mid T]$, $T = \sum_i X_i$, is $\bar{X}$ and compare variances.
Use the Gaussian conditional mean formula.
Compare $\operatorname{Var}(\bar{X}) = \sigma^2/n$ to $\operatorname{Var}(X_1) = \sigma^2$.
Joint Gaussian
Under $\theta$, $(X_1, T)$ is jointly Gaussian with $\mathbb{E}[X_1] = \theta$, $\mathbb{E}[T] = n\theta$, $\operatorname{Var}(T) = n\sigma^2$, cross-covariance $\operatorname{Cov}(X_1, T) = \sigma^2$. Conditional mean formula: $\mathbb{E}[X_1 \mid T] = \theta + \frac{\sigma^2}{n\sigma^2}(T - n\theta) = \frac{T}{n} = \bar{X}$.
Variance comparison
Original variance $\operatorname{Var}(X_1) = \sigma^2$. Rao--Blackwellized variance $\operatorname{Var}(\bar{X}) = \sigma^2/n \le \sigma^2$, with equality only if $n = 1$ (i.e., for a single observation).
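The variance reduction is easy to see by simulation; the parameters below are assumed example values:

```python
import numpy as np

# Monte Carlo sketch (assumed theta = 1.0, sigma2 = 1.0, n = 5): the naive
# estimator X_1 vs. its Rao-Blackwellization, the sample mean.
rng = np.random.default_rng(5)
theta, sigma2, n, trials = 1.0, 1.0, 5, 300_000

x = rng.normal(theta, np.sqrt(sigma2), size=(trials, n))
var_naive = x[:, 0].var()      # ~ sigma2
var_rb = x.mean(axis=1).var()  # ~ sigma2/n
```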
ex-ch05-15
Challenge: Let $\mathbf{x} \sim \mathcal{N}(\mathbf{s}(\boldsymbol{\theta}), \mathbf{C})$ with $\mathbf{C}$ known and $\mathbf{s}$ differentiable. Prove the general Gaussian CRB $\mathbf{I}(\boldsymbol{\theta}) = \left(\frac{\partial\mathbf{s}}{\partial\boldsymbol{\theta}}\right)^{\!\top}\mathbf{C}^{-1}\left(\frac{\partial\mathbf{s}}{\partial\boldsymbol{\theta}}\right)$.
Differentiate the log-density twice in $\boldsymbol{\theta}$.
Use $\mathbb{E}[\mathbf{x} - \mathbf{s}(\boldsymbol{\theta})] = \mathbf{0}$.
Log-density in $\boldsymbol{\theta}$
$\ell(\boldsymbol{\theta}) = -\frac{1}{2}(\mathbf{x} - \mathbf{s}(\boldsymbol{\theta}))^{\top}\mathbf{C}^{-1}(\mathbf{x} - \mathbf{s}(\boldsymbol{\theta})) + \text{const}$. $\frac{\partial\ell}{\partial\boldsymbol{\theta}} = \left(\frac{\partial\mathbf{s}}{\partial\boldsymbol{\theta}}\right)^{\!\top}\mathbf{C}^{-1}(\mathbf{x} - \mathbf{s}(\boldsymbol{\theta}))$.
Second derivative
$\frac{\partial^2\ell}{\partial\boldsymbol{\theta}\,\partial\boldsymbol{\theta}^{\top}} = -\left(\frac{\partial\mathbf{s}}{\partial\boldsymbol{\theta}}\right)^{\!\top}\mathbf{C}^{-1}\left(\frac{\partial\mathbf{s}}{\partial\boldsymbol{\theta}}\right) + \sum_k \left[\mathbf{C}^{-1}(\mathbf{x} - \mathbf{s}(\boldsymbol{\theta}))\right]_k \frac{\partial^2 s_k}{\partial\boldsymbol{\theta}\,\partial\boldsymbol{\theta}^{\top}}$.
Take expectation
Under $\boldsymbol{\theta}$, $\mathbf{x} - \mathbf{s}(\boldsymbol{\theta})$ is zero-mean, so the second term vanishes. Hence $\mathbf{I}(\boldsymbol{\theta}) = -\mathbb{E}\!\left[\frac{\partial^2\ell}{\partial\boldsymbol{\theta}\,\partial\boldsymbol{\theta}^{\top}}\right] = \left(\frac{\partial\mathbf{s}}{\partial\boldsymbol{\theta}}\right)^{\!\top}\mathbf{C}^{-1}\left(\frac{\partial\mathbf{s}}{\partial\boldsymbol{\theta}}\right)$.
ex-ch05-16
Challenge: Consider a bearings-only localization problem: a source at position $\mathbf{p} \in \mathbb{R}^2$ is observed by $N$ sensors at known positions $\mathbf{p}_i$, each returning a noisy bearing $\hat{\alpha}_i = \alpha_i(\mathbf{p}) + e_i$ with $\alpha_i(\mathbf{p}) = \operatorname{atan2}(y - y_i,\, x - x_i)$ and $e_i$ i.i.d. $\mathcal{N}(0,\sigma^2)$. Derive the FIM and the geometric interpretation of $\mathbf{I}^{-1}(\mathbf{p})$.
Compute $\nabla_{\mathbf{p}}\alpha_i$ as a unit vector tangent to a circle around $\mathbf{p}_i$, divided by the range $r_i$.
The FIM is a sum of rank-one outer products weighted by $1/(\sigma^2 r_i^2)$.
Gradient of $\alpha_i$
Let $r_i = \lVert\mathbf{p} - \mathbf{p}_i\rVert$ and $\mathbf{u}_i = (\mathbf{p} - \mathbf{p}_i)/r_i$ be the unit vector from $\mathbf{p}_i$ to $\mathbf{p}$. Then $\nabla_{\mathbf{p}}\alpha_i = \mathbf{u}_i^{\perp}/r_i$, where $\mathbf{u}_i^{\perp}$ is the unit vector perpendicular to $\mathbf{u}_i$.
FIM
Using Exercise 15, $\mathbf{I}(\mathbf{p}) = \frac{1}{\sigma^2}\sum_{i=1}^{N}\frac{1}{r_i^2}\,\mathbf{u}_i^{\perp}\left(\mathbf{u}_i^{\perp}\right)^{\top}$.
Geometry
$\mathbf{I}^{-1}(\mathbf{p})$ is the covariance ellipse of the CRB. Its eigenvectors are determined by the spread of bearing directions: if all $\mathbf{u}_i^{\perp}$ are nearly parallel (sensors collinear with the source), the FIM is nearly rank-one and the CRB ellipse is extremely elongated --- the classic bearings-only observability problem. This is the same geometry used in bearings-only and AoA-based localization in wireless positioning.
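The geometry is easy to quantify through the conditioning of the FIM. The noise level and sensor layouts below are hypothetical examples:

```python
import numpy as np

# Sketch (assumed sigma = 0.02 rad and hypothetical sensor layouts): the
# FIM conditioning degrades when sensors are nearly collinear with the source.
def bearings_fim(source, sensors, sigma):
    info = np.zeros((2, 2))
    for p in sensors:
        d = source - p
        r = np.linalg.norm(d)
        u_perp = np.array([-d[1], d[0]]) / r  # unit vector perpendicular to the LOS
        info += np.outer(u_perp, u_perp) / (sigma**2 * r**2)
    return info

src = np.zeros(2)
spread = [np.array(p) for p in [(10.0, 0.0), (0.0, 10.0), (-7.0, -7.0)]]
near_collinear = [np.array(p) for p in [(10.0, 0.5), (20.0, 0.5), (30.0, 0.5)]]

cond_spread = np.linalg.cond(bearings_fim(src, spread, 0.02))
cond_collinear = np.linalg.cond(bearings_fim(src, near_collinear, 0.02))
```

A well-spread geometry gives a nearly isotropic FIM; the near-collinear one gives a condition number orders of magnitude larger, i.e. a highly elongated CRB ellipse.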
ex-ch05-17
Challenge: Let $X_1,\dots,X_n$ be i.i.d. $\mathrm{Uniform}(0,\theta)$. Show that the support of $p(x;\theta)$ depends on $\theta$, hence the CRB does not apply. Find the MVUE of $\theta$ and show that its variance is $\Theta(1/n^2)$ --- strictly faster than the parametric rate $\Theta(1/n)$.
Use the order statistic $M_n = \max_i X_i$.
Compute $\mathbb{E}[M_n]$ and adjust for unbiasedness.
Regularity fails
The support $[0,\theta]$ depends on $\theta$, so differentiation under the integral sign fails, and the regularity condition used to derive the CRB does not hold.
MVUE
$M_n$ has CDF $(m/\theta)^n$, so pdf $n m^{n-1}/\theta^n$ on $[0,\theta]$. $\mathbb{E}[M_n] = \frac{n}{n+1}\theta$. The unbiased estimator $\hat{\theta} = \frac{n+1}{n}M_n$ is a function of the complete sufficient statistic $M_n$, hence MVUE.
Variance is $\Theta(1/n^2)$
$\mathbb{E}[M_n^2] = \frac{n}{n+2}\theta^2$, so $\operatorname{Var}(M_n) = \frac{n\theta^2}{(n+1)^2(n+2)}$. Then $\operatorname{Var}(\hat{\theta}) = \left(\frac{n+1}{n}\right)^{\!2}\operatorname{Var}(M_n) = \frac{\theta^2}{n(n+2)}$. The rate is $\Theta(1/n^2)$, better than the $\Theta(1/n)$ that the CRB would suggest were it applicable --- a concrete reason why the regularity condition matters.
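A simulation confirms both the unbiasedness and the $1/n^2$ rate; the parameters below are assumed example values:

```python
import numpy as np

# Monte Carlo sketch (assumed theta = 3.0, n = 40): the MVUE
# ((n+1)/n) * max(X_i) is unbiased with variance theta^2/(n*(n+2)).
rng = np.random.default_rng(6)
theta, n, trials = 3.0, 40, 400_000

m = rng.uniform(0, theta, size=(trials, n)).max(axis=1)
theta_hat = (n + 1) / n * m

bias = theta_hat.mean() - theta  # ~ 0
var_hat = theta_hat.var()        # ~ theta^2/(n*(n+2)), an O(1/n^2) quantity
```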
ex-ch05-18
Challenge: Show that in the exponential family $p(\mathbf{x};\boldsymbol{\eta}) = h(\mathbf{x})\exp\!\left(\boldsymbol{\eta}^{\top}\mathbf{T}(\mathbf{x}) - A(\boldsymbol{\eta})\right)$, the Fisher information matrix can be written as $\mathbf{I}(\boldsymbol{\eta}) = \nabla^2 A(\boldsymbol{\eta}) = \operatorname{Cov}(\mathbf{T}(\mathbf{X}))$.
Differentiate $\log p$ to get the score.
Use that $\mathbb{E}[\mathbf{T}(\mathbf{X})] = \nabla A(\boldsymbol{\eta})$ in canonical form.
Score
$\nabla_{\boldsymbol{\eta}}\log p(\mathbf{x};\boldsymbol{\eta}) = \mathbf{T}(\mathbf{x}) - \nabla A(\boldsymbol{\eta})$, using that in canonical form $\mathbb{E}[\mathbf{T}(\mathbf{X})] = \nabla A(\boldsymbol{\eta})$, so the score is zero-mean.
Covariance of the score
$\mathbf{I}(\boldsymbol{\eta}) = \operatorname{Cov}(\nabla_{\boldsymbol{\eta}}\log p) = \operatorname{Cov}(\mathbf{T}(\mathbf{X})) = \nabla^2 A(\boldsymbol{\eta})$.
ex-ch05-19
Medium: Verify the CRB numerically: generate Monte Carlo realizations of the matched-filter estimator $\hat{A}$ under known $s_n$, true amplitude $A$, and noise variance $\sigma^2$. Compare the empirical variance to the CRB $\sigma^2/\sum_n s_n^2$.
The companion Python script performs this experiment.
Expected: empirical variance within a few percent of the CRB.
Theoretical variance
$\operatorname{Var}(\hat{A}) = \frac{\sigma^2}{\sum_n s_n^2}$, from Exercise 7.
Numerical check
Running the supplement yields an empirical variance consistently within 5% of the CRB $\sigma^2/\sum_n s_n^2$. The matched filter sits on the CRB, in line with the efficiency proof of Exercise 7.
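A minimal sketch of the experiment. The companion script's exact settings are not reproduced here; the parameters below are assumptions:

```python
import numpy as np

# Assumed settings: A = 1, N = 100, sigma2 = 1, s_n = cos(2*pi*0.1*n),
# 200k Monte Carlo trials.
rng = np.random.default_rng(7)
A, N, sigma2, trials = 1.0, 100, 1.0, 200_000

s = np.cos(2 * np.pi * 0.1 * np.arange(N))
x = A * s + rng.normal(0, np.sqrt(sigma2), size=(trials, N))

a_hat = x @ s / (s @ s)  # one matched-filter estimate per trial
crb = sigma2 / (s @ s)   # = 0.02 for this signal

rel_gap = abs(a_hat.var() - crb) / crb  # should be within a few percent
```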
ex-ch05-20
Hard: For $X_1,\dots,X_n$ i.i.d. $\mathcal{N}(\theta, \theta^2)$, $\theta > 0$ (both mean and variance tied to the same parameter $\theta$, the "coefficient-of-variation 1" family), derive the Fisher information and the CRB. Is the sample mean efficient?
The log-density depends on $\theta$ through both the exponent and the normalizing factor $1/\sqrt{2\pi\theta^2}$.
Work carefully with the chain rule.
Log-density
$\log p(x;\theta) = -\frac{1}{2}\log(2\pi) - \log\theta - \frac{(x-\theta)^2}{2\theta^2}$.
Score
$\frac{\partial}{\partial\theta}\log p(x;\theta) = -\frac{1}{\theta} + \frac{x-\theta}{\theta^2} + \frac{(x-\theta)^2}{\theta^3}$.
Fisher information
Taking the second derivative and expectation, after algebra (using $\mathbb{E}[(X-\theta)^3] = 0$ and $\mathbb{E}[(X-\theta)^4] = 3\theta^4$), $I_1(\theta) = \frac{3}{\theta^2}$. So for $n$ i.i.d. samples, $I_n(\theta) = \frac{3n}{\theta^2}$ and $\operatorname{Var}(\hat{\theta}) \ge \frac{\theta^2}{3n}$.
Sample mean is NOT efficient
$\bar{X}$ is unbiased with variance $\frac{\theta^2}{n}$, strictly larger than $\frac{\theta^2}{3n}$. The sample mean ignores the variance information. The efficient estimator exploits both moments. Lesson: parameter sharing changes which estimator is efficient.
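The value $I_1(\theta) = 3/\theta^2$ can be checked from the score's empirical variance; the parameter below is an assumed example value:

```python
import numpy as np

# Sketch (assumed theta = 2.0): the empirical variance of the score of the
# N(theta, theta^2) family matches I(theta) = 3/theta^2.
rng = np.random.default_rng(8)
theta = 2.0
x = rng.normal(theta, theta, size=2_000_000)

z = (x - theta) / theta
score = (-1 + z + z**2) / theta  # = -1/theta + (x-theta)/theta^2 + (x-theta)^2/theta^3
fisher_mc = score.var()          # ~ 3/theta^2
```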
ex-ch05-21
Medium: Consider the linear Gaussian model $\mathbf{x} = \mathbf{H}\boldsymbol{\theta} + \mathbf{w}$, $\mathbf{H} \in \mathbb{R}^{N\times p}$ (full column rank), $\mathbf{w} \sim \mathcal{N}(\mathbf{0}, \sigma^2\mathbf{I})$, $\sigma^2$ known. Show that the least-squares estimator $\hat{\boldsymbol{\theta}} = (\mathbf{H}^{\top}\mathbf{H})^{-1}\mathbf{H}^{\top}\mathbf{x}$ is the MVUE and compute its covariance.
Compute the FIM from the log-density.
Recognize the exponential family form.
FIM
$\ell(\boldsymbol{\theta}) = -\frac{1}{2\sigma^2}\lVert\mathbf{x} - \mathbf{H}\boldsymbol{\theta}\rVert^2 + \text{const}$. $\mathbf{I}(\boldsymbol{\theta}) = \frac{\mathbf{H}^{\top}\mathbf{H}}{\sigma^2}$, so the CRB is $\sigma^2(\mathbf{H}^{\top}\mathbf{H})^{-1}$.
Covariance of LS
$\operatorname{Cov}(\hat{\boldsymbol{\theta}}) = (\mathbf{H}^{\top}\mathbf{H})^{-1}\mathbf{H}^{\top}(\sigma^2\mathbf{I})\mathbf{H}(\mathbf{H}^{\top}\mathbf{H})^{-1} = \sigma^2(\mathbf{H}^{\top}\mathbf{H})^{-1}$, matching the CRB. The LS estimator is efficient, hence MVUE (and the MLE coincides with it in the Gaussian case).
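The closed-form covariance can be checked against a Monte Carlo estimate; the design matrix and noise level below are hypothetical:

```python
import numpy as np

# Sketch (assumed small 4x2 design matrix, sigma2 = 0.5): the Monte Carlo
# covariance of the LS estimator matches sigma2 * (H^T H)^{-1}.
rng = np.random.default_rng(9)
sigma2, trials = 0.5, 200_000
H = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
theta = np.array([1.0, -0.5])

crb = sigma2 * np.linalg.inv(H.T @ H)  # CRB = theoretical Cov(theta_hat)

x = theta @ H.T + rng.normal(0, np.sqrt(sigma2), size=(trials, 4))
theta_hat = x @ H @ np.linalg.inv(H.T @ H)  # one LS estimate per row

emp_cov = np.cov(theta_hat.T)  # ~ crb, entry by entry
```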