Exercises
ex-ch11-01
Easy: Let $X_n = X/n$, where $X$ is a random variable with $\mathbb{E}|X|^r < \infty$ for all $r \ge 1$. Show that $X_n \to 0$ a.s., $X_n \to 0$ in $L^r$ for all $r \ge 1$, and $X_n \xrightarrow{P} 0$.
For a.s. convergence, note that $X_n(\omega) = X(\omega)/n \to 0$ for every $\omega$.
Almost sure convergence
For every $\omega$, $X_n(\omega) = X(\omega)/n \to 0$, since $X(\omega)$ is a fixed finite number. This is actually sure convergence (even stronger than a.s.).
$L^r$ convergence
$\mathbb{E}|X_n|^r = \mathbb{E}|X|^r/n^r \to 0$ for all $r \ge 1$.
In probability (follows from a.s. or $L^r$)
Since a.s. convergence implies convergence in probability, the result follows. Alternatively, $L^r$ convergence implies convergence in probability (via Markov's inequality).
ex-ch11-02
Easy: Let $X_1, \dots, X_n$ be i.i.d. with mean $\mu$ and finite variance $\sigma^2$. Use the WLLN to show that $\bar{X}_n \xrightarrow{P} \mu$.
Compute $\mathbb{E}[X_i]$ and $\mathrm{Var}(X_i)$ for the given distribution.
Parameters
$\mu = \mathbb{E}[X_i]$, $\sigma^2 = \mathrm{Var}(X_i) < \infty$.
Apply WLLN
The $X_i$ are i.i.d. with finite variance, so by the WLLN: $\bar{X}_n \xrightarrow{P} \mu$. Explicitly, by Chebyshev's inequality: $P(|\bar{X}_n - \mu| \ge \epsilon) \le \sigma^2/(n\epsilon^2) \to 0$ for every $\epsilon > 0$.
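The WLLN can be checked numerically. A minimal sketch, using $\mathrm{Exp}(1)$ variables as an illustrative choice (so $\mu = 1$; this distribution is an assumption, not fixed by the exercise):

```python
import random

def sample_mean(n, rng):
    # Sample mean of n i.i.d. Exp(1) draws (illustrative choice: mu = 1).
    return sum(rng.expovariate(1.0) for _ in range(n)) / n

rng = random.Random(0)
for n in (10, 1000, 100000):
    # The deviation |mean - 1| should shrink roughly like 1/sqrt(n).
    print(n, sample_mean(n, rng))
```

Rerunning with larger $n$ shows the sample mean settling near $\mu = 1$, with fluctuations of order $n^{-1/2}$.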
ex-ch11-03
Easy: If $X_n \to X$ in $L^2$, show that $\mathbb{E}[X_n] \to \mathbb{E}[X]$ and $\mathbb{E}[X_n^2] \to \mathbb{E}[X^2]$.
Use the triangle inequality: $|\mathbb{E}[X_n] - \mathbb{E}[X]| \le \mathbb{E}|X_n - X|$.
For the second moment, write $X_n^2 - X^2 = (X_n - X)(X_n + X)$ and use Cauchy-Schwarz.
Mean convergence
$|\mathbb{E}[X_n] - \mathbb{E}[X]| \le \mathbb{E}|X_n - X| \le \big(\mathbb{E}[(X_n - X)^2]\big)^{1/2} \to 0$ by Jensen (or Cauchy-Schwarz with the constant 1).
Second moment convergence
$|\mathbb{E}[X_n^2] - \mathbb{E}[X^2]| = |\mathbb{E}[(X_n - X)(X_n + X)]| \le \big(\mathbb{E}[(X_n - X)^2]\big)^{1/2}\big(\mathbb{E}[(X_n + X)^2]\big)^{1/2}$. The first factor $\to 0$. The second factor is bounded (since $L^2$ convergence implies $\mathbb{E}[X_n^2]$ is bounded). Hence the product $\to 0$.
ex-ch11-04
Medium: Let $X_1, \dots, X_n$ be i.i.d. with $\mathbb{E}[X_i] = \mu$ and $\mathrm{Var}(X_i) = \sigma^2$. Using the CLT, find an approximate expression for
$P(|\bar{X}_n - \mu| \le \epsilon)$ in terms of the $\Phi$ function. How many samples are needed so that $P(|\bar{X}_n - \mu| \le \epsilon) \ge 0.99$ when $\epsilon = 0.1\sigma$?
Standardize: $\sqrt{n}(\bar{X}_n - \mu)/\sigma \approx N(0, 1)$ for large $n$.
For the second part, solve $2\Phi(\epsilon\sqrt{n}/\sigma) - 1 \ge 0.99$.
CLT approximation
$P(|\bar{X}_n - \mu| \le \epsilon) = P\big(|\sqrt{n}(\bar{X}_n - \mu)/\sigma| \le \epsilon\sqrt{n}/\sigma\big) \approx 2\Phi(\epsilon\sqrt{n}/\sigma) - 1.$
Sample size for confidence
We need $2\Phi(\epsilon\sqrt{n}/\sigma) - 1 \ge 0.99$. By the CLT approximation: $\Phi(\epsilon\sqrt{n}/\sigma) \ge 0.995$. So $\epsilon\sqrt{n}/\sigma \ge z_{0.995} \approx 2.576$, giving $\sqrt{n} \ge 2.576\,\sigma/\epsilon$; with $\epsilon = 0.1\sigma$ this is $\sqrt{n} \ge 25.76$, hence $n \ge 663.5$ and $n = 664$ suffices.
Compare with the Chebyshev bound: $P(|\bar{X}_n - \mu| > \epsilon) \le \sigma^2/(n\epsilon^2) \le 0.01$ requires $n \ge 100\,\sigma^2/\epsilon^2 = 10^4$. The CLT gives a factor-15 improvement.
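The sample-size calculation can be reproduced with a short script using the standard library's normal quantile function; the tolerance $\epsilon = 0.1\sigma$ and confidence level $0.99$ used here are illustrative assumptions:

```python
import math
from statistics import NormalDist

def clt_sample_size(eps_over_sigma, delta):
    # Smallest n with 2*Phi(eps*sqrt(n)/sigma) - 1 >= 1 - delta,
    # i.e. sqrt(n) >= z_{1-delta/2} / (eps/sigma).
    z = NormalDist().inv_cdf(1.0 - delta / 2.0)
    return math.ceil((z / eps_over_sigma) ** 2)

def chebyshev_sample_size(eps_over_sigma, delta):
    # Chebyshev: sigma^2/(n*eps^2) <= delta  =>  n >= 1/(delta*(eps/sigma)^2).
    return math.ceil(1.0 / (delta * eps_over_sigma ** 2))

# eps = 0.1*sigma and delta = 0.01 (illustrative values).
print(clt_sample_size(0.1, 0.01), chebyshev_sample_size(0.1, 0.01))
```

The ratio of the two sample sizes is about 15, independent of $\epsilon$ and $\sigma$.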
ex-ch11-05
Medium: Prove that convergence in probability implies the existence of a subsequence that converges almost surely. That is, if $X_n \xrightarrow{P} X$, then there exists a subsequence $n_1 < n_2 < \cdots$ with $X_{n_k} \to X$ a.s.
Choose $n_k$ such that $P(|X_{n_k} - X| > 2^{-k}) \le 2^{-k}$.
Apply the first Borel-Cantelli lemma.
Subsequence selection
Since $P(|X_n - X| > 2^{-k}) \to 0$ as $n \to \infty$ for each $k$, we can find $n_k$ (with $n_k > n_{k-1}$) such that $P(|X_{n_k} - X| > 2^{-k}) \le 2^{-k}$.
Borel-Cantelli
Let $A_k = \{|X_{n_k} - X| > 2^{-k}\}$. Then $\sum_k P(A_k) \le \sum_k 2^{-k} < \infty$. By Borel-Cantelli: $P(A_k \text{ i.o.}) = 0$. This means that a.s., for all sufficiently large $k$, $|X_{n_k} - X| \le 2^{-k}$. Hence $X_{n_k} \to X$ a.s.
ex-ch11-06
Medium: Let $X_1, \dots, X_n$ be i.i.d. $\mathrm{Exp}(1)$ and let $S_n = X_1 + \cdots + X_n$. Use the CLT to approximate $P(S_n \le x)$. Compare with the exact gamma CDF.
For $X_i \sim \mathrm{Exp}(1)$: $\mu = 1$, $\sigma^2 = 1$.
CLT approximation
$S_n$ has mean $n$ and standard deviation $\sqrt{n}$. So: $P(S_n \le x) \approx \Phi\big((x - n)/\sqrt{n}\big)$.
Exact value
$S_n \sim \mathrm{Gamma}(n, 1)$, so the exact value is $P(S_n \le x) = \frac{1}{(n-1)!}\int_0^x t^{n-1} e^{-t}\,dt$. Numerical evaluation shows the CLT approximation error is small; the approximation is quite good for moderate $n$.
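The comparison needs no external libraries: for rate-1 exponentials, $P(S_n \le x)$ equals the probability that a $\mathrm{Poisson}(x)$ variable is at least $n$. A sketch with the illustrative values $n = 100$, $x = 110$ (assumptions, not from the exercise):

```python
import math

def erlang_cdf(n, x):
    # Exact: P(S_n <= x) for S_n a sum of n i.i.d. Exp(1) variables,
    # via the identity P(S_n <= x) = P(N >= n) with N ~ Poisson(x).
    term = math.exp(-x)   # Poisson pmf at k = 0
    cdf = term
    for k in range(1, n):
        term *= x / k     # pmf at k, computed iteratively to avoid overflow
        cdf += term
    return 1.0 - cdf      # 1 - P(N <= n-1)

def clt_cdf(n, x):
    # CLT approximation: Phi((x - n) / sqrt(n)).
    return 0.5 * (1.0 + math.erf((x - n) / math.sqrt(n) / math.sqrt(2.0)))

print(erlang_cdf(100, 110.0), clt_cdf(100, 110.0))
```

The two values agree to within a few hundredths, consistent with the Berry-Esseen rate $O(n^{-1/2})$.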
ex-ch11-07
Medium: Let $X_1, \dots, X_n$ be i.i.d. $\mathrm{Uniform}(0, \theta)$ with $\theta$ unknown. The MLE is $\hat{\theta}_n = \max_i X_i$. Show that $n(\theta - \hat{\theta}_n) \xrightarrow{d} \mathrm{Exp}(1/\theta)$. Why doesn't the CLT apply here?
Find $P(\hat{\theta}_n \le x)$ for $0 \le x \le \theta$.
Compute $P(n(\theta - \hat{\theta}_n) > t)$ for $t \ge 0$.
CDF of the maximum
$P(\hat{\theta}_n \le x) = (x/\theta)^n$ for $0 \le x \le \theta$.
Rescaled limit
$P\big(n(\theta - \hat{\theta}_n) > t\big) = P(\hat{\theta}_n < \theta - t/n) = \big(1 - \tfrac{t}{n\theta}\big)^n \to e^{-t/\theta}$. This is the survival function of an exponential distribution with mean $\theta$.
Why CLT fails
The MLE converges at rate $n^{-1}$ (not $n^{-1/2}$), and its limiting distribution is exponential, not Gaussian. The CLT does not apply because $\hat{\theta}_n$ is not a smooth function of the sample mean; it is the maximum, which depends on the extreme order statistic.
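A quick simulation supports the limit: the mean of $n(\theta - \hat{\theta}_n)$ should approach $\theta$, the mean of the limiting exponential. A sketch with $\theta = 1$ (all values illustrative):

```python
import random

def rescaled_max_gap(n, reps, theta, seed=0):
    # Simulate n*(theta - max_i X_i) for Uniform(0, theta) samples and
    # return its empirical mean; the limit distribution has mean theta.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        m = max(rng.uniform(0.0, theta) for _ in range(n))
        total += n * (theta - m)
    return total / reps

print(rescaled_max_gap(n=2000, reps=2000, theta=1.0))
```

The exact mean is $n\theta/(n+1)$, already within $0.1\%$ of $\theta$ at $n = 2000$.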
ex-ch11-08
Medium: (Delta method) Let $X_1, \dots, X_n$ be i.i.d. with mean $\mu \ne 0$ and variance $\sigma^2$. Find the asymptotic distribution of $\bar{X}_n^2$.
$g'(x) = 2x$ evaluated at $x = \mu$ gives the variance factor.
Apply CLT
$\sqrt{n}(\bar{X}_n - \mu) \xrightarrow{d} N(0, \sigma^2)$.
Delta method
$g(x) = x^2$, $g'(\mu) = 2\mu$. By the delta method: $\sqrt{n}(\bar{X}_n^2 - \mu^2) \xrightarrow{d} N(0, 4\mu^2\sigma^2)$. Equivalently, $\bar{X}_n^2 \approx N(\mu^2, 4\mu^2\sigma^2/n)$ for large $n$.
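The delta-method variance can be verified by simulation. The sketch below uses $\mathrm{Uniform}(0, 1)$ data (an illustrative assumption), for which $\mu = 1/2$ and $\sigma^2 = 1/12$, so the predicted asymptotic variance is $4\mu^2\sigma^2 = 1/12 \approx 0.083$:

```python
import random

def delta_method_check(n, reps, seed=0):
    # Empirical variance of sqrt(n)*(Xbar^2 - mu^2) for Uniform(0,1) data;
    # the delta method predicts it approaches 4*mu^2*sigma^2 = 1/12.
    rng = random.Random(seed)
    mu = 0.5
    vals = []
    for _ in range(reps):
        xbar = sum(rng.random() for _ in range(n)) / n
        vals.append((n ** 0.5) * (xbar ** 2 - mu ** 2))
    mean = sum(vals) / reps
    return sum((v - mean) ** 2 for v in vals) / reps

print(delta_method_check(n=400, reps=4000))
```

The printed value sits near $1/12$, with Monte Carlo noise of a few percent.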
ex-ch11-09
Hard: (Lindeberg CLT) Let $X_1, X_2, \dots$ be independent (not necessarily identically distributed) with $\mathbb{E}[X_i] = 0$, $\mathrm{Var}(X_i) = \sigma_i^2 < \infty$, and $s_n^2 = \sum_{i=1}^n \sigma_i^2$. State the Lindeberg condition and show that it is sufficient for $\frac{1}{s_n}\sum_{i=1}^n X_i \xrightarrow{d} N(0, 1)$.
The Lindeberg condition: for every $\epsilon > 0$, $\frac{1}{s_n^2}\sum_{i=1}^n \mathbb{E}\big[X_i^2 \mathbf{1}\{|X_i| > \epsilon s_n\}\big] \to 0$.
Lindeberg condition
For every $\epsilon > 0$: $\lim_{n \to \infty} \frac{1}{s_n^2}\sum_{i=1}^n \mathbb{E}\big[X_i^2 \mathbf{1}\{|X_i| > \epsilon s_n\}\big] = 0$. This says no single summand dominates the sum.
Proof sketch (characteristic function method)
The characteristic function of $S_n/s_n = \frac{1}{s_n}\sum_{i=1}^n X_i$ is $\prod_{i=1}^n \varphi_i(t/s_n)$. Taylor-expand each factor: $\varphi_i(t/s_n) = 1 - \frac{\sigma_i^2 t^2}{2 s_n^2} + r_i(t)$, where the remainder $r_i(t)$ is controlled by the Lindeberg condition. The product converges to $e^{-t^2/2}$ by showing $\sum_i |r_i(t)| \to 0$. The full argument is in Billingsley (1995), §27.
ex-ch11-10
Hard: Let $X_1, \dots, X_n$ be i.i.d. with mean $\mu$ and variance $\sigma^2$. Define $\hat{\theta}_n = g(\bar{X}_n)$ for a smooth function $g$. Using the delta method, find the asymptotic distribution of $\hat{\theta}_n$ and compute the asymptotic variance of $\hat{\theta}_n$ as an estimator of $\theta = g(\mu)$.
$g$ is differentiable at $\mu$, so the delta method applies.
The asymptotic variance of $\hat{\theta}_n$ is $g'(\mu)^2\sigma^2/n$.
Delta method
$g(\bar{X}_n) \approx g(\mu) + g'(\mu)(\bar{X}_n - \mu)$. By CLT + delta method: $\sqrt{n}\big(g(\bar{X}_n) - g(\mu)\big) \xrightarrow{d} N\big(0,\ g'(\mu)^2\sigma^2\big)$.
Asymptotic variance and bias
The asymptotic variance of $\hat{\theta}_n$ as an estimator of $\theta$ is $g'(\mu)^2\sigma^2/n$. Note that $\hat{\theta}_n$ is biased: $\mathbb{E}[g(\bar{X}_n)] = g(\mu) + \frac{g''(\mu)\sigma^2}{2n} + o(1/n)$. The bias vanishes at rate $1/n$, which is faster than the standard deviation ($\propto n^{-1/2}$), so the bias is asymptotically negligible.
ex-ch11-11
Hard: (Slutsky application) Let $X_1, \dots, X_n$ be i.i.d. with $\mathbb{E}[X_i] = \mu$ and $\mathrm{Var}(X_i) = \sigma^2 \in (0, \infty)$. Define $\hat{\sigma}_n^2 = \frac{1}{n}\sum_{i=1}^n (X_i - \bar{X}_n)^2$. Show that the "studentized" statistic satisfies $\sqrt{n}(\bar{X}_n - \mu)/\hat{\sigma}_n \xrightarrow{d} N(0, 1)$.
Show $\hat{\sigma}_n^2 \xrightarrow{P} \sigma^2$ using the WLLN.
Apply Slutsky to the ratio.
$\hat{\sigma}_n^2$ is consistent
$\hat{\sigma}_n^2 = \frac{1}{n}\sum_{i=1}^n X_i^2 - \bar{X}_n^2$. By WLLN: $\frac{1}{n}\sum_{i=1}^n X_i^2 \xrightarrow{P} \mathbb{E}[X_1^2]$ and $\bar{X}_n \xrightarrow{P} \mu$. So $\hat{\sigma}_n^2 \xrightarrow{P} \mathbb{E}[X_1^2] - \mu^2 = \sigma^2$.
Apply the continuous mapping theorem and Slutsky
Since $\hat{\sigma}_n^2 \xrightarrow{P} \sigma^2$ and $x \mapsto \sqrt{x}$ is continuous at $\sigma^2 > 0$, by the continuous mapping theorem $\hat{\sigma}_n \xrightarrow{P} \sigma$. Write $\frac{\sqrt{n}(\bar{X}_n - \mu)}{\hat{\sigma}_n} = \frac{\sqrt{n}(\bar{X}_n - \mu)}{\sigma} \cdot \frac{\sigma}{\hat{\sigma}_n}$. By the CLT, the first factor $\xrightarrow{d} N(0, 1)$. By the above, the second factor $\xrightarrow{P} 1$. By Slutsky: $\frac{\sqrt{n}(\bar{X}_n - \mu)}{\hat{\sigma}_n} \xrightarrow{d} N(0, 1)$.
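The studentized statistic can be checked by simulation: its empirical mean and variance should be close to $0$ and $1$. The sketch uses $\mathrm{Exp}(1)$ data (an illustrative choice, so $\mu = 1$):

```python
import random

def studentized(n, rng, mu=1.0):
    # One draw of sqrt(n)*(Xbar - mu)/sigma_hat for Exp(1) data (mu = 1).
    xs = [rng.expovariate(1.0) for _ in range(n)]
    xbar = sum(xs) / n
    s2 = sum((x - xbar) ** 2 for x in xs) / n
    return (n ** 0.5) * (xbar - mu) / (s2 ** 0.5)

rng = random.Random(1)
draws = [studentized(500, rng) for _ in range(2000)]
m = sum(draws) / len(draws)
v = sum((d - m) ** 2 for d in draws) / len(draws)
print(m, v)  # should be near 0 and 1 for large n
```

The small residual skew in the empirical distribution reflects the finite-$n$ error, which vanishes as $n \to \infty$.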
ex-ch11-12
Hard: (Berry-Esseen application) Let $X_1, \dots, X_n$ be i.i.d. $\mathrm{Bernoulli}(p)$. Using the Berry-Esseen theorem, find an upper bound on the error $\sup_x \big|P\big(\sqrt{n}(\bar{X}_n - p)/\sigma \le x\big) - \Phi(x)\big|$. Evaluate it for $p = 1/2$, $n = 100$.
Compute $\rho = \mathbb{E}|X_i - p|^3$ for Bernoulli.
Compute the third moment
$\rho = \mathbb{E}|X_i - p|^3 = p(1-p)^3 + (1-p)p^3 = p(1-p)\big[(1-p)^2 + p^2\big]$. For Bernoulli($p$): $\sigma^2 = p(1-p)$, $\sigma^3 = \big(p(1-p)\big)^{3/2}$.
Berry-Esseen bound
Error $\le \dfrac{C\rho}{\sigma^3\sqrt{n}} = \dfrac{C\big[(1-p)^2 + p^2\big]}{\sqrt{p(1-p)}\,\sqrt{n}}$, with $C \le 0.4748$.
For $p = 1/2$ and $n = 100$ this gives $0.4748 \cdot 0.5/(0.5 \cdot 10) \approx 0.047$, so the CDF error is at most about $0.05$, uniformly in $x$. This confirms that the CLT is a reasonable approximation for this case.
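The bound is a one-liner to evaluate. The sketch below uses Shevtsova's constant $C = 0.4748$ and the illustrative values $p = 1/2$, $n = 100$:

```python
import math

def berry_esseen_bound(p, n, C=0.4748):
    # Bernoulli(p): sigma^2 = p(1-p), rho = E|X - p|^3 = p(1-p)[(1-p)^2 + p^2].
    # Berry-Esseen: sup_x |F_n(x) - Phi(x)| <= C * rho / (sigma^3 * sqrt(n)).
    sigma2 = p * (1.0 - p)
    rho = sigma2 * ((1.0 - p) ** 2 + p ** 2)
    return C * rho / (sigma2 ** 1.5 * math.sqrt(n))

print(berry_esseen_bound(0.5, 100))
```

Note the bound degrades as $p \to 0$ or $p \to 1$, where the Bernoulli distribution becomes highly skewed.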
ex-ch11-13
Challenge: (Multivariate delta method) Let $(X_i, Y_i)$ be i.i.d. in $\mathbb{R}^2$ with mean $(\mu_X, \mu_Y)$, $\mu_X \ne 0$, and covariance matrix $\Sigma$. Consider the ratio statistic $R_n = \bar{Y}_n/\bar{X}_n$. Find its asymptotic distribution.
Use $g(x, y) = y/x$ with $\nabla g = (-y/x^2,\ 1/x)^\top$.
Multivariate CLT
$\sqrt{n}\big((\bar{X}_n, \bar{Y}_n) - (\mu_X, \mu_Y)\big) \xrightarrow{d} N(0, \Sigma)$.
Apply the multivariate delta method
$g(x, y) = y/x$. The gradient at $(\mu_X, \mu_Y)$ is $\nabla g = \big(-\mu_Y/\mu_X^2,\ 1/\mu_X\big)^\top$.
The asymptotic variance is: $\nabla g^\top \Sigma\, \nabla g = \frac{\mu_Y^2}{\mu_X^4}\sigma_X^2 - \frac{2\mu_Y}{\mu_X^3}\sigma_{XY} + \frac{1}{\mu_X^2}\sigma_Y^2$, so $\sqrt{n}\big(R_n - \mu_Y/\mu_X\big) \xrightarrow{d} N\big(0,\ \nabla g^\top \Sigma\, \nabla g\big)$.
ex-ch11-14
Easy: Show that if $X_n \xrightarrow{d} c$ where $c$ is a constant, then $X_n \xrightarrow{P} c$.
The CDF of the constant $c$ is $F(x) = \mathbf{1}\{x \ge c\}$, which is continuous everywhere except at $x = c$.
Use the CDF convergence
For $\epsilon > 0$: $P(|X_n - c| > \epsilon) \le P(X_n \le c - \epsilon) + P(X_n > c + \epsilon) = F_n(c - \epsilon) + 1 - F_n(c + \epsilon)$. Since $c - \epsilon$ and $c + \epsilon$ are continuity points of $F$: $F_n(c - \epsilon) \to F(c - \epsilon) = 0$ and $1 - F_n(c + \epsilon) \to 1 - F(c + \epsilon) = 0$. Hence $X_n \xrightarrow{P} c$.
ex-ch11-15
Medium: (Monte Carlo integration) To compute $I = \int_0^1 f(x)\,dx$ for a bounded function $f$, we generate $U_1, \dots, U_n$ i.i.d. $\mathrm{Unif}(0, 1)$ and estimate $\hat{I}_n = \frac{1}{n}\sum_{i=1}^n f(U_i)$.
(a) Justify why $\hat{I}_n \to I$ and identify the mode of convergence. (b) Use the CLT to find an approximate 95% confidence interval for $I$ when $n$ is large.
Note $\mathbb{E}[f(U_i)] = \int_0^1 f(x)\,dx = I$ where $U_i \sim \mathrm{Unif}(0, 1)$, and let $\sigma_f^2 = \mathrm{Var}(f(U_i))$.
For the confidence interval, you need an estimate of $\sigma_f^2$.
(a) Convergence
$f(U_1), f(U_2), \dots$ are i.i.d. with $\mathbb{E}[f(U_i)] = I$ and finite variance (since $f$ is bounded on $[0, 1]$). By the SLLN: $\hat{I}_n \to I$ a.s.
(b) Confidence interval
By the CLT, $\hat{I}_n \approx N(I, \sigma_f^2/n)$ for large $n$, with $\sigma_f^2$ estimated by $\hat{\sigma}_f^2 = \frac{1}{n}\sum_{i=1}^n \big(f(U_i) - \hat{I}_n\big)^2$.
The 95% CI is $\hat{I}_n \pm 1.96\,\hat{\sigma}_f/\sqrt{n}$.
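A self-contained implementation of the estimator and its CLT interval; the integrand $f(x) = e^{-x^2}$ is an illustrative assumption (its integral over $[0, 1]$ is about $0.7468$):

```python
import math
import random

def mc_integral_ci(f, n, seed=0, z=1.96):
    # Monte Carlo estimate of integral_0^1 f(x) dx with a CLT-based 95% CI.
    rng = random.Random(seed)
    ys = [f(rng.random()) for _ in range(n)]
    ihat = sum(ys) / n
    s2 = sum((y - ihat) ** 2 for y in ys) / n   # plug-in variance estimate
    half = z * math.sqrt(s2 / n)
    return ihat, (ihat - half, ihat + half)

# Illustrative integrand: f(x) = exp(-x^2).
ihat, (lo, hi) = mc_integral_ci(lambda x: math.exp(-x * x), n=100000)
print(ihat, lo, hi)
```

The interval shrinks at rate $n^{-1/2}$, the characteristic Monte Carlo rate, independent of the dimension of the integral.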
ex-ch11-16
Challenge: (Convergence of moments under CLT) Let $X_1, \dots, X_n$ be i.i.d. with mean $\mu$, variance $\sigma^2$, and $\mathbb{E}|X_1|^{2+\delta} < \infty$ for some $\delta > 0$. Let $Z_n = \sqrt{n}(\bar{X}_n - \mu)$. Show that $\mathbb{E}[Z_n^2] \to \sigma^2$. Does $Z_n \to Z$ in $L^2$ for $Z \sim N(0, \sigma^2)$?
$\mathbb{E}[Z_n^2] = \sigma^2$ exactly (not just asymptotically). Why?
For $L^2$ convergence, you would need $\mathbb{E}[(Z_n - Z)^2] \to 0$, which requires $Z_n$ and $Z$ to be defined on the same space.
Second moment
$\mathbb{E}[Z_n^2] = n\,\mathrm{Var}(\bar{X}_n) = n \cdot \sigma^2/n = \sigma^2$. This is exact for all $n$, not just asymptotic.
$L^2$ convergence?
$L^2$ convergence to $Z$ requires $Z_n$ and $Z$ to be defined on the same probability space. Even then, convergence in distribution plus $\mathbb{E}[Z_n^2] \to \mathbb{E}[Z^2]$ is not sufficient for $L^2$ convergence in general (it suffices when combined with a.s. convergence and uniform integrability of $Z_n^2$, which the finite $(2+\delta)$ moment does provide). So the answer is yes only for suitably constructed versions: by Skorokhod's representation theorem one can place copies of $Z_n$ and $Z$ on a common space where $Z_n \to Z$ a.s., and uniform integrability then upgrades this to $L^2$ convergence; on an arbitrary common space, $Z_n \to Z$ in $L^2$ can fail.