Exercises
ex-ch09-01
Easy: Compute the MGF of $X \sim \mathrm{Poisson}(\lambda)$ directly from the definition $M_X(t) = E[e^{tX}]$.
Write $M_X(t) = \sum_{k=0}^{\infty} e^{tk}\, e^{-\lambda}\lambda^k/k!$.
Factor out $e^{-\lambda}$ and recognize the remaining series as the exponential series.
Expand and simplify
$M_X(t) = e^{-\lambda}\sum_{k=0}^{\infty} \frac{(\lambda e^t)^k}{k!} = e^{-\lambda} e^{\lambda e^t} = e^{\lambda(e^t - 1)}$.
ex-ch09-02
Easy: Show that $\varphi_X(-t) = \overline{\varphi_X(t)}$ for any real-valued random variable $X$.
Use $e^{-itX} = \overline{e^{itX}}$ for real $t$ and $X$.
Direct computation
$\varphi_X(-t) = E[e^{-itX}] = E\big[\overline{e^{itX}}\big] = \overline{E[e^{itX}]} = \overline{\varphi_X(t)}$,
where the third equality uses the fact that expectation commutes with complex conjugation (since it is a real-linear operation on the real and imaginary parts).
ex-ch09-03
Easy: Find the PGF of $X \sim \mathrm{Bin}(n, p)$ and verify that $G_X'(1) = E[X]$.
Start from $G_X(s) = E[s^X]$ and apply the binomial theorem.
Compute the PGF
$G_X(s) = \sum_{k=0}^{n} \binom{n}{k} (ps)^k (1-p)^{n-k} = (1 - p + ps)^n$.
Verify the mean
$G_X'(s) = np(1 - p + ps)^{n-1}$, so $G_X'(1) = np = E[X]$.
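The derivative identity can be spot-checked with a finite difference; a sketch assuming $X \sim \mathrm{Bin}(n, p)$, with illustrative values $n = 10$, $p = 0.3$:

```python
import math

n, p = 10, 0.3  # illustrative
def G(s):
    # PGF of Bin(n, p) computed from the definition E[s^X]
    return sum(math.comb(n, k) * (p * s)**k * (1 - p)**(n - k)
               for k in range(n + 1))

h = 1e-6
deriv = (G(1 + h) - G(1 - h)) / (2 * h)  # central difference for G'(1)
print(deriv, n * p)  # both ~ 3.0
```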
ex-ch09-04
Easy: Compute the first three cumulants of the exponential distribution $\mathrm{Exp}(\lambda)$.
$M_X(t) = \frac{\lambda}{\lambda - t}$ for $t < \lambda$, so $K_X(t) = \log M_X(t) = -\log(1 - t/\lambda)$.
Compute the CGF derivatives
$K_X'(t) = \frac{1}{\lambda - t}$, so $\kappa_1 = K_X'(0) = \frac{1}{\lambda}$. $K_X''(t) = \frac{1}{(\lambda - t)^2}$, so $\kappa_2 = \frac{1}{\lambda^2}$. $K_X'''(t) = \frac{2}{(\lambda - t)^3}$, so $\kappa_3 = \frac{2}{\lambda^3}$.
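The three cumulants can be recovered numerically from the CGF $K(t) = -\log(1 - t/\lambda)$ by standard central-difference formulas (illustrative $\lambda = 2$):

```python
import math

lam = 2.0  # illustrative
K = lambda t: -math.log(1 - t / lam)  # CGF of Exp(lam)

h = 1e-3
k1 = (K(h) - K(-h)) / (2 * h)                            # kappa_1 ~ 1/lam
k2 = (K(h) - 2 * K(0) + K(-h)) / h**2                    # kappa_2 ~ 1/lam^2
k3 = (K(2*h) - 2*K(h) + 2*K(-h) - K(-2*h)) / (2 * h**3)  # kappa_3 ~ 2/lam^3
print(k1, k2, k3)  # ~ 0.5, 0.25, 0.25
```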
ex-ch09-05
Easy: If $X$ has CF $\varphi_X$ and $Y = aX + b$, find $\varphi_Y$.
Use the affine transformation property.
Apply the formula
$\varphi_Y(t) = E[e^{it(aX + b)}] = e^{itb}\, E[e^{i(at)X}] = e^{itb}\, \varphi_X(at)$.
ex-ch09-06
Medium: Let $X_1, \dots, X_n$ be i.i.d. $\mathrm{Exp}(\lambda)$. Use the MGF to find the distribution of $S = X_1 + \cdots + X_n$.
The MGF of $X_i$ is $M_{X_i}(t) = \frac{\lambda}{\lambda - t}$ for $t < \lambda$.
Multiply MGFs and identify the result.
Multiply MGFs
$M_S(t) = \prod_{i=1}^{n} M_{X_i}(t) = \left(\frac{\lambda}{\lambda - t}\right)^{n}$.
This is the MGF of $\mathrm{Gamma}(n, \lambda)$, so $S \sim \mathrm{Gamma}(n, \lambda)$.
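A numerical-integration check that the $\mathrm{Gamma}(n, \lambda)$ density has MGF $(\lambda/(\lambda - t))^n$; the parameters $\lambda = 1$, $n = 3$, $t = 0.5$ are illustrative:

```python
import math

lam, n, t = 1.0, 3, 0.5  # illustrative; need t < lam for the MGF to exist
def f(x):
    # Gamma(n, lam) density: lam^n x^(n-1) e^(-lam x) / (n-1)!
    return lam**n * x**(n - 1) * math.exp(-lam * x) / math.factorial(n - 1)

dx = 0.001
mgf = sum(math.exp(t * (i + 0.5) * dx) * f((i + 0.5) * dx) * dx
          for i in range(100_000))  # midpoint rule on [0, 100]
closed = (lam / (lam - t))**n
print(mgf, closed)  # both ~ 8.0
```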
ex-ch09-07
Medium: Let $X \sim \mathrm{Poisson}(\lambda)$ and $Y \sim \mathrm{Poisson}(\mu)$ be independent. Find the conditional distribution of $X$ given $X + Y = n$.
Use $P(X = k \mid X + Y = n) = \frac{P(X = k)\, P(Y = n - k)}{P(X + Y = n)}$.
You already know $X + Y \sim \mathrm{Poisson}(\lambda + \mu)$.
Apply Bayes
$P(X = k \mid X + Y = n) = \frac{(e^{-\lambda}\lambda^k/k!)\,(e^{-\mu}\mu^{n-k}/(n-k)!)}{e^{-(\lambda+\mu)}(\lambda+\mu)^n/n!} = \binom{n}{k}\left(\frac{\lambda}{\lambda+\mu}\right)^{k}\left(\frac{\mu}{\lambda+\mu}\right)^{n-k}$.
This is $\mathrm{Bin}\!\left(n, \frac{\lambda}{\lambda+\mu}\right)$.
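The pmf identity can be verified exactly in a few lines, assuming the Poisson setup above (illustrative $\lambda = 1.5$, $\mu = 2.5$, $n = 6$):

```python
import math

lam, mu, n = 1.5, 2.5, 6  # illustrative
def pois(k, m):
    return math.exp(-m) * m**k / math.factorial(k)

max_diff = 0.0
for k in range(n + 1):
    cond = pois(k, lam) * pois(n - k, mu) / pois(n, lam + mu)
    binom = (math.comb(n, k) * (lam / (lam + mu))**k
             * (mu / (lam + mu))**(n - k))
    max_diff = max(max_diff, abs(cond - binom))
print(max_diff)  # ~ 0: the conditional law is Bin(n, lam/(lam+mu))
```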
ex-ch09-08
Medium: Find the CF of the Cauchy distribution with location $x_0$ and scale $\gamma$ (PDF $f(x) = \frac{1}{\pi\gamma\,[1 + ((x - x_0)/\gamma)^2]}$) and verify that the MGF does not exist.
The Fourier transform of the standard Cauchy density $\frac{1}{\pi(1 + x^2)}$ is $e^{-|t|}$.
For the MGF, consider whether $E[e^{tX}] = \int e^{tx} f(x)\,dx$ converges.
CF via Fourier transform
By the shift and scaling rules: $\varphi_X(t) = e^{itx_0 - \gamma|t|}$.
MGF divergence
$M_X(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx$. For $t > 0$, the integrand grows like $e^{tx}/x^2$, which is not integrable as $x \to \infty$ (similarly for $t < 0$ as $x \to -\infty$). So $M_X(t) = \infty$ for all $t \neq 0$.
ex-ch09-09
Medium: Prove the Chernoff bound: for any random variable $X$ and $a \in \mathbb{R}$,
$P(X \ge a) \le \inf_{t > 0} e^{-ta} M_X(t)$.
Start with Markov's inequality applied to $e^{tX}$.
Apply Markov
For $t > 0$: $P(X \ge a) = P(e^{tX} \ge e^{ta}) \le e^{-ta}\, E[e^{tX}] = e^{-ta} M_X(t)$.
Optimize
Since this holds for all $t > 0$: $P(X \ge a) \le \inf_{t > 0} e^{-ta} M_X(t)$.
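A concrete instance, taking $X \sim \mathcal{N}(0, 1)$ as an illustrative choice (not part of the exercise statement): there $M_X(t) = e^{t^2/2}$ and $\inf_{t>0} e^{-ta + t^2/2} = e^{-a^2/2}$, which indeed dominates the exact tail:

```python
import math

a = 2.0  # illustrative level
# Minimize e^{-ta} M_X(t) = e^{-ta + t^2/2} over a fine grid of t > 0
bound = min(math.exp(-t * a + t * t / 2)
            for t in (i / 1000 for i in range(1, 10_000)))
tail = 0.5 * math.erfc(a / math.sqrt(2))  # exact P(X >= a)
print(tail, bound, math.exp(-a * a / 2))  # tail <= bound ~ e^{-a^2/2}
```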
ex-ch09-10
Medium: Let $N \sim \mathrm{Poisson}(\lambda)$ and $X_1, X_2, \dots$ be i.i.d. $\mathrm{Geom}(p)$ (on $\{1, 2, \dots\}$) and independent of $N$. Find the PGF and distribution of $S = \sum_{i=1}^{N} X_i$.
Use the compounding theorem: $G_S(s) = G_N(G_X(s))$.
Apply compounding
$G_N(s) = e^{\lambda(s - 1)}$ and $G_X(s) = \frac{ps}{1 - (1-p)s}$.
$G_S(s) = \exp\!\left(\lambda\left(\frac{ps}{1 - (1-p)s} - 1\right)\right)$.
This is a negative binomial-like PGF. The distribution of $S$ does not simplify to a standard named distribution.
ex-ch09-11
Medium: Show that the $n$-th cumulant of $X$ satisfies $\kappa_n = \lambda$ for all $n \ge 1$ when $X \sim \mathrm{Poisson}(\lambda)$, and explain why this makes the Poisson distribution "maximally far" from Gaussian in cumulant space (given its first two cumulants).
Compute the CGF and expand.
Compare with the Gaussian, which has $\kappa_n = 0$ for $n \ge 3$.
CGF and cumulants
$K_X(t) = \log E[e^{tX}] = \lambda(e^t - 1) = \lambda \sum_{n \ge 1} \frac{t^n}{n!}$, so $\kappa_n = \lambda$ for all $n \ge 1$.
Comparison with Gaussian
A $\mathcal{N}(\lambda, \lambda)$ has $\kappa_1 = \kappa_2 = \lambda$ and $\kappa_n = 0$ for $n \ge 3$. The Poisson has the same first two cumulants, but all higher cumulants are nonzero and equal to $\lambda$. As $\lambda \to \infty$, the normalized Poisson approaches the Gaussian (by the CLT), and the higher cumulants become negligible relative to $\kappa_2^{n/2} = \lambda^{n/2}$.
ex-ch09-12
Medium: A branching process has offspring PGF $G(s) = (1 - p + ps)^2$. Find the extinction probability $\eta$ as a function of $p$.
This is the PGF of $\mathrm{Bin}(2, p)$ offspring.
Solve $G(s) = s$ for the smallest root in $[0, 1]$.
Compute $\mu$
$\mu = G'(1) = 2p$. Critical: $p = 1/2$. Subcritical: $p < 1/2$. Supercritical: $p > 1/2$.
Solve the fixed-point equation
$(q + ps)^2 = s$ with $q = 1 - p$. Taking square roots: $q + ps = \sqrt{s}$ (nonneg. root). Let $u = \sqrt{s}$: $pu^2 - u + q = 0$, i.e., $(u - 1)(pu - q) = 0$. So $u = 1$ or $u = q/p$.
For $p \le 1/2$: $\eta = 1$. For $p > 1/2$: $\eta = \left(\frac{q}{p}\right)^2 = \left(\frac{1-p}{p}\right)^2$.
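The extinction probability can also be reached by iterating $s \mapsto G(s)$ from $s_0 = 0$, since $G_n(0)$ increases to the smallest fixed point; a sketch assuming the offspring PGF $G(s) = (q + ps)^2$ of a $\mathrm{Bin}(2, p)$ law, with illustrative supercritical $p = 0.7$:

```python
p = 0.7  # illustrative supercritical case (p > 1/2)
q = 1 - p
G = lambda s: (q + p * s)**2  # assumed Bin(2, p) offspring PGF

s = 0.0
for _ in range(10_000):  # G_n(0) increases to the extinction probability
    s = G(s)
print(s, (q / p)**2)  # both ~ 0.1837
```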
ex-ch09-13
Hard: Prove that if $E[X^2] < \infty$, then
$\varphi_X(t) = 1 + itE[X] - \frac{t^2}{2}E[X^2] + o(t^2)$ as $t \to 0$.
Expand $e^{itx}$ to order $2$ with remainder.
Use DCT to handle the remainder term.
Taylor expansion with remainder
$e^{itx} = 1 + itx - \frac{t^2 x^2}{2} + R(t, x)$ where $|R(t, x)| \le \min\!\left(\frac{|tx|^3}{6},\, t^2 x^2\right)$.
Take expectations
$\varphi_X(t) = 1 + itE[X] - \frac{t^2}{2}E[X^2] + E[R(t, X)]$.
$|E[R(t, X)]| \le E\!\left[\min\!\left(\frac{|tX|^3}{6},\, t^2 X^2\right)\right]$. But we need the sharper bound: $t^{-2}\min\!\left(\frac{|tX|^3}{6},\, t^2 X^2\right) \to 0$ as $t \to 0$ (pointwise in $X$), and it is dominated by $X^2$, which is integrable. By DCT, $E[R(t, X)] = o(t^2)$.
ex-ch09-14
Hard: Let $X_1, X_2, \dots$ be i.i.d. $\mathrm{Bernoulli}(p)$. Compute the rate function $I(a)$ for $\bar{X}_n$ and show that it equals the KL divergence $D(\mathrm{Ber}(a) \,\|\, \mathrm{Ber}(p))$.
Start from $\Lambda(t) = \log E[e^{tX_1}]$.
Optimize over $t$: $I(a) = \sup_t\, [ta - \Lambda(t)]$.
CGF
$\Lambda(t) = \log(1 - p + p e^t)$.
Optimize
$\Lambda'(t) = \frac{p e^t}{1 - p + p e^t} = a$ gives $e^{t^*} = \frac{a(1-p)}{p(1-a)}$, so $t^* = \log\frac{a(1-p)}{p(1-a)}$.
Evaluate the rate function
$I(a) = t^* a - \Lambda(t^*) = a \log\frac{a}{p} + (1 - a)\log\frac{1-a}{1-p} = D(\mathrm{Ber}(a) \,\|\, \mathrm{Ber}(p))$.
Cramér's theorem then says $P(\bar{X}_n \ge a) \approx e^{-n D(\mathrm{Ber}(a)\|\mathrm{Ber}(p))}$ for $a > p$, connecting large deviations directly to information theory.
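A grid-based Legendre transform confirms the identity numerically, assuming the Bernoulli setup (illustrative $p = 0.3$, $a = 0.6$):

```python
import math

p, a = 0.3, 0.6  # illustrative
Lambda = lambda t: math.log(1 - p + p * math.exp(t))  # CGF of Bernoulli(p)

# Legendre transform sup_t [t*a - Lambda(t)] over a fine grid on [-10, 10]
rate = max(t * a - Lambda(t)
           for t in ((i - 10_000) / 1000 for i in range(20_001)))
kl = a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))
print(rate, kl)  # both ~ 0.192
```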
ex-ch09-15
Hard: Let $Z \sim \mathcal{N}(0, I_n)$ and $A$ be Hermitian with eigenvalues $\lambda_1, \dots, \lambda_n$. Find the CF of $Q = Z^\top A Z$.
Diagonalize: $A = U \Lambda U^*$ with $\Lambda = \mathrm{diag}(\lambda_1, \dots, \lambda_n)$.
$Q = \sum_j \lambda_j Y_j^2$ where the $Y_j$ are i.i.d. $\mathcal{N}(0, 1)$.
Reduce to independent terms
Let $Y = U^* Z$. Then $Y \sim \mathcal{N}(0, I_n)$ by rotational invariance, and $Q = Y^\top \Lambda Y = \sum_{j=1}^{n} \lambda_j Y_j^2$.
CF of each term
$E[e^{it Y_j^2}] = (1 - 2it)^{-1/2}$, so the CF of $\lambda_j Y_j^2$ is $(1 - 2i\lambda_j t)^{-1/2}$.
Product by independence
$\varphi_Q(t) = \prod_{j=1}^{n} (1 - 2i\lambda_j t)^{-1/2}$.
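For a single term ($n = 1$) the formula can be checked by quadrature against $E[e^{it\lambda Z^2}]$ for $Z \sim \mathcal{N}(0,1)$; the values $t = 0.3$, $\lambda = 1$ are illustrative:

```python
import cmath, math

t, lam = 0.3, 1.0  # illustrative
dz = 0.001
# Midpoint quadrature for E[e^{i t lam Z^2}], Z ~ N(0,1), over [-10, 10]
cf = sum(cmath.exp(1j * t * lam * z * z - z * z / 2) / math.sqrt(2 * math.pi) * dz
         for z in ((-10 + (i + 0.5) * dz) for i in range(20_000)))
closed = (1 - 2j * lam * t) ** (-0.5)  # principal branch
print(cf, closed)
```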
ex-ch09-16
Challenge: Prove the Lévy continuity theorem (forward direction): if $X_n \Rightarrow X$ (convergence of CDFs at continuity points), then $\varphi_{X_n}(t) \to \varphi_X(t)$ for all $t$.
Use the Helly-Bray theorem: if $X_n \Rightarrow X$ and $g$ is bounded and continuous, then $E[g(X_n)] \to E[g(X)]$.
Apply with $g(x) = e^{itx}$.
Apply Helly-Bray
The function $g(x) = e^{itx}$ is bounded ($|e^{itx}| = 1$) and continuous. Since $F_n \to F$ at all continuity points:
$\varphi_{X_n}(t) = E[e^{itX_n}] \to E[e^{itX}] = \varphi_X(t)$.
The Helly-Bray theorem applies because the $F_n$ are distribution functions and $g$ is bounded continuous. (Technically, one must also handle the possibility of mass escaping to infinity, which requires tightness; but convergence to a bona fide CDF implies tightness.)
ex-ch09-17
Challenge: (Cramér's theorem for the Gaussian case.) Let $X_1, X_2, \dots$ be i.i.d. $\mathcal{N}(0, 1)$. Compute $I(a)$ for $a > 0$ and verify that Cramér's theorem gives the exact tail probability asymptotics.
$\Lambda(t) = \log E[e^{tX_1}] = \frac{t^2}{2}$.
Compare with the exact Gaussian tail $P(Z \ge x) \sim \frac{\phi(x)}{x}$ as $x \to \infty$.
Rate function
$I(a) = \sup_t\left[ta - \frac{t^2}{2}\right] = \frac{a^2}{2}$ (achieved at $t^* = a$).
Cramér says
$\frac{1}{n} \log P(\bar{X}_n \ge a) \to -\frac{a^2}{2}$.
Exact asymptotics
$\bar{X}_n \sim \mathcal{N}(0, 1/n)$, so $P(\bar{X}_n \ge a) = P(Z \ge a\sqrt{n}) \sim \frac{e^{-n a^2/2}}{a\sqrt{2\pi n}}$.
The exponential rate is indeed $e^{-n a^2/2}$. Cramér's theorem captures the dominant exponential behavior; the prefactor $\frac{1}{a\sqrt{2\pi n}}$ is subexponential and invisible on the log scale.
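The convergence of the empirical rate $-\frac{1}{n}\log P(\bar{X}_n \ge a)$ to $a^2/2$ is visible numerically; here $a = 1$ is illustrative, and the Gaussian tail is computed exactly via erfc:

```python
import math

a = 1.0
rates = []
for n in (25, 100, 400):
    # Xbar_n ~ N(0, 1/n), so the tail is exactly P(Z >= a*sqrt(n))
    tail = 0.5 * math.erfc(a * math.sqrt(n) / math.sqrt(2))
    rates.append(-math.log(tail) / n)
print(rates)  # decreases toward I(a) = a^2/2 = 0.5
```

The subexponential prefactor $\frac{1}{a\sqrt{2\pi n}}$ is what keeps each rate slightly above $0.5$ at finite $n$.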
ex-ch09-18
Hard: In a branching process with offspring distribution $p_k = p(1-p)^k$, $k \ge 0$ (geometric with parameter $p$, mean $\mu = (1-p)/p$), find a closed-form expression for $G_n(s)$ and verify the extinction probability formula.
$G(s) = \frac{p}{1 - (1-p)s}$.
Try computing $G_2 = G \circ G$ and guess a pattern.
Iterate the composition
Write $q = 1 - p$. $G_2(s) = G(G(s)) = \frac{p(1 - qs)}{1 - qs - pq}$. By induction, for $p \ne q$: $G_n(s) = \frac{p\left[(q^n - p^n) - qs\,(q^{n-1} - p^{n-1})\right]}{(q^{n+1} - p^{n+1}) - qs\,(q^n - p^n)}$.
For $p = q = 1/2$: $G_n(s) = \frac{n - (n-1)s}{(n+1) - ns}$.
Extinction probability
$G_n(0) = \frac{n}{n+1} \to 1$ as $n \to \infty$ when $p = 1/2$.
For general $p$: $G_n(0) = \frac{p(q^n - p^n)}{q^{n+1} - p^{n+1}} \to \min(1, p/q)$. Since $\mu = q/p$, extinction is certain iff $\mu \le 1$ ($p \ge 1/2$).
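The closed form for $G_n(s)$ can be checked against the $n$-fold composition directly (illustrative $p = 0.4$, $n = 5$, $s = 0.3$):

```python
p, n, s = 0.4, 5, 0.3  # illustrative, p != 1/2
q = 1 - p
G = lambda x: p / (1 - q * x)  # geometric-offspring PGF

comp = s
for _ in range(n):  # n-fold composition G_n(s)
    comp = G(comp)

closed = (p * ((q**n - p**n) - q * s * (q**(n - 1) - p**(n - 1)))
          / ((q**(n + 1) - p**(n + 1)) - q * s * (q**n - p**n)))
print(comp, closed)  # agree to machine precision
```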
ex-ch09-19
Medium: Use the CF to show that if $X$ and $Y$ are independent standard Cauchy, then $\frac{X+Y}{2}$ is also standard Cauchy.
The CF of a standard Cauchy is $\varphi(t) = e^{-|t|}$.
Use the scaling and product rules.
CF of the average
$\varphi_{(X+Y)/2}(t) = \varphi_X(t/2)\,\varphi_Y(t/2) = e^{-|t|/2}\, e^{-|t|/2} = e^{-|t|}$.
This is the CF of a standard Cauchy. By uniqueness, $\frac{X+Y}{2} \sim \mathrm{Cauchy}(0, 1)$.
This means averaging two i.i.d. Cauchy RVs does not reduce the spread: the sample mean has the same distribution as a single observation! This is why the LLN fails for the Cauchy: it has no finite mean.
ex-ch09-20
Hard: (Berry-Esseen bound.) Let $X_1, \dots, X_n$ be i.i.d. with $E[X_i] = 0$, $\mathrm{Var}(X_i) = \sigma^2$, and $\rho = E|X_i|^3 < \infty$. Use the CF expansion to show that the CLT approximation error satisfies
$\sup_x |F_n(x) - \Phi(x)| \le \frac{C\rho}{\sigma^3 \sqrt{n}}$,
where $F_n$ is the CDF of $S_n/(\sigma\sqrt{n})$ and $\Phi$ is the standard Gaussian CDF.
Expand $\varphi_{X/\sigma}$ to third order.
The difference $\varphi_n(t) - e^{-t^2/2}$ gives the rate via smoothing inequalities.
Third-order expansion
$\varphi_{X/\sigma}(t) = 1 - \frac{t^2}{2} + O\!\left(\frac{\rho |t|^3}{\sigma^3}\right)$. Hence $\varphi_n(t) = \left[\varphi_{X/\sigma}\!\left(\frac{t}{\sqrt{n}}\right)\right]^n = e^{-t^2/2}\left(1 + O\!\left(\frac{\rho |t|^3}{\sigma^3 \sqrt{n}}\right)\right)$.
Bound the CF difference
Using $|e^w - 1| \le |w| e^{|w|}$ for suitable $w$:
$\left|\varphi_n(t) - e^{-t^2/2}\right| \le \frac{C \rho |t|^3}{\sigma^3 \sqrt{n}}\, e^{-t^2/4}$ for $|t| \le c\,\sigma^3 \sqrt{n}/\rho$.
Convert to CDF error
By the Berry-Esseen smoothing lemma, $\sup_x |F_n(x) - \Phi(x)| \le \frac{1}{\pi} \int_{-T}^{T} \frac{|\varphi_n(t) - e^{-t^2/2}|}{|t|}\, dt + \frac{C'}{T}$ with $T = c\,\sigma^3\sqrt{n}/\rho$; combining with the CF bound yields $\sup_x |F_n(x) - \Phi(x)| \le \frac{C\rho}{\sigma^3\sqrt{n}}$.
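The $O(1/\sqrt{n})$ rate can be observed exactly for fair $\pm 1$ coin flips, where $\sigma = 1$, $\rho = 1$ and the CDF of $S_n/\sqrt{n}$ is binomial; the constant $C = 1$ used in the check below is a loose illustrative choice, not the optimal one:

```python
import math

n = 100  # fair +/-1 coin flips: sigma = 1, rho = E|X|^3 = 1
Phi = lambda x: 0.5 * math.erfc(-x / math.sqrt(2))

err, cdf = 0.0, 0.0
for k in range(n + 1):
    cdf += math.comb(n, k) * 0.5**n        # P(#heads <= k)
    x = (2 * k - n) / math.sqrt(n)         # S_n = 2*#heads - n, standardized
    err = max(err, abs(cdf - Phi(x)))      # CDF error at each jump point
print(err, 1 / math.sqrt(n))  # err ~ 0.04, within C*rho/(sigma^3*sqrt(n)) for C = 1
```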