Exercises
ex-ch05-01
Easy. Let $X$ be a discrete random variable taking values $x_1, x_2, x_3$ with probabilities $p_1, p_2, p_3$. Compute $\mathbb{E}[X]$ and $\mathrm{Var}(X)$.
Use the definition: $\mathbb{E}[X] = \sum_i x_i\,\mathbb{P}(X = x_i)$.
For variance, use the shortcut: $\mathrm{Var}(X) = \mathbb{E}[X^2] - (\mathbb{E}[X])^2$.
Mean
$\mathbb{E}[X] = x_1 p_1 + x_2 p_2 + x_3 p_3$.
Second moment
$\mathbb{E}[X^2] = x_1^2 p_1 + x_2^2 p_2 + x_3^2 p_3$.
Variance
$\mathrm{Var}(X) = \mathbb{E}[X^2] - (\mathbb{E}[X])^2$.
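The mean/variance shortcut can be checked numerically. The PMF below is illustrative only (hypothetical values, not the ones from the exercise):

```python
# Numeric check of Var(X) = E[X^2] - (E[X])^2 on an illustrative PMF.
# (These probabilities are hypothetical, not the exercise's values.)
pmf = {-1: 0.2, 0: 0.3, 2: 0.5}

mean = sum(x * p for x, p in pmf.items())        # E[X]
second = sum(x**2 * p for x, p in pmf.items())   # E[X^2]
var = second - mean**2                           # shortcut formula

# Compare with the definitional variance E[(X - E[X])^2]
var_def = sum((x - mean)**2 * p for x, p in pmf.items())
assert abs(var - var_def) < 1e-12
```

Both routes agree, which is a useful sanity check when the arithmetic gets messy.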
ex-ch05-02
Easy. Show that $\mathbb{E}[\mathbf{1}_A] = \mathbb{P}(A)$ for any event $A$.
The indicator $\mathbf{1}_A$ takes values $0$ and $1$.
Direct computation
$\mathbb{E}[\mathbf{1}_A] = 1 \cdot \mathbb{P}(A) + 0 \cdot \mathbb{P}(A^c) = \mathbb{P}(A)$.
ex-ch05-03
Easy. A fair die is rolled once; let $X$ be the outcome. Compute the entropy $H(X)$ in bits.
Use $H(X) = \log_2 n$ for a uniform distribution on $n$ outcomes.
Solution
$H(X) = \log_2 6 \approx 2.585$ bits.
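As a quick check, the general entropy formula collapses to $\log_2 6$ for the uniform die:

```python
import math

# Entropy of a fair die: H = -sum p log2 p over six equally likely
# outcomes, which equals log2(6) for a uniform distribution.
p = [1 / 6] * 6
H = -sum(pi * math.log2(pi) for pi in p)
assert abs(H - math.log2(6)) < 1e-12
```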
ex-ch05-04
Easy. If $X \sim \mathrm{Geometric}(p)$, what is $\mathbb{P}(X > m + n \mid X > m)$?
Use the memoryless property.
Apply memoryless property
$\mathbb{P}(X > m + n \mid X > m) = \mathbb{P}(X > n) = (1-p)^n$.
ex-ch05-05
Medium. Let $X \sim \mathrm{Binomial}(n, p)$. Derive the MGF $M_X(t) = \mathbb{E}[e^{tX}]$ directly from the definition.
Write $M_X(t) = \sum_{k=0}^{n} e^{tk} \binom{n}{k} p^k (1-p)^{n-k}$.
Recognize the binomial theorem.
Compute
$$M_X(t) = \sum_{k=0}^{n} \binom{n}{k} (p e^t)^k (1-p)^{n-k} = (1 - p + p e^t)^n$$
by the binomial theorem.
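The closed form can be checked against the defining sum for illustrative parameter values:

```python
import math

# Check the derived binomial MGF (1 - p + p e^t)^n against the
# definition E[e^{tX}] summed over the exact Binomial(n, p) PMF.
n, p, t = 10, 0.3, 0.7   # illustrative parameters

mgf_def = sum(math.exp(t * k) * math.comb(n, k) * p**k * (1 - p)**(n - k)
              for k in range(n + 1))
mgf_closed = (1 - p + p * math.exp(t))**n
assert abs(mgf_def - mgf_closed) < 1e-9
```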
ex-ch05-06
Medium. Prove that $\mathrm{Var}(X) \ge 0$, with equality iff $X$ is a.s. constant.
$\mathrm{Var}(X) = \mathbb{E}[(X - \mathbb{E}[X])^2]$, which is an expectation of a non-negative quantity.
Non-negativity
$(X - \mathbb{E}[X])^2 \ge 0$ for all outcomes, so $\mathrm{Var}(X) = \mathbb{E}[(X - \mathbb{E}[X])^2] \ge 0$.
Equality condition
$\mathrm{Var}(X) = 0$ iff $(X - \mathbb{E}[X])^2 = 0$ a.s., i.e., $X = \mathbb{E}[X]$ a.s.
ex-ch05-07
Medium. Let $X \sim \mathrm{Poisson}(\lambda)$. Verify directly that $\sum_{k=0}^{\infty} \mathbb{P}(X = k) = 1$.
Use the Taylor expansion of $e^{\lambda}$.
Sum the PMF
$$\sum_{k=0}^{\infty} e^{-\lambda} \frac{\lambda^k}{k!} = e^{-\lambda} \sum_{k=0}^{\infty} \frac{\lambda^k}{k!} = e^{-\lambda} e^{\lambda} = 1.$$
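Numerically, the partial sums converge to $1$ very quickly (here for an illustrative $\lambda = 4$):

```python
import math

# Partial sums of the Poisson(lambda) PMF converge to 1, matching
# the Taylor-series argument. lambda = 4 is illustrative.
lam = 4.0
total = sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(100))
assert abs(total - 1.0) < 1e-12
```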
ex-ch05-08
Medium. A packet is retransmitted until it is received correctly. Each transmission succeeds independently with probability $p$. Let $X$ be the total number of transmissions. Find $\mathbb{E}[X]$ and $\mathrm{Var}(X)$.
What distribution does $X$ follow?
Identify the distribution
$X \sim \mathrm{Geometric}(p)$.
Compute
$\mathbb{E}[X] = 1/p$ transmissions. $\mathrm{Var}(X) = (1-p)/p^2$.
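A Monte Carlo sketch of the retransmission model (with an illustrative $p = 0.25$, so $\mathbb{E}[X] = 4$ and $\mathrm{Var}(X) = 12$) agrees with the formulas:

```python
import random
import statistics

# Simulate transmissions until first success and compare the sample
# mean and variance with 1/p and (1-p)/p^2. p = 0.25 is illustrative.
random.seed(0)
p = 0.25
samples = []
for _ in range(200_000):
    count = 1
    while random.random() >= p:   # failure: retransmit
        count += 1
    samples.append(count)

assert abs(statistics.mean(samples) - 1 / p) < 0.05
assert abs(statistics.variance(samples) - (1 - p) / p**2) < 0.5
```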
ex-ch05-09
Medium. Show that the geometric distribution is the only discrete memoryless distribution. That is, if $X$ takes values in $\{1, 2, \dots\}$ and satisfies $\mathbb{P}(X > m + n \mid X > m) = \mathbb{P}(X > n)$ for all $m, n \ge 0$, then $X \sim \mathrm{Geometric}(p)$ for some $p \in (0, 1]$.
Let $G(n) = \mathbb{P}(X > n)$. Show that $G(m + n) = G(m)\,G(n)$.
Use the memoryless property recursively.
Derive the tail
Let $G(n) = \mathbb{P}(X > n)$. The memoryless property gives $G(m + n) = G(m)\,G(n)$ (since $\mathbb{P}(X > m + n \mid X > m) = G(m + n)/G(m)$ after conditioning). The only solution with $G(0) = 1$ and $G$ decreasing is $G(n) = q^n$ for $q = G(1) \in [0, 1)$.
Recover the PMF
$\mathbb{P}(X = n) = G(n - 1) - G(n) = q^{n-1}(1 - q)$, which is Geometric($p$) with $p = 1 - q$.
ex-ch05-10
Medium. Compute the entropy $H(X)$ of the geometric distribution $\mathbb{P}(X = k) = p(1-p)^{k-1}$, $k \ge 1$, in bits.
Use $H(X) = -\sum_{k=1}^{\infty} p(1-p)^{k-1} \log_2\bigl(p(1-p)^{k-1}\bigr)$.
Split the log and evaluate each sum separately.
Expand
Let $q = 1 - p$. Then $H(X) = -\sum_{k=1}^{\infty} p q^{k-1} \bigl[\log_2 p + (k-1)\log_2 q\bigr]$.
Evaluate the sums
$\sum_{k=1}^{\infty} p q^{k-1} = 1$. $\sum_{k=1}^{\infty} (k-1)\, p q^{k-1} = q/p$.
Result
$H(X) = -\log_2 p - \frac{q}{p}\log_2 q = \frac{-(1-p)\log_2(1-p) - p\log_2 p}{p}$.
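The closed form can be compared against a direct (truncated) evaluation of the series, here for an illustrative $p = 0.3$:

```python
import math

# Compare the closed form H = (-(1-p)log2(1-p) - p log2 p)/p with a
# truncated direct evaluation of -sum p_k log2 p_k. p is illustrative.
p = 0.3
q = 1 - p
direct = -sum(p * q**(k - 1) * math.log2(p * q**(k - 1))
              for k in range(1, 2000))
closed = (-(q) * math.log2(q) - p * math.log2(p)) / p
assert abs(direct - closed) < 1e-9
```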
ex-ch05-11
Hard. Prove the Poisson limit theorem: if $X_n \sim \mathrm{Binomial}(n, \lambda/n)$ for a fixed $\lambda > 0$, then $\mathbb{P}(X_n = k) \to \frac{e^{-\lambda}\lambda^k}{k!}$ as $n \to \infty$.
Write out the binomial PMF and take the limit term by term.
Use $(1 - \lambda/n)^n \to e^{-\lambda}$.
Write the binomial PMF
$$\mathbb{P}(X_n = k) = \binom{n}{k}\Bigl(\frac{\lambda}{n}\Bigr)^k \Bigl(1 - \frac{\lambda}{n}\Bigr)^{n-k} = \frac{\lambda^k}{k!} \cdot \frac{n(n-1)\cdots(n-k+1)}{n^k} \cdot \Bigl(1 - \frac{\lambda}{n}\Bigr)^{n} \Bigl(1 - \frac{\lambda}{n}\Bigr)^{-k}.$$
Factor and take limits
As $n \to \infty$ with $k$ fixed, $\frac{n(n-1)\cdots(n-k+1)}{n^k} \to 1$, $\bigl(1 - \frac{\lambda}{n}\bigr)^{n} \to e^{-\lambda}$, and $\bigl(1 - \frac{\lambda}{n}\bigr)^{-k} \to 1$, so $\mathbb{P}(X_n = k) \to \frac{e^{-\lambda}\lambda^k}{k!}$. $\blacksquare$
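The convergence is easy to see numerically; for illustrative $\lambda = 3$, $k = 2$, the approximation error shrinks as $n$ grows:

```python
import math

# Numeric illustration of the Poisson limit: the Binomial(n, lambda/n)
# PMF at a fixed k approaches e^{-lambda} lambda^k / k! as n grows.
lam, k = 3.0, 2   # illustrative values

def binom_pmf(n):
    q = lam / n
    return math.comb(n, k) * q**k * (1 - q)**(n - k)

limit = math.exp(-lam) * lam**k / math.factorial(k)
errors = [abs(binom_pmf(n) - limit) for n in (10, 100, 10_000)]
assert errors[0] > errors[1] > errors[2]   # error shrinks with n
assert errors[-1] < 1e-3
```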
ex-ch05-12
Hard. Let $X$ and $Y$ be independent with $X \sim \mathrm{Poisson}(\lambda)$ and $Y \sim \mathrm{Poisson}(\mu)$. Show that $X + Y \sim \mathrm{Poisson}(\lambda + \mu)$.
Use the MGF: $M_{X+Y}(t) = M_X(t)\,M_Y(t)$ by independence.
Compute the MGF of the sum
$M_{X+Y}(t) = M_X(t)\,M_Y(t) = e^{\lambda(e^t - 1)}\, e^{\mu(e^t - 1)} = e^{(\lambda + \mu)(e^t - 1)}$.
This is the MGF of Poisson($\lambda + \mu$). Since the MGF uniquely determines the distribution, $X + Y \sim \mathrm{Poisson}(\lambda + \mu)$.
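The same conclusion can be checked directly by convolving the two PMFs (illustrative rates $\lambda = 1.5$, $\mu = 2.5$):

```python
import math

# Direct convolution check: P(X + Y = k) = sum_j P(X = j) P(Y = k - j)
# matches the Poisson(lambda + mu) PMF. Rates are illustrative.
lam, mu = 1.5, 2.5

def pois(rate, k):
    return math.exp(-rate) * rate**k / math.factorial(k)

for k in range(10):
    conv = sum(pois(lam, j) * pois(mu, k - j) for j in range(k + 1))
    assert abs(conv - pois(lam + mu, k)) < 1e-12
```

The convolution route avoids MGFs entirely; the binomial theorem does the same work the exponential identity does in the MGF proof.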
ex-ch05-13
Hard. Show that entropy is concave: if $p$ and $q$ are two PMFs on $\mathcal{X}$ and $\lambda \in [0, 1]$, then $H(\lambda p + (1 - \lambda) q) \ge \lambda H(p) + (1 - \lambda) H(q)$.
Use the concavity of $f(t) = -t \log_2 t$ on $[0, 1]$ and apply it term by term.
Pointwise concavity
The function $f(t) = -t \log_2 t$ is concave on $(0, 1]$ (since $f''(t) = -1/(t \ln 2) < 0$). For each $x$: $f\bigl(\lambda p(x) + (1 - \lambda) q(x)\bigr) \ge \lambda f(p(x)) + (1 - \lambda) f(q(x))$.
Sum over $x$
Summing over all $x \in \mathcal{X}$: $H(\lambda p + (1 - \lambda) q) \ge \lambda H(p) + (1 - \lambda) H(q)$. $\blacksquare$
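A spot-check of the inequality on two illustrative PMFs, over a grid of mixing weights:

```python
import math

# Numeric spot-check of entropy concavity for two illustrative PMFs
# on a four-element alphabet, over a grid of mixing weights.
def H(p):
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

p = [0.7, 0.1, 0.1, 0.1]
q = [0.25, 0.25, 0.25, 0.25]

for i in range(11):
    lam = i / 10
    mix = [lam * a + (1 - lam) * b for a, b in zip(p, q)]
    assert H(mix) >= lam * H(p) + (1 - lam) * H(q) - 1e-12
```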
ex-ch05-14
Hard. Derive the mean and variance of the hypergeometric distribution using the indicator method. Specifically, draw $n$ items without replacement from a population of $N$ ($K$ good, $N - K$ bad), and let $X$ count the good items drawn. Show $\mathbb{E}[X] = nK/N$ and $\mathrm{Var}(X) = n\frac{K}{N}\bigl(1 - \frac{K}{N}\bigr)\frac{N - n}{N - 1}$.
Let $X_i = \mathbf{1}\{\text{draw } i \text{ is good}\}$, so $X = \sum_{i=1}^{n} X_i$. Find $\mathbb{E}[X_i]$ and $\mathrm{Cov}(X_i, X_j)$.
Mean by indicators
$\mathbb{E}[X_i] = K/N$ by symmetry (each draw is equally likely to be any item). So $\mathbb{E}[X] = \sum_{i=1}^{n} \mathbb{E}[X_i] = nK/N$.
Covariance
$\mathbb{E}[X_i X_j] = \mathbb{P}(\text{draws } i \text{ and } j \text{ both good}) = \frac{K(K-1)}{N(N-1)}$ for $i \ne j$.
$\mathrm{Cov}(X_i, X_j) = \frac{K(K-1)}{N(N-1)} - \Bigl(\frac{K}{N}\Bigr)^2 = -\frac{K(N-K)}{N^2(N-1)}$.
Variance
$\mathrm{Var}(X) = n\,\mathrm{Var}(X_1) + n(n-1)\,\mathrm{Cov}(X_1, X_2) = n\frac{K}{N}\Bigl(1 - \frac{K}{N}\Bigr) - n(n-1)\frac{K(N-K)}{N^2(N-1)} = n\frac{K}{N}\Bigl(1 - \frac{K}{N}\Bigr)\frac{N-n}{N-1}$.
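The indicator-method formulas can be verified exactly against the hypergeometric PMF $\mathbb{P}(X = k) = \binom{K}{k}\binom{N-K}{n-k}/\binom{N}{n}$, here with illustrative $N = 20$, $K = 7$, $n = 5$:

```python
import math

# Exact check of the indicator-method mean and variance against the
# hypergeometric PMF. Population parameters are illustrative.
N, K, n = 20, 7, 5

pmf = {k: math.comb(K, k) * math.comb(N - K, n - k) / math.comb(N, n)
       for k in range(n + 1)}
mean = sum(k * p for k, p in pmf.items())
var = sum(k**2 * p for k, p in pmf.items()) - mean**2

assert abs(mean - n * K / N) < 1e-12
assert abs(var - n * (K / N) * (1 - K / N) * (N - n) / (N - 1)) < 1e-12
```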
ex-ch05-15
Challenge. Let $X$ be a random variable taking values in $\{1, 2, 3, \dots\}$ with $\mathbb{E}[X] = \mu$. Among all such distributions, find the one that maximizes the entropy $H(X)$.
Use Lagrange multipliers: maximize $-\sum_n p_n \log p_n$ subject to $\sum_n p_n = 1$ and $\sum_n n\, p_n = \mu$.
The solution is an exponential (geometric-like) distribution.
Set up the Lagrangian
Maximize $L = -\sum_{n \ge 1} p_n \log p_n + \alpha\bigl(\sum_n p_n - 1\bigr) + \beta\bigl(\sum_n n\, p_n - \mu\bigr)$.
Take the derivative
$\frac{\partial L}{\partial p_n} = -\log p_n - 1 + \alpha + \beta n = 0$,
so $p_n = e^{\alpha - 1 + \beta n} = C r^n$ for constants $C > 0$ and $r = e^{\beta} \in (0, 1)$.
Identify the distribution
This is the geometric distribution with parameter $p = 1/\mu$, where $r = 1 - p$ (so that $\mathbb{E}[X] = 1/p = \mu$). The geometric distribution is the maximum-entropy distribution on the positive integers with a given mean.
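A small numeric illustration of the claim: for mean $\mu = 3$ (illustrative), the Geometric($1/\mu$) distribution has larger entropy than another distribution on $\{1, 2, \dots\}$ with the same mean:

```python
import math

# Spot-check of the maximum-entropy claim: for mu = 3 (illustrative),
# Geometric(1/mu) beats a competitor with the same mean.
mu = 3.0
p = 1 / mu

def entropy(pmf):
    return -sum(v * math.log2(v) for v in pmf if v > 0)

# Geometric(p) on {1, 2, ...}, truncated far into the tail.
geom = [p * (1 - p)**(k - 1) for k in range(1, 500)]

# A competitor with the same mean: uniform on {1, 2, 3, 4, 5}.
uniform = [0.2] * 5

assert abs(sum(k * v for k, v in enumerate(geom, start=1)) - mu) < 1e-6
assert entropy(geom) > entropy(uniform)
```

One example does not prove the theorem, but it makes the Lagrangian result concrete.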
ex-ch05-16
Medium. A sensor network has 100 nodes. Each node independently fails with probability 0.02. Approximate the probability that at least 5 nodes fail, using the Poisson approximation.
Model the failure count as Binomial(100, 0.02) and approximate with Poisson($\lambda = np$).
Poisson approximation
$\lambda = np = 100 \times 0.02 = 2$. $X \approx \mathrm{Poisson}(2)$.
Compute
$\mathbb{P}(X \ge 5) = 1 - \sum_{k=0}^{4} \frac{e^{-2} 2^k}{k!} = 1 - e^{-2}\Bigl(1 + 2 + 2 + \tfrac{4}{3} + \tfrac{2}{3}\Bigr) = 1 - 7e^{-2}$.
$\approx 1 - 0.9473 = 0.0527$.
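How good is the approximation here? Comparing against the exact binomial tail:

```python
import math

# Compare the Poisson(2) approximation of P(X >= 5) with the exact
# Binomial(100, 0.02) tail probability.
n, p = 100, 0.02
lam = n * p

exact = 1 - sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(5))
approx = 1 - sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(5))

assert abs(approx - (1 - 7 * math.exp(-2))) < 1e-12   # closed form 1 - 7e^{-2}
assert abs(exact - approx) < 0.01                     # approximation is close
```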
ex-ch05-17
Medium. Compute the MGF of the Poisson($\lambda$) distribution and use it to verify that $\mathbb{E}[X] = \lambda$ and $\mathrm{Var}(X) = \lambda$.
$M_X(t) = \mathbb{E}[e^{tX}] = \sum_{k=0}^{\infty} e^{tk}\, \mathbb{P}(X = k)$.
$\mathbb{E}[X] = M_X'(0)$ and $\mathbb{E}[X^2] = M_X''(0)$.
MGF
$M_X(t) = \sum_{k=0}^{\infty} e^{tk} \frac{e^{-\lambda} \lambda^k}{k!} = e^{-\lambda} \sum_{k=0}^{\infty} \frac{(\lambda e^t)^k}{k!} = e^{\lambda(e^t - 1)}$.
Mean
$M_X'(t) = \lambda e^t e^{\lambda(e^t - 1)}$, so $\mathbb{E}[X] = M_X'(0) = \lambda$.
Variance
$M_X''(t) = (\lambda e^t + \lambda^2 e^{2t})\, e^{\lambda(e^t - 1)}$, so $\mathbb{E}[X^2] = M_X''(0) = \lambda + \lambda^2$. Then $\mathrm{Var}(X) = \lambda + \lambda^2 - \lambda^2 = \lambda$.
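The derivative values $M_X'(0) = \lambda$ and $M_X''(0) = \lambda + \lambda^2$ can be checked by finite differences on the closed-form MGF (illustrative $\lambda = 2.5$):

```python
import math

# Verify M'(0) = lambda and M''(0) = lambda + lambda^2 by central
# finite differences on the closed-form MGF. lambda is illustrative.
lam = 2.5
M = lambda t: math.exp(lam * (math.exp(t) - 1))

h = 1e-5
d1 = (M(h) - M(-h)) / (2 * h)              # ~ M'(0) = lambda
d2 = (M(h) - 2 * M(0) + M(-h)) / h**2      # ~ M''(0) = lambda + lambda^2

assert abs(d1 - lam) < 1e-6
assert abs(d2 - (lam + lam**2)) < 1e-4
```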
ex-ch05-18
Challenge. (Coupon collector) There are $n$ types of coupons. Each purchase gives a uniformly random coupon. Let $T$ be the number of purchases needed to collect all $n$ types. Show that $\mathbb{E}[T] = n H_n$, where $H_n = \sum_{k=1}^{n} 1/k$ is the $n$-th harmonic number.
Let $T_i$ be the number of purchases after collecting $i - 1$ types until collecting the $i$-th new type.
What distribution does $T_i$ follow?
Decompose into phases
After collecting $i - 1$ distinct types, the probability of getting a new type on the next purchase is $p_i = \frac{n - i + 1}{n}$. So $T_i \sim \mathrm{Geometric}(p_i)$.
Sum the expectations
$\mathbb{E}[T_i] = \frac{n}{n - i + 1}$, and $\mathbb{E}[T] = \sum_{i=1}^{n} \frac{n}{n - i + 1} = n \sum_{k=1}^{n} \frac{1}{k} = n H_n$.
Conclude
$\mathbb{E}[T] = n H_n \approx n(\ln n + \gamma)$, where $\gamma \approx 0.577$ is the Euler–Mascheroni constant. $\blacksquare$
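A Monte Carlo sketch of the collector's problem for a small illustrative $n$ reproduces $n H_n$:

```python
import random

# Simulate the coupon collector and compare the sample mean of T with
# n * H_n. n = 10 is illustrative (n * H_10 ~ 29.29).
random.seed(1)
n = 10
H_n = sum(1 / k for k in range(1, n + 1))

draws = []
for _ in range(20_000):
    seen, count = set(), 0
    while len(seen) < n:
        seen.add(random.randrange(n))
        count += 1
    draws.append(count)

assert abs(sum(draws) / len(draws) - n * H_n) < 0.5
```

The spread of $T$ is large (its standard deviation is on the order of $n$), so the tolerance is deliberately loose.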
ex-ch05-19
Easy. If $X \sim \mathrm{Binomial}(n, p)$, compute $\mathbb{E}[X]$ and $\mathrm{Var}(X)$.
Use the formulas $\mathbb{E}[X] = np$ and $\mathrm{Var}(X) = np(1-p)$.
Compute
$\mathbb{E}[X] = np$. $\mathrm{Var}(X) = np(1-p)$.
ex-ch05-20
Hard. Show that the entropy of the Poisson($\lambda$) distribution grows as $\frac{1}{2}\log_2(2\pi e \lambda)$ for large $\lambda$.
For large $\lambda$, the Poisson($\lambda$) is approximately $\mathcal{N}(\lambda, \lambda)$ by the CLT.
Use the differential entropy of the Gaussian: $h = \frac{1}{2}\log_2(2\pi e \sigma^2)$.
CLT approximation
For large $\lambda$, $\mathrm{Poisson}(\lambda)$ is approximately $\mathcal{N}(\lambda, \lambda)$. The differential entropy of a continuous Gaussian with variance $\sigma^2$ is $\frac{1}{2}\log_2(2\pi e \sigma^2)$.
Discrete-to-continuous
For a discrete distribution concentrated on the integers that approximates a continuous density, the discrete entropy is approximately the differential entropy, because the support points are spaced one unit apart (so no bin-width correction term appears).
More precisely, for integer-valued RVs well-approximated by a Gaussian, $H(X) \approx \frac{1}{2}\log_2(2\pi e \lambda)$ for large $\lambda$. A rigorous derivation uses the entropy of a discretized Gaussian.
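The asymptotic is already quite accurate at moderate $\lambda$; comparing a truncated exact entropy with the Gaussian formula for an illustrative $\lambda = 50$:

```python
import math

# Compare the exact Poisson entropy (truncated sum, computed in log
# space via lgamma) with 0.5 * log2(2 pi e lambda). lambda illustrative.
lam = 50.0

def pois(k):
    # log-space evaluation avoids overflow in lam**k / k!
    return math.exp(-lam + k * math.log(lam) - math.lgamma(k + 1))

H = -sum(pois(k) * math.log2(pois(k)) for k in range(400))
approx = 0.5 * math.log2(2 * math.pi * math.e * lam)

assert abs(H - approx) < 0.05   # agreement to a few hundredths of a bit
```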