Exercises
ex-ch03-01
Easy: A source has alphabet $\mathcal{X}$ with probability distribution $p(x)$. Compute $H(X)$ and estimate the size of the typical set $A_\epsilon^{(n)}$ for a given blocklength $n$.
Entropy and typical set
$H(X) = -\sum_{x} p(x) \log_2 p(x)$ bits.
Typical set size: $|A_\epsilon^{(n)}| \approx 2^{nH(X)}$. Total sequences: $|\mathcal{X}|^n$. Fraction: $|A_\epsilon^{(n)}| / |\mathcal{X}|^n \approx 2^{-n(\log_2 |\mathcal{X}| - H(X))}$.
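The exercise's specific alphabet and probabilities are not given above, so here is a worked instance of the same computation under assumed numbers: a ternary source with $p = (1/2, 1/4, 1/4)$ and $n = 20$ (both illustrative choices, not from the exercise).

```python
from math import log2

# Assumed instance: ternary source, p = (1/2, 1/4, 1/4), blocklength n = 20.
p = [0.5, 0.25, 0.25]
n = 20

H = -sum(q * log2(q) for q in p)   # entropy in bits
typical_size = 2 ** (n * H)        # |A_eps^(n)| ~ 2^{nH(X)}
total = len(p) ** n                # |X|^n sequences in total
fraction = typical_size / total    # ~ 2^{-n(log2|X| - H)}

print(f"H(X) = {H:.3f} bits")
print(f"typical ~ 2^{n * H:.0f} ~ {typical_size:.3e}")
print(f"fraction ~ {fraction:.3e}")
```

Here $H = 1.5$ bits while $\log_2 3 \approx 1.585$, so the typical fraction shrinks only like $2^{-0.085n}$; for a more biased source the decay is much faster.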
ex-ch03-02
Easy: True or false: every element of the typical set has the same probability.
Consider the definition of the weak typical set.
Answer
False for the weak typical set: probabilities lie in a range, $2^{-n(H+\epsilon)} \le p(x^n) \le 2^{-n(H-\epsilon)}$, not at a single value.
For the type class $T(P)$ (a subset of the strongly typical set for a specific type $P$), all sequences have exactly the same probability $p^n(x^n) = 2^{-n(H(P) + D(P\|p))}$. But the typical set contains sequences from multiple types.
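This can be checked by brute force for a small case. The sketch below enumerates all length-10 Bernoulli(0.3) sequences and shows that the weakly typical ones span a range of probabilities (the parameters $n=10$, $p=0.3$, $\epsilon=0.2$ are illustrative choices, not from the exercise).

```python
from itertools import product
from math import log2

p, n, eps = 0.3, 10, 0.2
H = -p * log2(p) - (1 - p) * log2(1 - p)     # binary entropy, ~0.881 bits

probs = []
for seq in product([0, 1], repeat=n):
    k = sum(seq)
    pr = p**k * (1 - p)**(n - k)
    # weak typicality: | -(1/n) log2 p(x^n) - H | <= eps
    if abs(-log2(pr) / n - H) <= eps:
        probs.append(pr)

print(f"{len(probs)} typical sequences")
print(f"prob range: [{min(probs):.3e}, {max(probs):.3e}]")
```

The typical set here collects the sequences with 2, 3, or 4 ones, and the probabilities differ across those types even though all sequences within one type are equiprobable.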
ex-ch03-03
Easy: For a binary source with $p(1) = 0.1$, $p(0) = 0.9$, and $n = 1000$: (a) How many sequences are in $\{0,1\}^n$? (b) Approximately how many are typical? (c) What fraction of sequences are typical?
Compute
(a) $2^{1000}$ sequences.
(b) $H_2(0.1) = -0.1 \log_2 0.1 - 0.9 \log_2 0.9 \approx 0.469$ bits. Typical set: $|A_\epsilon^{(n)}| \approx 2^{1000 \times 0.469} = 2^{469}$.
(c) Fraction: $2^{469} / 2^{1000} = 2^{-531} \approx 10^{-160}$.
A vanishingly small fraction of all binary sequences are typical for this highly biased source. The typical sequences have about 100 ones and 900 zeros (with fluctuations).
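The numbers above can be reproduced directly from the exercise's parameters $p(1) = 0.1$ and $n = 1000$:

```python
from math import log2

p, n = 0.1, 1000

H = -p * log2(p) - (1 - p) * log2(1 - p)   # binary entropy H_2(0.1)
typical_exponent = n * H                   # |A_eps^(n)| ~ 2^{nH}
fraction_exponent = n * H - n              # fraction = 2^{n(H - 1)}

print(f"H_2(0.1) = {H:.4f} bits")
print(f"typical  ~ 2^{typical_exponent:.0f}")
print(f"fraction ~ 2^{fraction_exponent:.0f}")
```

Working with exponents avoids overflow: $2^{469}$ is far beyond a float, but $469$ and $-531$ are the only numbers we actually need.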
ex-ch03-04
Easy: For a fair coin ($p = 1/2$), what fraction of sequences are typical?
What is $H(X)$ for a fair coin?
Answer
$H(X) = 1$ bit. Typical set size $\approx 2^{n \cdot 1} = 2^n$, which equals the total number of sequences. So the fraction approaches $1$. For a uniform source, all sequences are typical. This makes sense: every sequence has probability exactly $2^{-n}$, so the AEP condition is trivially satisfied.
ex-ch03-05
Medium: Prove that if $(x^n, y^n) \in T_\epsilon^{(n)}(X,Y)$ (strongly typical), then the conditional type satisfies $|\hat{P}_{y^n|x^n}(y|x) - p(y|x)| \le \delta(\epsilon)$ for all $(x,y)$ with $p(x) > 0$, where $\delta(\epsilon) \to 0$ as $\epsilon \to 0$.
Use $\hat{P}_{y^n|x^n}(y|x) = N(x,y \mid x^n, y^n) / N(x \mid x^n)$.
Both $N(x,y \mid x^n, y^n)/n$ and $N(x \mid x^n)/n$ are close to their true values $p(x,y)$ and $p(x)$.
Conditional type approximation
$\hat{P}(y|x) = \dfrac{N(x,y \mid x^n, y^n)/n}{N(x \mid x^n)/n}$.
By strong typicality (in the robust form): $|N(x,y)/n - p(x,y)| \le \epsilon\, p(x,y)$ and $|N(x)/n - p(x)| \le \epsilon\, p(x)$.
Hence $\dfrac{(1-\epsilon)\,p(x,y)}{(1+\epsilon)\,p(x)} \le \hat{P}(y|x) \le \dfrac{(1+\epsilon)\,p(x,y)}{(1-\epsilon)\,p(x)}$.
As $\epsilon \to 0$: $\hat{P}(y|x) \to p(y|x)$. Specifically, $|\hat{P}(y|x) - p(y|x)| \le \dfrac{2\epsilon}{1-\epsilon}\, p(y|x) =: \delta(\epsilon)$.
ex-ch03-06
Medium: Show that the size of the conditional typical set satisfies $(1-\epsilon)\, 2^{n(H(Y|X) - \delta)} \le |A_\epsilon^{(n)}(Y \mid x^n)| \le 2^{n(H(Y|X) + \delta)}$ for any $x^n \in A_\epsilon^{(n)}(X)$ and $n$ sufficiently large.
Every jointly typical $y^n$ has probability $\approx 2^{-nH(Y|X)}$ given $x^n$.
The total probability of the conditional typical set approaches 1.
Size bounds
For $x^n \in A_\epsilon^{(n)}(X)$:
Define $A_\epsilon^{(n)}(Y \mid x^n) = \{y^n : (x^n, y^n) \in A_\epsilon^{(n)}(X,Y)\}$.
Each $y^n \in A_\epsilon^{(n)}(Y \mid x^n)$ satisfies $2^{-n(H(Y|X)+\delta)} \le p(y^n \mid x^n) \le 2^{-n(H(Y|X)-\delta)}$.
Upper bound: $1 \ge \sum_{y^n \in A_\epsilon^{(n)}(Y|x^n)} p(y^n \mid x^n) \ge |A_\epsilon^{(n)}(Y \mid x^n)|\, 2^{-n(H(Y|X)+\delta)}$, so $|A_\epsilon^{(n)}(Y \mid x^n)| \le 2^{n(H(Y|X)+\delta)}$.
Lower bound: by the conditional typicality lemma, $\Pr\{Y^n \in A_\epsilon^{(n)}(Y \mid x^n) \mid x^n\} \ge 1 - \epsilon$ for large $n$, and each term is at most $2^{-n(H(Y|X)-\delta)}$, so $|A_\epsilon^{(n)}(Y \mid x^n)| \ge (1-\epsilon)\, 2^{n(H(Y|X)-\delta)}$.
Therefore $|A_\epsilon^{(n)}(Y \mid x^n)| \doteq 2^{nH(Y|X)}$.
ex-ch03-07
Medium: Use the AEP to prove the source coding theorem: an i.i.d. source with entropy $H(X)$ can be losslessly compressed to $H(X) + \epsilon$ bits per symbol for any $\epsilon > 0$ and sufficiently large blocklength $n$.
Assign unique binary codewords to typical sequences.
How many bits do you need to index $2^{n(H+\epsilon)}$ sequences?
Handle atypical sequences with a flag bit.
Encoding scheme
- If $x^n \in A_\epsilon^{(n)}$: send a $0$ bit followed by the index of $x^n$ in the typical set, using $\lceil n(H+\epsilon) \rceil$ bits.
- If $x^n \notin A_\epsilon^{(n)}$: send a $1$ bit followed by the raw description of $x^n$, using $\lceil n \log_2 |\mathcal{X}| \rceil$ bits.
Average rate
Average bits: $\mathbb{E}[\ell(X^n)] \le \Pr\{A_\epsilon^{(n)}\} \big(n(H+\epsilon) + 2\big) + \Pr\{(A_\epsilon^{(n)})^c\} \big(n \log_2 |\mathcal{X}| + 2\big) \le n(H+\epsilon) + \epsilon\, n \log_2 |\mathcal{X}| + 2$, once $n$ is large enough that $\Pr\{(A_\epsilon^{(n)})^c\} \le \epsilon$.
Per symbol: $\mathbb{E}[\ell(X^n)]/n \le H + \epsilon + \epsilon \log_2 |\mathcal{X}| + 2/n = H + \epsilon'$.
For small $\epsilon$ and large $n$, this approaches $H(X)$.
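The two-part scheme above can be simulated on a toy source. All parameters below (Bernoulli(0.2), $n = 25$, $\epsilon = 0.1$) are illustrative; at such a small blocklength the flag-bit and ceiling overhead dominate, so the rate beats the raw 1 bit/symbol but is still well above $H$.

```python
from math import ceil, log2
import random

p, n, eps = 0.2, 25, 0.1
H = -p * log2(p) - (1 - p) * log2(1 - p)
idx_bits = ceil(n * (H + eps))        # enough bits to index the typical set

def is_typical(seq):
    k = sum(seq)
    pr = p**k * (1 - p)**(n - k)
    return abs(-log2(pr) / n - H) <= eps

random.seed(0)
lengths = []
for _ in range(2000):
    seq = [1 if random.random() < p else 0 for _ in range(n)]
    # flag bit + index for typical sequences, flag bit + raw n bits otherwise
    lengths.append(1 + idx_bits if is_typical(seq) else 1 + n)

rate = sum(lengths) / len(lengths) / n    # average bits per source symbol
print(f"H = {H:.3f}, average rate = {rate:.3f} bits/symbol")
```

Increasing $n$ (and shrinking $\epsilon$) pushes the rate toward $H(X) \approx 0.722$.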
ex-ch03-08
Medium: The number of binary sequences of length $n$ with exactly $k$ ones is $\binom{n}{k}$. Using Stirling's approximation, show that $\binom{n}{k} \approx 2^{nH_2(k/n)}$ (to first order in the exponent) for $0 < k < n$.
Use $\log_2 n! = n \log_2 n - n \log_2 e + O(\log n)$.
$H_2(\alpha) = -\alpha \log_2 \alpha - (1-\alpha) \log_2 (1-\alpha)$.
Stirling
$\log_2 \binom{n}{k} = \log_2 n! - \log_2 k! - \log_2 (n-k)! \approx n \log_2 n - k \log_2 k - (n-k) \log_2 (n-k) = nH_2(k/n)$, where the $-n \log_2 e$ terms cancel since $-n + k + (n-k) = 0$.
Therefore $\binom{n}{k} \doteq 2^{nH_2(k/n)}$. This shows that the type class of a binary type $P = (1 - k/n,\, k/n)$ has size $\doteq 2^{nH(P)}$, consistent with the general formula $|T(P)| \doteq 2^{nH(P)}$.
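A quick numerical check of the first-order agreement (the $(n, k)$ values below are arbitrary): the gap between $\log_2 \binom{n}{k}$ and $nH_2(k/n)$ grows only logarithmically in $n$, so the per-symbol difference shrinks.

```python
from math import comb, log2

def H2(a):
    """Binary entropy in bits."""
    return -a * log2(a) - (1 - a) * log2(1 - a)

for n, k in [(20, 5), (100, 25), (1000, 250)]:
    exact = log2(comb(n, k))
    approx = n * H2(k / n)
    print(f"n={n:5d}: log2 C(n,k) = {exact:8.2f}, nH2(k/n) = {approx:8.2f}, "
          f"gap/n = {(approx - exact) / n:.4f}")
```

The gap is the $O(\log n)$ polynomial factor hidden by the $\doteq$ notation.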
ex-ch03-09
Medium: Verify Sanov's theorem for the binary case: if $X_1, \dots, X_n$ are i.i.d. Bernoulli($p$), show that $\Pr\{\frac{1}{n}\sum_i X_i \ge a\} \doteq 2^{-nD(a\|p)}$ for $a > p$, where $D(a\|p) = a \log_2 \frac{a}{p} + (1-a) \log_2 \frac{1-a}{1-p}$.
The dominant type is $(1-a,\, a)$.
Sum over types $k/n$ with $k/n \ge a$.
Upper bound
$\Pr\Big\{\frac{1}{n}\sum_i X_i \ge a\Big\} = \sum_{k \ge na} \Pr\{T(k/n)\} \le (n+1) \max_{k/n \ge a} 2^{-nD(k/n \,\|\, p)}$, using $\Pr\{T(q)\} \le 2^{-nD(q\|p)}$ and the fact that there are at most $n+1$ types.
The minimum of $D(q\|p)$ over $q \ge a$ is achieved at $q = a$ (the boundary, since $D(\cdot\|p)$ is increasing for $q > p$), giving $\Pr \le (n+1)\, 2^{-nD(a\|p)}$.
Lower bound
Keep only the single type $k^* = \lceil na \rceil$: $\Pr \ge \Pr\{T(k^*/n)\} \ge \frac{1}{(n+1)^2}\, 2^{-nD(k^*/n \,\|\, p)}$, and $D(k^*/n \,\|\, p) \to D(a\|p)$.
Together: $-\frac{1}{n} \log_2 \Pr\{\frac{1}{n}\sum_i X_i \ge a\} \to D(a\|p)$, i.e. $\Pr \doteq 2^{-nD(a\|p)}$.
ex-ch03-10
Hard: Use the packing lemma to prove the channel coding theorem: for a DMC with capacity $C$, any rate $R < C$ is achievable.
Generate a random codebook: $2^{nR}$ codewords i.i.d. $\sim \prod_i p^*(x_i)$, where $p^*$ achieves capacity.
Decoder: find the unique codeword jointly typical with $y^n$.
Bound the error probability using the packing lemma.
Random codebook
Fix $p^*(x)$ achieving $C = \max_{p(x)} I(X;Y)$. Generate $2^{nR}$ codewords $x^n(1), \dots, x^n(2^{nR})$ i.i.d. $\sim \prod_i p^*(x_i)$, independently of each other.
Encoding and decoding
To send message $m$: transmit $x^n(m)$. Decoder: find the unique $\hat{m}$ such that $(x^n(\hat{m}), y^n) \in A_\epsilon^{(n)}(X,Y)$; declare an error if there is none or more than one.
Error analysis
By symmetry, assume $m = 1$ was sent. Error events: $\mathcal{E}_1 = \{(x^n(1), y^n) \notin A_\epsilon^{(n)}\}$ and $\mathcal{E}_2 = \{(x^n(m'), y^n) \in A_\epsilon^{(n)} \text{ for some } m' \ne 1\}$.
First event: $\Pr\{\mathcal{E}_1\} \to 0$ by the joint AEP. Second event: $\Pr\{\mathcal{E}_2\} \le 2^{nR}\, 2^{-n(I(X;Y) - \delta(\epsilon))} \to 0$ by the packing lemma, since $R < I(X;Y) = C$.
Therefore $P_e^{(n)} \le \Pr\{\mathcal{E}_1\} + \Pr\{\mathcal{E}_2\} \to 0$, proving achievability.
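A small Monte Carlo sketch of random coding on a BSC, with minimum-Hamming-distance decoding standing in for joint-typicality decoding (they agree in spirit for the BSC). All parameters are illustrative: crossover $0.05$, $n = 24$, $R \approx 0.25$, so $2^{6} = 64$ codewords.

```python
import random

random.seed(1)
n, M, flip = 24, 64, 0.05

# random codebook: M codewords drawn uniformly (capacity-achieving for BSC)
book = [[random.randint(0, 1) for _ in range(n)] for _ in range(M)]

def send(cw):
    """Pass a codeword through BSC(flip)."""
    return [b ^ (random.random() < flip) for b in cw]

errors, trials = 0, 500
for _ in range(trials):
    y = send(book[0])                                # always send message 0
    d = [sum(a != b for a, b in zip(cw, y)) for cw in book]
    if d.index(min(d)) != 0:                         # nearest-codeword decoding
        errors += 1

print(f"estimated error rate: {errors / trials:.3f}")
```

Since $R \approx 0.25$ is well below $C = 1 - H_2(0.05) \approx 0.71$, the error rate is small even at this tiny blocklength; pushing $R$ toward $C$ would require much larger $n$.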
ex-ch03-11
Hard: Prove the Slepian-Wolf theorem: for correlated i.i.d. sources $(X_i, Y_i)$, separate encoding with joint decoding can achieve the rate region $R_X \ge H(X|Y)$, $R_Y \ge H(Y|X)$, $R_X + R_Y \ge H(X,Y)$.
Use random binning: assign each $x^n$ to a random bin $b(x^n) \in \{1, \dots, 2^{nR_X}\}$.
The decoder finds the unique $\hat{x}^n$ in bin $b(x^n)$ that is jointly typical with $y^n$.
Apply the packing lemma within each bin.
Random binning
Assign each $x^n \in \mathcal{X}^n$ to a bin $b(x^n)$ drawn uniformly from $\{1, \dots, 2^{nR_X}\}$.
Encoder for $X$: send $b(x^n)$. Encoder for $Y$: send $y^n$ losslessly at rate $R_Y = H(Y) + \epsilon$. (This targets the corner point $(H(X|Y), H(Y))$; the other corner follows by symmetry, and the rest of the region by time sharing.)
Decoding
The decoder knows $b(x^n)$ and $y^n$. It finds the unique $\hat{x}^n$ in bin $b(x^n)$ that is jointly typical with $y^n$.
Error analysis
Error occurs if: (a) $(x^n, y^n) \notin A_\epsilon^{(n)}$ (probability $\to 0$ by the AEP), or (b) another $\tilde{x}^n \ne x^n$ in the same bin is jointly typical with $y^n$.
Expected number of $\tilde{x}^n$ in the same bin that are jointly typical with $y^n$: $\approx 2^{nH(X|Y)} \cdot 2^{-nR_X} \to 0$ if $R_X > H(X|Y) + \delta(\epsilon)$.
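The collision count in (b) is simple arithmetic worth seeing with numbers; the values $H(X|Y) = 0.5$ and the rates below are illustrative assumptions, not from the exercise.

```python
# Expected number of wrong jointly typical candidates in the transmitted bin:
# roughly 2^{nH(X|Y)} candidates, each landing in the bin w.p. 2^{-n R_X},
# so E[collisions] ~ 2^{n(H(X|Y) - R_X)}.
H_X_given_Y = 0.5

for R_X, n in [(0.6, 100), (0.6, 200), (0.45, 100)]:
    exponent = n * (H_X_given_Y - R_X)
    verdict = "vanishes" if R_X > H_X_given_Y else "blows up"
    print(f"R_X={R_X}, n={n}: E[collisions] ~ 2^{exponent:.0f} ({verdict})")
```

Any rate margin above $H(X|Y)$ makes the expected collision count decay exponentially in $n$; any deficit makes it explode.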
ex-ch03-12
Hard: Show that the number of joint types for pairs of sequences of length $n$ over alphabets $\mathcal{X}, \mathcal{Y}$ satisfies $|\mathcal{P}_n(\mathcal{X} \times \mathcal{Y})| \le (n+1)^{|\mathcal{X}||\mathcal{Y}|}$. Then show that the conditional type class $T(V \mid x^n)$ has size $\doteq 2^{nH(V|P)}$, where $P$ is the type of $x^n$ and $H(V|P) = \sum_x P(x) H(V(\cdot \mid x))$.
The first part follows from the same counting argument as the marginal case.
For the conditional type class, condition on each value of $x$ separately.
Number of joint types
A joint type assigns a non-negative integer count $n(x,y)$ to each pair $(x,y) \in \mathcal{X} \times \mathcal{Y}$, with total $\sum_{x,y} n(x,y) = n$. Each of the $|\mathcal{X}||\mathcal{Y}|$ counts takes at most $n+1$ values, so the number of joint types is at most $(n+1)^{|\mathcal{X}||\mathcal{Y}|}$.
Conditional type class size
For a fixed $x^n$ with type $P$: at each position $i$ where $x_i = x$, the value $y_i$ must realize the conditional type $V(\cdot \mid x)$ over the $nP(x)$ such positions. The number of such $y^n$ is:
$|T(V \mid x^n)| = \prod_{x} |T_{nP(x)}(V(\cdot \mid x))| \doteq \prod_x 2^{nP(x) H(V(\cdot \mid x))} = 2^{nH(V|P)}$.
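The polynomial bound on the number of joint types is easy to verify exhaustively for binary $\mathcal{X}$ and $\mathcal{Y}$ at small $n$ (the exact count is the number of ways to split $n$ into 4 counts, $\binom{n+3}{3}$, far below the bound $(n+1)^4$):

```python
from itertools import product

def count_joint_types(n):
    """Exact number of joint types of binary pairs (x^n, y^n)."""
    types = set()
    for xs in product([0, 1], repeat=n):
        for ys in product([0, 1], repeat=n):
            counts = tuple(sum(1 for a, b in zip(xs, ys) if (a, b) == pair)
                           for pair in [(0, 0), (0, 1), (1, 0), (1, 1)])
            types.add(counts)
    return len(types)

for n in [2, 4, 6]:
    print(f"n={n}: {count_joint_types(n)} joint types <= bound {(n + 1) ** 4}")
```

Polynomially many types against exponentially many sequences is exactly what makes the method of types work.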
ex-ch03-13
Challenge: (Blowing-up lemma) Prove that for any set $A \subseteq \mathcal{X}^n$ with $P^n(A) \ge 2^{-n\delta}$, and for any $\epsilon > 0$, the "blown-up" set $\Gamma^{n\epsilon} A = \{y^n : d_H(y^n, A) \le n\epsilon\}$ satisfies $P^n(\Gamma^{n\epsilon} A) \ge 1 - e^{-nc}$ for some constant $c > 0$ depending only on $\delta$ and $\epsilon$ (with $\delta$ small enough relative to $\epsilon$).
This is a concentration of measure result.
Use the method of types: most of the probability is concentrated on types near $P$, and sequences of the same type are close in Hamming distance.
Sketch
The proof uses the isoperimetric inequality on the Hamming cube. For i.i.d. sequences, the probability of being at Hamming distance more than $n\epsilon$ from a set of probability at least $2^{-n\delta}$ decays exponentially in $n$. This is the "blowing-up lemma" of Marton (1986), which is a key tool in the strong converse of the channel coding theorem.
The method of types approach: partition by type, use the type class concentration, and note that Hamming neighbors within the same type class are close in probability.
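The phenomenon can be computed exactly for a threshold set under $P = \text{Bernoulli}(1/2)$, an illustrative choice: for $A = \{x^n : \sum_i x_i \ge 0.6n\}$, the $\ell$-blow-up is exactly $\{x^n : \sum_i x_i \ge 0.6n - \ell\}$, so both probabilities are binomial tails.

```python
from math import comb

def tail(n, t):
    """P(Binomial(n, 1/2) >= t), computed exactly."""
    return sum(comb(n, k) for k in range(t, n + 1)) / 2**n

n = 200
t = int(0.6 * n)       # A itself has exponentially small probability
l = int(0.2 * n)       # blow up by Hamming radius 0.2 n

print(f"P(A)         = {tail(n, t):.3e}")      # small
print(f"P(Gamma^l A) = {tail(n, t - l):.3f}")  # close to 1
```

A linear-in-$n$ Hamming enlargement moves the threshold past the mean, turning an exponentially rare set into a near-certain one, which is precisely the content of the lemma.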
ex-ch03-14
Medium: For the binary symmetric channel with crossover probability $p$: (a) Compute the capacity $C$. (b) If we use rate $R = C$ bits/use, estimate the decoding error exponent at the uniform input distribution.
$C = 1 - H_2(p)$, where $H_2$ is the binary entropy function.
Capacity
$C = 1 - H_2(p)$ bits.
Error exponent estimate
At uniform input: $I(X;Y) = H(Y) - H(Y|X) = 1 - H_2(p) = C$.
The packing lemma gives error probability $\lesssim 2^{nR} \cdot 2^{-nI(X;Y)} = 2^{-n(C-R)}$.
This bound is useless because $R = C$! At capacity, the packing-lemma bound $2^{-n(C-R)} = 1$ does not decay. We need $R < C$ for a positive exponent. For any $R < C$: exponent $C - R > 0$, and the bound $2^{-n(C-R)} \to 0$, which decays nicely.
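The capacity and packing-lemma exponent are one-liners to evaluate; the crossover $p = 0.11$ below is an illustrative value, not necessarily the exercise's.

```python
from math import log2

def H2(q):
    """Binary entropy in bits."""
    return -q * log2(q) - (1 - q) * log2(1 - q)

p = 0.11
C = 1 - H2(p)                       # BSC capacity
print(f"C = {C:.4f} bits/use")

for R in [C, 0.9 * C, 0.5 * C]:     # exponent C - R of the packing bound
    print(f"  R = {R:.3f}: exponent C - R = {C - R:.4f}")
```

At $p = 0.11$ the capacity is almost exactly $0.5$ bits/use, and only rates strictly below it yield a positive exponent.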
ex-ch03-15
Easy: How many distinct types can arise from binary sequences of length $n$?
A binary type is determined by its number of ones $k$, where $0 \le k \le n$.
Count
A binary type is $\big(1 - \frac{k}{n},\ \frac{k}{n}\big)$ for $k = 0, 1, \dots, n$. There are $n + 1$ possible types.
This matches the bound $|\mathcal{P}_n| \le (n+1)^{|\mathcal{X}|} = (n+1)^2$, which is loose. The exact count is $n + 1$ for binary alphabets.