Exercises
ex10-01
Easy. Compute the capacity of a real AWGN channel with signal power $P = 10$ mW and noise power $N = 1$ mW.
Use $C = \frac{1}{2}\log_2(1 + \mathrm{SNR})$.
The SNR is $P/N$.
Compute SNR
$\mathrm{SNR} = P/N = 10$ (or 10 dB).
Apply capacity formula
$C = \frac{1}{2}\log_2(1 + 10) = \frac{1}{2}\log_2 11 \approx 1.73$ bits per real channel use.
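The arithmetic can be checked in a few lines of Python; the power values below are the assumed ones from the problem statement:

```python
from math import log2

# Real AWGN capacity: C = (1/2) log2(1 + SNR), with SNR = P/N in linear units.
P_mW, N_mW = 10.0, 1.0          # assumed signal and noise powers (SNR = 10 dB)
snr = P_mW / N_mW
C = 0.5 * log2(1 + snr)
print(snr, C)                    # 10.0, about 1.73 bits per real channel use
```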
ex10-02
Easy. A complex AWGN channel has an SNR of $\gamma$ dB. What spectral efficiency (bits/s/Hz) can be achieved?
Convert dB to linear: $\mathrm{SNR} = 10^{\gamma/10}$.
For complex AWGN: $C = \log_2(1 + \mathrm{SNR})$.
Convert SNR
$\mathrm{SNR} = 10^{\gamma/10}$.
Compute spectral efficiency
$C = \log_2\left(1 + 10^{\gamma/10}\right)$ bits/s/Hz. For example, $\gamma = 20$ dB gives $C = \log_2 101 \approx 6.66$ bits/s/Hz.
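A one-line helper makes the dB-to-linear conversion explicit; the 20 dB input is only an illustrative value, not original problem data:

```python
from math import log2

def spectral_efficiency(snr_db: float) -> float:
    """Complex AWGN spectral efficiency log2(1 + SNR), with SNR given in dB."""
    return log2(1 + 10 ** (snr_db / 10))

print(spectral_efficiency(20.0))   # log2(101), about 6.66 bits/s/Hz
```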
ex10-03
Easy. Show that for the real AWGN channel $Y = X + Z$, $Z \sim \mathcal{N}(0, \sigma^2)$, the mutual information with input $X \sim \mathcal{N}(0, P)$ equals $\frac{1}{2}\log_2\left(1 + \frac{P}{\sigma^2}\right)$.
Use $I(X;Y) = h(Y) - h(Y|X)$.
$Y = X + Z$ with $X$ and $Z$ independent. What is $h(Y|X)$?
Compute conditional entropy
$h(Y|X) = h(Z) = \frac{1}{2}\log_2(2\pi e \sigma^2)$.
Compute output entropy
Since $X$ and $Z$ are independent Gaussians, $Y \sim \mathcal{N}(0, P + \sigma^2)$, so $h(Y) = \frac{1}{2}\log_2\left(2\pi e (P + \sigma^2)\right)$.
Subtract
$I(X;Y) = h(Y) - h(Y|X) = \frac{1}{2}\log_2\frac{P + \sigma^2}{\sigma^2} = \frac{1}{2}\log_2\left(1 + \frac{P}{\sigma^2}\right)$.
ex10-04
Easy. Compute $E_b/N_0$ (in dB) for a system operating at spectral efficiency $\rho$ bits/s/Hz on a complex AWGN channel at capacity.
At capacity: $\rho = \log_2(1 + \mathrm{SNR})$, so $\mathrm{SNR} = 2^\rho - 1$.
$E_b/N_0 = \mathrm{SNR}/\rho$.
Find the required SNR
$\mathrm{SNR} = 2^\rho - 1$.
Compute $E_b/N_0$
$E_b/N_0 = \frac{2^\rho - 1}{\rho}$. For example, $\rho = 2$ gives $E_b/N_0 = 3/2$; in dB: $10\log_{10} 1.5 \approx 1.76$ dB.
This is about $3.35$ dB above the Shannon limit $E_b/N_0 = \ln 2 \approx -1.59$ dB.
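A quick numerical check of the $E_b/N_0$ formula; it also recovers the Shannon limit as the spectral efficiency goes to zero (a minimal sketch, not part of the original solution):

```python
from math import log10

def ebn0_db(rho: float) -> float:
    # At capacity: SNR = 2**rho - 1 and Eb/N0 = SNR / rho
    return 10 * log10((2 ** rho - 1) / rho)

print(ebn0_db(2.0))     # about 1.76 dB for rho = 2
print(ebn0_db(1e-6))    # approaches 10*log10(ln 2), about -1.59 dB
```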
ex10-05
Medium. Prove that among all input distributions with zero mean and variance at most $P$, the Gaussian maximizes $I(X;Y)$ for the AWGN channel $Y = X + Z$, $Z \sim \mathcal{N}(0, \sigma^2)$.
Write $I(X;Y) = h(Y) - h(Y|X)$.
The conditional entropy $h(Y|X) = h(Z)$ does not depend on the input distribution.
Use the maximum entropy property of the Gaussian.
Reduce to maximizing $h(Y)$
Since $h(Y|X) = h(Z) = \frac{1}{2}\log_2(2\pi e \sigma^2)$ is fixed, $I(X;Y) = h(Y) - h(Z)$. Maximizing $I(X;Y)$ is equivalent to maximizing $h(Y)$.
Apply the Gaussian entropy maximizer
$\mathrm{Var}(Y) = \mathrm{Var}(X) + \sigma^2 \le P + \sigma^2$. Among all distributions with variance at most $P + \sigma^2$, the Gaussian maximizes the differential entropy, so $h(Y) \le \frac{1}{2}\log_2\left(2\pi e (P + \sigma^2)\right)$.
Identify the optimal input
$Y$ is Gaussian if and only if $X$ is Gaussian (since $Z$ is Gaussian and independent of $X$). Therefore $X \sim \mathcal{N}(0, P)$ is the unique maximizer.
ex10-06
Medium. Consider $K = 4$ parallel complex Gaussian sub-channels with gains $g_k$ and noise variances $\sigma_k^2$. The total power is $P$. Find the water-filling power allocation and the total capacity.
The inverse gains are $n_k = \sigma_k^2 / g_k$.
Start by assuming all channels are active and check for negative powers.
If any power is negative, remove the weakest channel and retry.
Compute inverse gains
$n_k = \sigma_k^2 / g_k$ for $k = 1, \dots, 4$, relabelled so that $n_1 \le n_2 \le n_3 \le n_4$.
Try all 4 active
$\mu = \frac{1}{4}\left(P + \sum_{k=1}^4 n_k\right)$, $P_k = \mu - n_k$. But if $\mu < n_4$, then $P_4 < 0$: channel 4 must be switched off.
Try 3 active
$\mu = \frac{1}{3}\left(P + n_1 + n_2 + n_3\right)$, $P_4 = 0$.
$P_k = \mu - n_k$ for $k = 1, 2, 3$. If all three are non-negative, this is the water-filling allocation.
Compute capacity
$C = \sum_{k:\,P_k > 0} \log_2\left(1 + \frac{g_k P_k}{\sigma_k^2}\right) = \sum_{k:\,P_k > 0} \log_2\frac{\mu}{n_k}$ bits per channel use.
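The deactivate-and-retry procedure sketched above can be written generically as follows; the example gains are illustrative, not the exercise's actual numbers:

```python
def water_filling(inv_gains, P):
    """Water-filling over channels with inverse gains n_k = sigma_k^2 / g_k.

    Tries all channels active, then drops the weakest until all powers
    are non-negative. Returns (powers, water level mu).
    """
    order = sorted(range(len(inv_gains)), key=lambda k: inv_gains[k])
    n = [inv_gains[k] for k in order]
    for j in range(len(n), 0, -1):
        mu = (P + sum(n[:j])) / j
        if mu > n[j - 1]:            # weakest remaining channel still active
            break
    powers = [0.0] * len(inv_gains)
    for i in range(j):
        powers[order[i]] = mu - n[i]
    return powers, mu

p, mu = water_filling([1.0, 2.0, 4.0, 8.0], P=5.0)
print(p, mu)    # [3.0, 2.0, 0.0, 0.0], water level 4.0
```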
ex10-07
Medium. Show that equal power allocation achieves the same capacity as water-filling when $P \to \infty$ (i.e., when the total power grows with the number of channels $K$ and the gains held fixed).
At high power, all channels become active.
Write $C_{\mathrm{WF}}$ and compare with equal power.
Show that the difference vanishes relative to $C$.
Water-filling at high SNR
When $P$ is large, $\mu = \frac{1}{K}\left(P + \sum_k n_k\right) > \max_k n_k$, so all channels are active with $P_k = \mu - n_k \approx P/K$.
Compare capacities
Water-filling: $C_{\mathrm{WF}} = \sum_k \log_2\frac{\mu}{n_k}$.
Equal power: $C_{\mathrm{EP}} = \sum_k \log_2\left(1 + \frac{P/K}{n_k}\right)$.
The difference is bounded by a constant (depending on the channel gains) while both capacities grow as $K \log_2 P$. Therefore the ratio $C_{\mathrm{EP}}/C_{\mathrm{WF}} \to 1$.
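A small numerical experiment illustrates the vanishing gap; the inverse gains are made up for illustration, and the water-filling formula below assumes all channels active, which holds for the powers used:

```python
from math import log2

n = [1.0, 2.0, 4.0, 8.0]    # illustrative inverse gains (assumption)

def wf_capacity(P):
    mu = (P + sum(n)) / len(n)          # all channels active at high power
    return sum(log2(mu / nk) for nk in n)

def ep_capacity(P):
    return sum(log2(1 + (P / len(n)) / nk) for nk in n)

for P in (1e2, 1e4, 1e6):
    print(P, ep_capacity(P) / wf_capacity(P))   # ratio approaches 1
```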
ex10-08
Medium. A continuous-time AWGN channel has bandwidth $B = 1$ MHz and noise spectral density $N_0 = 10^{-9}$ W/Hz. Plot the capacity $C(P)$ (bits/s) as a function of transmit power $P$ from 0.1 mW to 100 mW. At what power does doubling $P$ yield less than 10% capacity increase?
Use $C = B \log_2\left(1 + \frac{P}{N_0 B}\right)$.
The 10% threshold: solve $C(2P) = 1.1\, C(P)$.
Express the noise power
$N = N_0 B = 10^{-9} \times 10^6 = 10^{-3}$ W = 1 mW.
Capacity at extreme values
At $P = 0.1$ mW: $\mathrm{SNR} = 0.1$, $C = 10^6 \log_2 1.1 \approx 0.14$ Mbit/s.
At $P = 100$ mW: $\mathrm{SNR} = 100$, $C = 10^6 \log_2 101 \approx 6.66$ Mbit/s.
Find the threshold power
We need $\log_2(1 + 2\gamma) < 1.1 \log_2(1 + \gamma)$ with $\gamma = P/N$. Numerically, this first occurs at $\gamma \approx 10^3$ ($P \approx 1000$ mW), where $C(P) \approx 9.97$ Mbit/s and $C(2P) \approx 10.97$ Mbit/s. Beyond this point, capacity is firmly in the logarithmic regime and power becomes an inefficient resource.
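The threshold can be found by a simple sweep; $B$ and $N_0$ are the assumed values from above, and the loop just checks the 10% condition numerically:

```python
from math import log2

B, N0 = 1e6, 1e-9           # assumed: 1 MHz bandwidth, so N = N0*B = 1 mW

def C(P):                    # capacity in bits/s
    return B * log2(1 + P / (N0 * B))

P = 1e-4                     # start the sweep at 0.1 mW
while C(2 * P) / C(P) >= 1.10:
    P *= 1.01
print(P)                     # about 1 W, i.e. SNR around 1000
```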
ex10-09
Medium. Derive the sphere-packing bound: show that the maximum number of codewords $M$ for a real AWGN channel with block length $n$, power $P$, and noise variance $\sigma^2$ satisfies
$M \le \left(\frac{P + \sigma^2}{\sigma^2}\right)^{n/2}.$
Each codeword is surrounded by a noise sphere of radius $\sqrt{n\sigma^2}$.
All received vectors lie in a sphere of radius $\sqrt{n(P + \sigma^2)}$.
Non-overlapping noise spheres must fit inside the received sphere.
Signal and noise sphere radii
Codewords satisfy $\|x\|^2 \le nP$. The noise concentrates on a sphere of radius $\sqrt{n\sigma^2}$. The received vector $y = x + z$ lies in a sphere of radius $\sqrt{n(P + \sigma^2)}$.
Volume ratio
The volume of a sphere of radius $r$ in $\mathbb{R}^n$ is $K_n r^n$, where $K_n$ is a constant depending only on $n$.
Rate bound
$M \le \frac{K_n \left(n(P + \sigma^2)\right)^{n/2}}{K_n \left(n\sigma^2\right)^{n/2}} = \left(1 + \frac{P}{\sigma^2}\right)^{n/2}$, so $R = \frac{1}{n}\log_2 M \le \frac{1}{2}\log_2\left(1 + \frac{P}{\sigma^2}\right)$.
ex10-10
Hard. Prove that the capacity of the complex AWGN channel $Y = X + Z$, $Z \sim \mathcal{CN}(0, \sigma^2)$, with power constraint $\mathbb{E}[|X|^2] \le P$ is $C = \log_2\left(1 + \frac{P}{\sigma^2}\right)$.
Write $Z = Z_R + j Z_I$ and decompose into two real channels.
The circularly symmetric Gaussian $\mathcal{CN}(0, \sigma^2)$ has independent real and imaginary parts, each $\mathcal{N}(0, \sigma^2/2)$.
Show that the optimal input is $X \sim \mathcal{CN}(0, P)$ using the complex entropy maximizer.
Express mutual information
$I(X;Y) = h(Y) - h(Y|X) = h(Y) - h(Z)$, where $h(Z) = \log_2(\pi e \sigma^2)$ for $Z \sim \mathcal{CN}(0, \sigma^2)$.
Maximize output entropy
$\mathbb{E}[|Y|^2] \le P + \sigma^2$. By the complex Gaussian entropy maximizer: $h(Y) \le \log_2\left(\pi e (P + \sigma^2)\right)$, achieved when $X \sim \mathcal{CN}(0, P)$.
Compute capacity
$C = \log_2\left(\pi e (P + \sigma^2)\right) - \log_2(\pi e \sigma^2) = \log_2\left(1 + \frac{P}{\sigma^2}\right)$.
ex10-11
Hard. Consider $K$ parallel Gaussian channels with inverse gains $n_1 \le n_2 \le \dots \le n_K$ (gains sorted in decreasing order). Show that the number of active channels under water-filling is $K^* = \max\{j : \mu_j > n_j\}$, where $\mu_j = \frac{1}{j}\left(P + \sum_{i=1}^j n_i\right)$.
Assume the first $j$ channels are active and compute $\mu_j$.
The $j$-th channel is active iff $\mu_j > n_j$.
Show that $\mu_j$ is decreasing in $j$ (adding weaker channels lowers the water level).
Water level for $j$ active channels
If the first $j$ channels are active: $\sum_{i=1}^j (\mu_j - n_i) = P$, so $\mu_j = \frac{1}{j}\left(P + \sum_{i=1}^j n_i\right)$.
Activity condition
Channel $j$ is active iff $P_j = \mu_j - n_j > 0$, i.e., $\mu_j > n_j$.
Monotonicity
$\mu_{j+1} - \mu_j = \frac{n_{j+1} - \mu_j}{j + 1}$. If channel $j+1$ is active, then $n_{j+1} < \mu_j$, so $\mu_{j+1} < \mu_j$. The water level decreases as we add more channels. The optimal $K^*$ is the largest $j$ such that $\mu_j > n_j$.
ex10-12
Hard. A discrete-time ISI channel has impulse response $h = (h_0, h_1, h_2)$ (three taps). Using OFDM with $N = 64$ subcarriers and cyclic prefix length $L_{\mathrm{cp}} = 2$, compute the capacity per channel use with water-filling at total power $P$. Account for the cyclic prefix overhead.
Compute the frequency response: $H_k = \sum_{\ell=0}^{2} h_\ell e^{-j 2\pi k \ell / N}$.
Apply water-filling over the gains $|H_k|^2$.
Include the CP overhead factor $\frac{N}{N + L_{\mathrm{cp}}}$, where $L_{\mathrm{cp}} = 2$.
Compute frequency response
$H_k = \sum_{\ell=0}^{2} h_\ell e^{-j 2\pi k \ell / 64}$ for $k = 0, \dots, 63$. The gains $|H_k|^2$ vary across subcarriers.
Water-filling
Solve for $\mu$ such that $\sum_k \left(\mu - \frac{\sigma^2}{|H_k|^2}\right)^+ = P$. This requires numerical computation (bisection on $\mu$).
Capacity with overhead
$C = \frac{N}{N + L_{\mathrm{cp}}} \cdot \frac{1}{N} \sum_k \log_2\left(1 + \frac{|H_k|^2 P_k}{\sigma^2}\right)$. With the overhead factor $64/66 \approx 0.970$, the result is $C \approx 3.1$ bits per channel use.
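The bisection step can be sketched as follows; the taps, noise variance, and total power below are placeholders, since the exercise's actual numbers are not reproduced here:

```python
import cmath
from math import log2

h = [1.0, 0.5, 0.25]                  # illustrative three-tap channel (assumption)
N, Lcp, sigma2, P = 64, 2, 1.0, 64.0  # P and sigma2 are placeholders

# Subcarrier gains |H_k|^2 from the DFT of the impulse response
H2 = [abs(sum(h[l] * cmath.exp(-2j * cmath.pi * k * l / N)
             for l in range(len(h)))) ** 2 for k in range(N)]
n = [sigma2 / g for g in H2]          # inverse gains per subcarrier

def poured(mu):                        # power used at water level mu
    return sum(max(mu - nk, 0.0) for nk in n)

lo, hi = 0.0, max(n) + P              # bisection bracket for mu
for _ in range(100):
    mu = (lo + hi) / 2
    if poured(mu) < P:
        lo = mu
    else:
        hi = mu

# Capacity per channel use, including the cyclic-prefix overhead
C = (N / (N + Lcp)) * sum(log2(mu / nk) for nk in n if mu > nk) / N
print(C)
```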
ex10-13
Hard. Show that the capacity of the band-limited AWGN channel, $C(B) = B \log_2\left(1 + \frac{P}{N_0 B}\right)$, is concave in $B$ and find its behavior as $B \to 0$ and $B \to \infty$.
Compute $C''(B)$ and show it is negative.
For $B \to \infty$: use $\ln(1 + x) \approx x$ for small $x = \frac{P}{N_0 B}$.
For $B \to 0$: use $\log_2(1 + x) \approx \log_2 x$ for large $x$.
First derivative
Let $x = \frac{P}{N_0 B}$, so $\frac{dx}{dB} = -\frac{x}{B}$.
$C'(B) = \frac{1}{\ln 2}\left[\ln(1 + x) - \frac{x}{1 + x}\right]$.
Second derivative is negative
$C''(B) = -\frac{x^2}{B (1 + x)^2 \ln 2}$.
Since $C''(B) < 0$ for all $B > 0$, $C(B)$ is strictly concave.
Limiting behavior
As $B \to 0$: $C(B) \approx B \log_2\frac{P}{N_0 B} \to 0$ (no bandwidth, no transmission), while the slope $C'(B)$ grows without bound.
As $B \to \infty$: $C(B) \to \frac{P}{N_0 \ln 2} \approx 1.44\,\frac{P}{N_0}$ bits/s (finite, the power-limited ceiling).
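Both the concavity and the two limits are easy to confirm numerically (a sketch with $P = N_0 = 1$):

```python
from math import log2, log

P, N0 = 1.0, 1.0

def C(B):
    return B * log2(1 + P / (N0 * B))

# Strict concavity: the midpoint value lies above the chord
for B1, B2 in [(0.1, 1.0), (1.0, 10.0), (0.5, 50.0)]:
    assert C((B1 + B2) / 2) > (C(B1) + C(B2)) / 2

# B -> infinity: approaches P/(N0 ln 2) ~ 1.4427; B -> 0: vanishes
print(C(1e6), P / (N0 * log(2)), C(1e-6))
```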
ex10-14
Challenge. (Entropy power inequality approach to the converse) Prove the converse of the AWGN capacity theorem using the entropy power inequality (EPI): for independent continuous random variables $X$ and $Z$,
$e^{2h(X+Z)} \ge e^{2h(X)} + e^{2h(Z)}.$
Show what bound on $h(Y)$ this implies.
Apply the EPI with $Y = X + Z$.
Use $h(Z) = \frac{1}{2}\ln(2\pi e \sigma^2)$.
Use the fact that $h(X) \le \frac{1}{2}\ln(2\pi e P)$.
Apply EPI
$e^{2h(Y)} \ge e^{2h(X)} + e^{2h(Z)} = e^{2h(X)} + 2\pi e \sigma^2$.
Bound using entropy maximizer
$h(X) \le \frac{1}{2}\ln(2\pi e P)$ and $h(Z) = \frac{1}{2}\ln(2\pi e \sigma^2)$.
Conclude
With $X \sim \mathcal{N}(0, P)$, $e^{2h(X)} = 2\pi e P$, so $h(Y) \ge \frac{1}{2}\ln\left(2\pi e (P + \sigma^2)\right)$.
Note, however, that this is a lower bound on $h(Y)$, not an upper bound: the EPI points in the opposite direction from what a converse needs. The correct converse uses the maximum entropy property directly: $h(Y) \le \frac{1}{2}\ln\left(2\pi e (P + \sigma^2)\right)$. The EPI instead provides an alternative proof that the Gaussian input is optimal (achievability), not the converse.
ex10-15
Challenge. (Water-filling duality with rate-distortion) Show that the rate-distortion function for parallel Gaussian sources $X_i \sim \mathcal{N}(0, \sigma_i^2)$ with MSE distortion has the "reverse water-filling" solution
$D_i = \min(\theta, \sigma_i^2), \qquad R_i = \frac{1}{2}\log_2\frac{\sigma_i^2}{D_i},$
where $\theta$ is chosen so that $\sum_i D_i = D$, and explain the duality with channel coding water-filling.
The R-D function is $\min \sum_i R_i(D_i)$ subject to $\sum_i D_i \le D$.
For a Gaussian source, $R_i(D_i) = \frac{1}{2}\log_2\frac{\sigma_i^2}{D_i}$ for $D_i \le \sigma_i^2$.
Compare the KKT conditions with those of channel coding water-filling.
Formulate the optimization
Minimize $\sum_i \frac{1}{2}\log_2\frac{\sigma_i^2}{D_i}$ subject to $\sum_i D_i \le D$ and $0 \le D_i \le \sigma_i^2$.
KKT conditions
The Lagrangian is $L = \sum_i \frac{1}{2}\log_2\frac{\sigma_i^2}{D_i} + \lambda \sum_i D_i$.
KKT: $-\frac{1}{2 D_i \ln 2} + \lambda = 0$ if $D_i < \sigma_i^2$ (active). So $D_i = \theta = \frac{1}{2\lambda \ln 2}$ for active sources. If $\theta \ge \sigma_i^2$, then $D_i = \sigma_i^2$ (zero rate).
Duality
Channel coding: allocate MORE resources (power) to BETTER channels. Source coding: allocate MORE distortion to WEAKER sources (smaller variance).
Both are convex optimizations with the same mathematical structure. The water level $\mu$ (channel) and the distortion level $\theta$ (source) play dual roles. This duality extends to the separation theorem: the total system is optimal when source and channel coding are separately optimized.
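Reverse water-filling is just as easy to implement as the channel-coding version; the source variances below are illustrative only:

```python
from math import log2

def reverse_water_filling(variances, D):
    """Find theta with sum_i min(theta, sigma_i^2) = D by bisection."""
    lo, hi = 0.0, max(variances)
    for _ in range(100):
        theta = (lo + hi) / 2
        if sum(min(theta, v) for v in variances) < D:
            lo = theta
        else:
            hi = theta
    D_i = [min(theta, v) for v in variances]
    return D_i, theta

var = [4.0, 1.0, 0.25]                      # illustrative source variances
D_i, theta = reverse_water_filling(var, D=1.5)
R_i = [0.5 * log2(v / d) for v, d in zip(var, D_i)]
print(D_i, theta, R_i)   # weakest source gets D_i = sigma_i^2, hence zero rate
```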
ex10-16
Medium. A system uses OFDM with $N$ subcarriers over a channel with an exponentially decaying power delay profile, $\mathbb{E}[|h_\ell|^2] \propto e^{-\ell/\tau}$ for $\ell = 0, \dots, L-1$. Compare the capacity with water-filling vs. equal power allocation at $\mathrm{SNR} = 10$ dB and $\mathrm{SNR} = 30$ dB.
Generate a realization of $h_\ell \sim \mathcal{CN}\!\left(0, e^{-\ell/\tau}\right)$ for each tap.
Compute $H_k$ via the DFT.
Apply water-filling and equal power allocation separately.
Setup
Generate the channel taps and compute the $N$-point DFT to get $H_k$. The frequency response will show frequency-selective fading with approximately 6 dB of variation.
At SNR = 10 dB
At moderate SNR, some subcarriers may be in deep fades. Water-filling will shut off the deeply faded subcarriers and concentrate power on the good ones. The capacity gain of WF over EP is typically 5-15% at this SNR.
At SNR = 30 dB
At high SNR, all subcarriers have positive SNR even in fades. Water-filling still outperforms equal power, but the gap shrinks to about 1-3%, confirming the theoretical result that equal power is asymptotically optimal.
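A Monte Carlo sketch of this comparison; the tap count, decay constant, and seed are arbitrary choices, not the exercise's parameters:

```python
import cmath, random
from math import log2, exp, sqrt

random.seed(1)
N, L, tau = 64, 8, 2.0     # illustrative parameters (assumptions)

# Rayleigh taps with exponentially decaying power delay profile
h = [complex(random.gauss(0, 1), random.gauss(0, 1)) * sqrt(exp(-l / tau) / 2)
     for l in range(L)]
gains = [abs(sum(h[l] * cmath.exp(-2j * cmath.pi * k * l / N)
              for l in range(L))) ** 2 for k in range(N)]

def wf(P):
    # Water-filling capacity over the subcarrier gains (unit noise variance)
    n = sorted(1.0 / g for g in gains)
    for j in range(len(n), 0, -1):
        mu = (P + sum(n[:j])) / j
        if mu > n[j - 1]:
            break
    return sum(log2(mu / nk) for nk in n[:j])

def ep(P):
    return sum(log2(1 + (P / N) * g) for g in gains)

for snr_db in (10, 30):
    P = N * 10 ** (snr_db / 10)
    print(snr_db, wf(P) / ep(P))     # WF-over-EP gain shrinks with SNR
```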
ex10-17
Easy. What is the maximum capacity (in bits/s) of a channel with bandwidth $B = 200$ kHz and $\mathrm{SNR} = 30$ dB? Compare this with the data rate of GSM (270.833 kbit/s gross, ~13 kbit/s net speech rate).
Convert 30 dB to linear: $\mathrm{SNR} = 1000$.
GSM uses GMSK modulation with 1 bit/symbol at 270.833 ksymbols/s.
Shannon capacity
$C = B \log_2(1 + \mathrm{SNR}) = 200{,}000 \cdot \log_2 1001 \approx 1.99$ Mbit/s.
Compare with GSM
GSM achieves 270.833 kbit/s gross (before error correction) in 200 kHz, i.e., $\approx 1.35$ bits/s/Hz. The Shannon limit is $\log_2 1001 \approx 9.97$ bits/s/Hz. GSM operates at only about 14% of the Shannon limit, reflecting its 1980s technology (no turbo/LDPC codes, no adaptive modulation).
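The same comparison in code ($B$ and the 30 dB SNR are the values assumed in this exercise):

```python
from math import log2

B = 200e3                                  # 200 kHz
C = B * log2(1 + 10 ** (30 / 10))          # 30 dB -> SNR = 1000
gsm_gross = 270.833e3
print(C / 1e6, gsm_gross / C)              # about 1.99 Mbit/s; GSM at ~14%
```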
ex10-18
Medium. Verify the MMSE-entropy bound for the scalar AWGN channel: if $Y = X + Z$ with $X \sim \mathcal{N}(0, P)$ and $Z \sim \mathcal{N}(0, \sigma^2)$, show that $h(X|Y) = \frac{1}{2}\log_2(2\pi e \cdot \mathrm{mmse})$, where $\mathrm{mmse} = \frac{P\sigma^2}{P + \sigma^2}$.
Compute the MMSE for jointly Gaussian $(X, Y)$.
Compute $h(X|Y)$ and verify equality.
MMSE
For jointly Gaussian $(X, Y)$: $\hat{X} = \mathbb{E}[X|Y] = \frac{P}{P + \sigma^2} Y$.
$\mathrm{mmse} = \mathbb{E}\left[(X - \hat{X})^2\right] = P - \frac{P^2}{P + \sigma^2} = \frac{P\sigma^2}{P + \sigma^2}$.
Conditional entropy
Given $Y$, $X$ is Gaussian with variance $\frac{P\sigma^2}{P + \sigma^2}$ (independent of the realization of $Y$).
$h(X|Y) = \frac{1}{2}\log_2\left(2\pi e \frac{P\sigma^2}{P + \sigma^2}\right)$.
Verify
$\frac{1}{2}\log_2(2\pi e \cdot \mathrm{mmse}) = \frac{1}{2}\log_2\left(2\pi e \frac{P\sigma^2}{P + \sigma^2}\right) = h(X|Y)$, with equality because $(X, Y)$ is jointly Gaussian.
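The equality can also be confirmed numerically via $h(X|Y) = h(X) - I(X;Y)$; the values of $P$ and $\sigma^2$ below are arbitrary:

```python
from math import log2, pi, e

P, sigma2 = 4.0, 1.0
mmse = P * sigma2 / (P + sigma2)              # = 0.8
h_cond = 0.5 * log2(2 * pi * e * mmse)        # claimed h(X|Y)
h_x = 0.5 * log2(2 * pi * e * P)              # h(X) for X ~ N(0, P)
I = 0.5 * log2(1 + P / sigma2)                # mutual information
print(h_cond, h_x - I)                        # identical: h(X|Y) = h(X) - I(X;Y)
```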