Chapter 11 Summary: Information Theory for Wireless Channels
Key Points
1. Entropy and mutual information provide the mathematical foundation for quantifying information. Channel capacity is the maximum rate for reliable communication, and Shannon's coding theorem guarantees that capacity-approaching codes exist.
2. AWGN capacity reveals the fundamental bandwidth-power trade-off. The Shannon limit of E_b/N_0 = ln 2 ≈ -1.59 dB is the absolute minimum energy per bit for reliable communication, approached in the infinite-bandwidth regime.
3. Flat-fading channels require distinguishing ergodic capacity (long codewords spanning many fading realisations, C_erg = E[log2(1 + |h|² SNR)]) from outage capacity (delay-limited systems where the channel is quasi-static). Water-filling in time with full CSI allocates more power to stronger channel states.
4. Frequency-selective channels decompose into parallel independent sub-channels via OFDM. Water-filling across frequency allocates power optimally: more to strong sub-carriers, none to deeply faded ones. The total capacity is the sum of individual sub-channel capacities.
5. Multiple antennas increase capacity through array gain. SIMO with N_r antennas achieves C = log2(1 + N_r SNR) via MRC. MISO with CSIT achieves the same through beamforming, but without CSIT provides only diversity gain. MIMO (Chapter 15) offers linear capacity scaling with min(N_t, N_r).
6. Practical coding (turbo, LDPC, polar) closes the gap to capacity from 8-10 dB (uncoded) to under 1 dB. Adaptive modulation and coding (AMC) provides a finite-rate approximation to water-filling, achieving 85-95% of ergodic capacity in 4G/5G systems.
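The bandwidth-power trade-off of point 2 can be sketched numerically; the power and noise-density values below are illustrative toy numbers, not from the chapter:

```python
import math

def awgn_capacity(p_watts, n0, bandwidth_hz):
    """Shannon AWGN capacity C = B * log2(1 + P / (N0 * B)) in bit/s."""
    return bandwidth_hz * math.log2(1.0 + p_watts / (n0 * bandwidth_hz))

p, n0 = 1.0, 1e-3                        # 1 W over N0 = 1e-3 W/Hz (toy values)
c_narrow = awgn_capacity(p, n0, 1e3)     # power-limited narrowband regime
c_wide = awgn_capacity(p, n0, 1e9)       # approaching the wideband limit
c_inf = p / (n0 * math.log(2))           # infinite-bandwidth ceiling P / (N0 ln 2)

# Minimum energy per bit for reliable communication: Eb/N0 >= ln 2
shannon_limit_db = 10 * math.log10(math.log(2))   # ~ -1.59 dB
```

Growing the bandwidth at fixed power raises capacity only up to the ceiling `c_inf`, which is where the -1.59 dB limit comes from.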
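The ergodic capacity of point 3 can be estimated by Monte Carlo averaging over fading realisations; the Rayleigh-fading model and the 10 dB average SNR here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
snr = 10.0                                # average SNR of 10 (i.e. 10 dB)
h2 = rng.exponential(1.0, size=200_000)   # |h|^2 for Rayleigh fading, E[|h|^2] = 1
c_ergodic = float(np.mean(np.log2(1.0 + snr * h2)))
c_awgn = float(np.log2(1.0 + snr))        # unfaded channel at the same average SNR
# By Jensen's inequality the average of log2(1 + snr*|h|^2) falls below
# log2(1 + snr): fading costs capacity at the same average SNR.
```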
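The water-filling allocation of point 4 can be sketched as follows; the noise-normalised sub-channel gains and the bisection search for the water level are implementation choices, not prescribed by the chapter:

```python
import numpy as np

def water_filling(gains, total_power):
    """Power allocation over parallel sub-channels with noise-normalised gains
    g_k: maximise sum_k log2(1 + g_k * p_k) subject to sum_k p_k = P.
    Optimal p_k = max(mu - 1/g_k, 0); the water level mu is found by bisection."""
    inv = 1.0 / np.asarray(gains, dtype=float)
    lo, hi = 0.0, total_power + inv.max()
    for _ in range(100):
        mu = 0.5 * (lo + hi)
        if np.maximum(mu - inv, 0.0).sum() > total_power:
            hi = mu
        else:
            lo = mu
    return np.maximum(0.5 * (lo + hi) - inv, 0.0)

gains = np.array([4.0, 1.0, 0.1])         # strong, medium, deeply faded
power = water_filling(gains, total_power=1.0)
capacity = float(np.sum(np.log2(1.0 + gains * power)))
# The strong sub-carrier gets most of the power; the deeply faded one gets none.
```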
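The array gain of point 5 can be checked with a one-line formula; the 10 dB SNR and unit-gain channels are arbitrary example assumptions:

```python
import math

def simo_mrc_capacity(snr, n_rx):
    """SIMO capacity with maximum-ratio combining: coherent combining across
    n_rx receive antennas scales the post-combining SNR by n_rx (array gain),
    assuming unit-gain channels."""
    return math.log2(1.0 + n_rx * snr)

snr = 10.0                           # 10 dB average SNR, linear value 10
c_siso = simo_mrc_capacity(snr, 1)   # log2(11) ~ 3.46 bit/s/Hz
c_simo = simo_mrc_capacity(snr, 4)   # log2(41) ~ 5.36 bit/s/Hz
# Array gain only adds capacity logarithmically, in contrast to the
# linear scaling of spatial multiplexing with MIMO.
```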
Looking Ahead
Chapter 12 develops channel coding in detail: convolutional codes, turbo codes, LDPC codes, and polar codes. The capacity results from this chapter provide the performance benchmarks against which all practical codes are measured. Chapter 15 extends the capacity analysis to full MIMO systems, where multiple spatial streams yield a linear capacity increase with the number of antennas.