Chapter Summary
Key Points
1. The normal approximation $R^*(n,\epsilon) \approx C - \sqrt{V/n}\,Q^{-1}(\epsilon)$ is the fundamental formula of finite-blocklength coding. The channel dispersion $V$ governs the speed of convergence to capacity: two channels with the same capacity but different dispersions behave very differently at short blocklengths.
2. The RCU bound and the meta-converse provide tight, computable, non-asymptotic bounds on $R^*(n,\epsilon)$ that replace the asymptotic achievability-converse pair (random coding + Fano). Together they sandwich the true maximum rate to within a fraction of a bit for most practical channels, even at short blocklengths.
3. The rate-reliability-blocklength tradeoff is the central design tool for URLLC. At the short blocklengths and stringent error probabilities typical of URLLC, the achievable rate can be 20-40% below the Shannon capacity, depending on the channel and SNR. Using the capacity formula for short-blocklength design therefore leads to significant under-provisioning of resources.
4. The AWGN dispersion in nats, $V = \tfrac{1}{2}\bigl(1 - (1+\mathrm{SNR})^{-2}\bigr)$ per real dimension, approaches $1/2$ at high SNR. In fading channels, the dispersion includes an additional term from the fading variance, which dominates at low diversity orders and makes multi-antenna systems essential for URLLC.
5. Multi-user finite-blocklength theory extends the normal approximation to the MAC and the BC. The MAC dispersion is a matrix governing the shrinkage of the capacity region. The sum-rate dispersion equals the point-to-point dispersion at the sum power, and superposition coding remains second-order optimal for the degraded BC.
6. Massive MTC and grant-free access operate in the many-access regime, where the number of users grows with the blocklength. The per-user energy-per-bit requirement grows with the number of active users, fundamentally limiting the scalability of short-packet random access.
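The role of dispersion in point 1 can be made concrete with two classical channels. A BSC with crossover probability $p = 0.11$ and a BEC with erasure probability $\delta = h_2(0.11)$ have identical capacity (about 0.5 bit/use), but the BSC's dispersion is more than three times larger, so its normal-approximation rate backs off from capacity far more at short blocklengths. A minimal sketch using the standard dispersion expressions for these channels (Python's stdlib `statistics.NormalDist` supplies $Q^{-1}$; the specific values $p = 0.11$, $n \in \{200, 2000\}$, $\epsilon = 10^{-3}$ are illustrative choices, not from the text):

```python
from math import log2, sqrt
from statistics import NormalDist

def normal_approx(C: float, V: float, n: int, eps: float) -> float:
    """R*(n, eps) ~ C - sqrt(V/n) * Q^{-1}(eps), rates in bits/channel use."""
    return C - sqrt(V / n) * NormalDist().inv_cdf(1.0 - eps)

# BSC(p): C = 1 - h2(p) bits, V = p(1-p) * log2((1-p)/p)^2 bits^2
p = 0.11
h2 = -p * log2(p) - (1 - p) * log2(1 - p)
C_bsc = 1.0 - h2
V_bsc = p * (1 - p) * log2((1 - p) / p) ** 2

# BEC(delta): C = 1 - delta bits, V = delta(1 - delta) bits^2;
# choose delta so the two channels have exactly the same capacity.
delta = h2
C_bec = 1.0 - delta
V_bec = delta * (1 - delta)

for n in (200, 2000):
    print(f"n={n}: BSC rate {normal_approx(C_bsc, V_bsc, n, 1e-3):.3f}, "
          f"BEC rate {normal_approx(C_bec, V_bec, n, 1e-3):.3f} bits/use")
```

At $n = 200$ the gap between the two rates is large even though the capacities coincide, which is exactly the short-blocklength separation the summary describes.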
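The rate penalty in points 3 and 4 can likewise be checked numerically. The sketch below evaluates the normal approximation for a real-valued AWGN channel, so $C = \tfrac{1}{2}\ln(1+\mathrm{SNR})$ and $V = \tfrac{1}{2}(1 - (1+\mathrm{SNR})^{-2})$ in nats; the SNR of 10 dB and the target $\epsilon = 10^{-5}$ are illustrative assumptions, not values from the text:

```python
from math import log, sqrt
from statistics import NormalDist

def awgn_normal_approx(snr: float, n: int, eps: float) -> float:
    """Normal approximation R*(n, eps) ~ C - sqrt(V/n) * Q^{-1}(eps)
    for the real AWGN channel, in nats per channel use."""
    C = 0.5 * log(1.0 + snr)                   # capacity, nats
    V = 0.5 * (1.0 - 1.0 / (1.0 + snr) ** 2)   # dispersion, nats^2
    q_inv = NormalDist().inv_cdf(1.0 - eps)    # Q^{-1}(eps)
    return C - sqrt(V / n) * q_inv

snr = 10.0                                     # 10 dB, linear scale
C = 0.5 * log(1.0 + snr)
for n in (100, 1000, 10000):
    R = awgn_normal_approx(snr, n, eps=1e-5)
    print(f"n={n:6d}: R = {R:.3f} nats/use ({100 * R / C:.1f}% of capacity)")
```

At $n = 100$ the achievable rate under these assumptions sits roughly a quarter below capacity, consistent with the 20-40% penalty range quoted above, and the gap closes as $\sqrt{1/n}$ with growing blocklength.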
Looking Ahead
The finite-blocklength analysis shows that short codes incur a rate penalty. But what if we could reduce the effective blocklength seen by the receiver by pre-placing content at the user? This is precisely the idea behind coded caching (Chapter 27). By exploiting user caches to create multicast opportunities, coded caching achieves a delivery rate that scales inversely with the cache size, effectively turning memory into bandwidth. The information-theoretic framework of coded caching provides fundamental limits on the memory-rate tradeoff, connecting to the source-channel duality ideas from the first half of this book.