Chapter Summary
Key Points
1. The relay channel model. The relay channel is the simplest network where cooperation helps: a source communicates to a destination with the help of a relay, subject to a strictly causal constraint (the relay's input at time i depends only on the symbols it has received up to time i-1). Despite over fifty years of research, the capacity of the general relay channel remains an open problem, known exactly only for the degraded case.
2. The cut-set bound as the universal benchmark. The cut-set bound provides the tightest general upper bound. It arises from two bottlenecks: the broadcast cut (source transmitting to relay and destination together) and the multiple-access cut (source and relay transmitting jointly to the destination). For many practical channels, the gap between this bound and the best achievable rates is small.
3. Decode-and-forward: optimal when the relay has a good view. In DF the relay fully decodes the message and re-encodes it, implemented via block-Markov encoding and backward decoding; the resulting rate is min{ I(X, X1; Y), I(X; Y1 | X1) }. DF achieves capacity for the degraded relay channel. The key limitation: the rate is bottlenecked by the weakest hop, the source-to-relay link.
4. Compress-and-forward: exploiting noisy observations as side information. CF does not require the relay to decode: it compresses its noisy observation using Wyner-Ziv coding (with the destination's own reception as the side information) and forwards the compressed version. CF excels when the source-relay link is weak but the relay-destination link is strong. The achievable rate depends on the compression-forwarding trade-off.
5. Compute-and-forward: structured codes for function decoding. Using nested lattice codes, the relay can decode integer linear combinations of source codewords, bridging channel coding and network coding. CaF achieves high rates when the channel coefficients align with integer vectors, connecting information theory to number theory.
6. The diamond network and multi-relay extensions. The diamond network (source → two relays → destination) introduces four cuts and foreshadows noisy network coding. Quantize-map-and-forward achieves within a constant gap of the cut-set bound, showing that simple compression-based strategies are near-optimal for general relay networks.
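The interplay among the cut-set bound, DF, and CF in points 2-4 can be made concrete for the full-duplex Gaussian relay channel, anticipating the explicit rates derived in Chapter 23. The sketch below uses the standard formulas (coherent DF via a grid search over the source-relay correlation, and CF with Gaussian quantization); the SNR values in the usage line are illustrative, not taken from the chapter.

```python
import math

def C(x):
    # Gaussian capacity function, rate in bits per channel use
    return 0.5 * math.log2(1 + x)

def relay_rates(S_sd, S_sr, S_rd, steps=1000):
    """Cut-set bound, DF rate, and CF rate for the full-duplex Gaussian
    relay channel with SNRs S_sd (source-dest), S_sr (source-relay),
    S_rd (relay-dest)."""
    cutset = df = 0.0
    for i in range(steps + 1):
        rho = i / steps  # correlation between source and relay inputs
        mac = C(S_sd + S_rd + 2 * rho * math.sqrt(S_sd * S_rd))
        # cut-set: min of broadcast cut and multiple-access cut
        cutset = max(cutset, min(C((1 - rho**2) * (S_sd + S_sr)), mac))
        # DF: the broadcast cut is replaced by the source-relay hop
        df = max(df, min(C((1 - rho**2) * S_sr), mac))
    # CF: quantization noise sized to satisfy the Wyner-Ziv constraint
    sigma2 = (1 + S_sd + S_sr) / S_rd
    cf = C(S_sd + S_sr / (1 + sigma2))
    return cutset, df, cf
```

For example, `relay_rates(1.0, 4.0, 4.0)` returns roughly (1.29, 1.16, 0.92), showing DF within the cut-set bound and CF trailing here because the source-relay link is strong, exactly the regime where point 3 predicts DF should win.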
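Point 5's dependence on integer alignment can be illustrated with the Nazer-Gastpar computation rate: a relay observing y = h·x + z with per-user power P and unit-variance noise can decode the integer combination with coefficient vector a at rate ½ log⁺( (‖a‖² − P(h·a)²/(1 + P‖h‖²))⁻¹ ). A sketch that searches small integer vectors for the best combination (the channel gains and power in the usage line are illustrative):

```python
import itertools
import math

def comp_rate(h, a, P):
    # Nazer-Gastpar computation rate for decoding the integer combination a
    # at a relay observing y = h.x + z (unit-variance noise, power P per user).
    norm_a = sum(ai * ai for ai in a)
    norm_h = sum(hi * hi for hi in h)
    dot = sum(hi * ai for hi, ai in zip(h, a))
    denom = norm_a - P * dot * dot / (1 + P * norm_h)  # > 0 by Cauchy-Schwarz
    return max(0.0, 0.5 * math.log2(1.0 / denom))

def best_combination(h, P, search=3):
    # Exhaustive search over small integer coefficient vectors.
    best_rate, best_a = 0.0, None
    for a in itertools.product(range(-search, search + 1), repeat=len(h)):
        if not any(a):
            continue  # skip the all-zero vector
        r = comp_rate(h, a, P)
        if r > best_rate:
            best_rate, best_a = r, a
    return best_rate, best_a
```

For gains near the integer point (1, 1), e.g. `best_combination((1.2, 0.8), 10.0)`, the search picks a = ±(1, 1): the relay does best decoding the sum rather than either message alone, which is the alignment effect described in point 5.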
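The four cuts of point 6 are easiest to see in a wireline simplification of the diamond network, where each of the four links is an independent noiseless bit pipe; the cut-set bound is then just the cheapest of the four source-side cuts (the link capacities in the usage line are illustrative):

```python
def diamond_cutset(c_s1, c_s2, c_1d, c_2d):
    """Min-cut of the wireline diamond network: source -> relays r1, r2 -> dest,
    with independent noiseless links of the given capacities (bits/use)."""
    cuts = [
        c_s1 + c_s2,  # cut {s}: both source links cross
        c_s2 + c_1d,  # cut {s, r1}
        c_s1 + c_2d,  # cut {s, r2}
        c_1d + c_2d,  # cut {s, r1, r2}: both destination links cross
    ]
    return min(cuts)
```

For instance, `diamond_cutset(1, 2, 2, 1)` is limited to 2 by the cut {s, r2}, illustrating how the binding cut need not be the pure broadcast or pure multiple-access side.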
Looking Ahead
Chapter 23 specializes these results to the Gaussian relay channel, derives explicit DF and CF rates, studies capacity scaling laws for large networks (Gupta-Kumar, Özgür-Lévêque-Tse), and introduces noisy network coding as the general-purpose relay strategy that achieves within a constant gap of the cut-set bound for arbitrary Gaussian networks.