Chapter Summary

Key Points

  1. The Gaussian relay channel: capacity to within 0.5 bits. For the Gaussian relay channel $Y = X + X_r + Z$, $Y_r = X + Z_r$, DF achieves capacity when $N_r \leq N$ (degraded case) via coherent combining. CF achieves within 0.5 bits of the cut-set bound for all parameter regimes, a universal constant-gap result. The gap comes from independent (vs. correlated) inputs, costing at most $\frac{1}{2}\log 2 = 0.5$ bits.

  2. DF vs. CF: it depends on the link qualities. DF excels when the source-relay link is strong ($N_r$ small); CF excels when the relay-destination link is strong. Neither strategy uniformly dominates, so practical systems should select adaptively based on measured link quality.

  3. Gupta-Kumar: multi-hop routing does not scale. For $n$ nodes in a unit square with multi-hop routing, the per-node throughput scales as $\Theta(1/\sqrt{n\log n})$; it vanishes as the network grows. Aggregate interference is the fundamental bottleneck.

  4. Hierarchical cooperation achieves linear scaling. Özgür, Lévêque, and Tse showed that cooperative communication via virtual MIMO arrays achieves $\Theta(n)$ total throughput for path-loss exponent $\alpha = 2$. Cooperation converts interference into useful signal.

  5. Noisy network coding: a universal relay strategy. NNC generalizes CF to arbitrary networks: each relay compresses its observation independently, and the destination decodes jointly. NNC achieves within $(K-1)/2$ bits of the cut-set bound for any $K$-node Gaussian network, a constant gap independent of the channel parameters.

Looking Ahead

Chapter 24 studies the role of feedback and interaction in multiuser channels. We will see that feedback can enlarge the MAC capacity region (unlike the point-to-point case) and improve BC capacity. The techniques — iterative refinement, retransmission strategies, and the two-way channel — complete our treatment of cooperative communication.