The General Relay Network

Why Relay Networks Are Hard

The relay channel, a three-node network where a relay helps communication between source and destination, was introduced by van der Meulen (1971) and has been studied intensively since Cover and El Gamal (1979). For the simplest case (one relay), the capacity is known only in special cases (degraded relay, deterministic relay). For general relay networks with multiple relays, the gap between the best achievable rates and the best outer bounds remains significant.

The difficulty is fundamental: in a relay network, each node must simultaneously transmit and receive, creating complex dependencies between signals. The cut-set bound provides an elegant outer bound, but it is not always tight because it ignores the causality constraints that distinguish relaying from full cooperation.

Definition:

The Cut-Set Bound

For a general network with nodes $\mathcal{V}$ and channel $P_{Y_\mathcal{V} \mid X_\mathcal{V}}$, the rate from source $s$ to destination $d$ is bounded by:

$$R \leq \max_{P_{X_\mathcal{V}}} \min_{\Omega:\, s \in \Omega,\, d \in \Omega^c} I(X_\Omega; Y_{\Omega^c} \mid X_{\Omega^c})$$

where the minimum is over all cuts separating $s$ from $d$, and the maximum is over all joint input distributions. This is the information-theoretic analog of the max-flow min-cut theorem.

The cut-set bound assumes that all nodes on the source side of each cut cooperate fully, ignoring the fact that relay nodes can only transmit functions of their past received signals (causality). This makes the cut-set bound generally loose for relay networks.
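For a small network, the minimization over cuts can be enumerated directly. The sketch below is illustrative, not from any specific library: the `cut_value` callback is a stand-in for evaluating $I(X_\Omega; Y_{\Omega^c} \mid X_{\Omega^c})$ at a fixed input distribution, and the toy example just counts edges crossing each cut.

```python
from itertools import combinations

def min_cut(nodes, s, d, cut_value):
    """Minimize cut_value(omega) over all cuts with s inside and d outside.

    cut_value stands in for the mutual-information term
    I(X_Omega; Y_{Omega^c} | X_{Omega^c}) at a fixed input distribution.
    """
    others = [v for v in nodes if v not in (s, d)]
    best, best_cut = float("inf"), None
    for r in range(len(others) + 1):
        for subset in combinations(others, r):
            omega = frozenset({s, *subset})  # source side of the cut
            val = cut_value(omega)
            if val < best:
                best, best_cut = val, omega
    return best, best_cut

# Toy check: cut value = number of directed edges crossing the cut,
# for the line network s -> r1 -> r2 -> d plus a direct edge s -> d.
edges = [("s", "r1"), ("r1", "r2"), ("r2", "d"), ("s", "d")]
crossing = lambda omega: sum(u in omega and v not in omega for u, v in edges)
print(min_cut(["s", "r1", "r2", "d"], "s", "d", crossing))  # min cut has 2 edges
```

The same double loop over subsets is exactly the structure of the bound: fix a distribution, evaluate every cut, take the minimum, then (not shown) maximize over distributions.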

Theorem: The Gap for the Gaussian Relay Channel

For the Gaussian relay channel $Y_R = X_S + Z_R$, $Y_D = X_S + X_R + Z_D$ (with power constraints $P_S, P_R$ and noise variances $\sigma_R^2, \sigma_D^2$):

  • Cut-set bound: C≀min⁑ ⁣{12log⁑ ⁣(1+PSΟƒR2+PSΟƒD2),12log⁑ ⁣(1+(PS+PR)2ΟƒD2)}C \leq \min\!\left\{\frac{1}{2}\log\!\left(1 + \frac{P_S}{\sigma^2_{R}} + \frac{P_S}{\sigma^2_{D}}\right), \frac{1}{2}\log\!\left(1 + \frac{(\sqrt{P_S} + \sqrt{P_R})^2}{\sigma^2_{D}}\right)\right\}
  • Compress-forward achieves: RCFβ‰₯Cβˆ’12R_{\text{CF}} \geq C - \frac{1}{2} bit for all parameters. The gap is at most 1/21/2 bit, but the exact capacity is unknown in general.

The cut-set bound has two terms: the first is the cut between $\{S\}$ and $\{R, D\}$ (the broadcast cut out of the source), the second is the cut between $\{S, R\}$ and $\{D\}$ (source and relay cooperating toward the destination). Compress-forward is nearly optimal because the relay's quantization noise costs at most $1/2$ bit of rate. Decode-forward can be tighter when the source-relay link is strong.
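The comparison can be made concrete with the standard Gaussian evaluations of these rates (decode-forward and compress-forward with Gaussian codebooks and Gaussian Wyner-Ziv quantization; function names are illustrative). Note that this code maximizes the cut-set bound jointly over the source-relay correlation $\rho$, which is slightly tighter than the bulleted bound above, where each term is maximized separately:

```python
import math

def C(x):
    """Gaussian capacity function, (1/2) log2(1 + x), in bits."""
    return 0.5 * math.log2(1 + x)

def cutset_bound(Ps, Pr, sR2, sD2, steps=2000):
    """Cut-set bound, maximized over source-relay correlation rho."""
    best = 0.0
    for i in range(steps + 1):
        rho = i / steps
        bc = C((1 - rho**2) * Ps * (1 / sR2 + 1 / sD2))          # cut {S} vs {R, D}
        mac = C((Ps + Pr + 2 * rho * math.sqrt(Ps * Pr)) / sD2)  # cut {S, R} vs {D}
        best = max(best, min(bc, mac))
    return best

def decode_forward(Ps, Pr, sR2, sD2, steps=2000):
    """Decode-forward rate, maximized over source-relay correlation rho."""
    best = 0.0
    for i in range(steps + 1):
        rho = i / steps
        r1 = C((1 - rho**2) * Ps / sR2)                          # relay must decode
        r2 = C((Ps + Pr + 2 * rho * math.sqrt(Ps * Pr)) / sD2)   # coherent sum at D
        best = max(best, min(r1, r2))
    return best

def compress_forward(Ps, Pr, sR2, sD2):
    """Compress-forward with Gaussian quantization and independent inputs."""
    # Quantization noise chosen so the relay's bin index is decodable at D
    sQ2 = (Ps * sD2 + sR2 * (Ps + sD2)) / Pr
    return C(Ps / sD2 + Ps / (sR2 + sQ2))

# Example: equal powers and noise variances
print(cutset_bound(1, 1, 1, 1))      # ~0.792 bits
print(decode_forward(1, 1, 1, 1))    # ~0.5 bits
print(compress_forward(1, 1, 1, 1))  # ~0.585 bits: within 1/2 bit of the bound
```

For these symmetric parameters compress-forward wins; lowering the relay's noise (say $\sigma_R^2 = 0.2$) makes the source-relay link strong enough that decode-forward pulls ahead, matching the remark above.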

Definition:

Noisy Network Coding (NNC)

Noisy network coding (Lim, Kim, El Gamal, Chung, 2011) is a relaying strategy for general multi-source multi-relay networks. Each relay quantizes its received signal and maps it to a channel codeword. The destination jointly decodes the source message and all relay quantization indices. NNC achieves:

$$R \geq \max \min_{\Omega:\, s \in \Omega,\, d \in \Omega^c} \left[I(X_\Omega; \hat{Y}_{\Omega^c}, Y_D \mid X_{\Omega^c}) - I(Y_\Omega; \hat{Y}_\Omega \mid X_\mathcal{V}, \hat{Y}_{\Omega^c}, Y_D)\right]$$

where $\hat{Y}_j$ is relay $j$'s quantized observation, the maximization is over independent inputs and per-relay quantizers, and the subtracted term is the rate cost of quantization at the relays on the source side of the cut.

NNC is the multi-relay generalization of compress-forward. It achieves within a constant gap of the cut-set bound for many network topologies, but the known gap bounds grow with the number of network nodes, which means NNC is near-optimal for small networks but may leave a significant gap for large ones.
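For a single relay, the NNC bound can be evaluated numerically by scanning the variance of the Gaussian quantization noise, and it recovers the compress-forward rate, as expected. This is a sketch assuming independent Gaussian inputs and $\hat{Y}_R = Y_R + Q$ with $Q \sim \mathcal{N}(0, \sigma_Q^2)$ (function name illustrative):

```python
import math

def nnc_single_relay(Ps, Pr, sR2, sD2, grid=10_000):
    """NNC bound for one Gaussian relay, scanning quantization noise sQ2.

    Cut {S}:    I(X_S; Yhat_R, Y_D | X_R)
    Cut {S,R}:  I(X_S, X_R; Y_D) - I(Y_R; Yhat_R | X_S, X_R, Y_D)
    """
    C = lambda x: 0.5 * math.log2(1 + x)  # bits
    best = 0.0
    for i in range(1, grid + 1):
        sQ2 = 10.0 * i / grid  # crude scan over quantization-noise variance
        cut_s = C(Ps / sD2 + Ps / (sR2 + sQ2))
        cut_sr = C((Ps + Pr) / sD2) - 0.5 * math.log2(1 + sR2 / sQ2)
        best = max(best, min(cut_s, cut_sr))
    return best

# With unit powers and noise variances this matches the compress-forward
# rate of about 0.585 bits (optimum near sQ2 = 3, where the two cuts cross).
print(nnc_single_relay(1, 1, 1, 1))
```

The first term falls and the second rises as $\sigma_Q^2$ grows (coarser quantization hurts the destination's side information but is cheaper to convey), so the max-min sits where the two cut expressions cross.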

Relay Channel: Achievable Rates vs. Cut-Set Bound

Compare decode-forward, compress-forward, and the cut-set bound for the Gaussian relay channel as a function of relay position and power.


Common Mistake: The Cut-Set Bound Is Not Always Tight

Mistake:

Assuming the cut-set bound is the capacity of a relay network because it matches an achievable scheme in some special cases.

Correction:

The cut-set bound is tight for the degraded relay channel and for networks where the relay can fully decode the source message. For non-degraded channels, especially those with interference between relay and direct paths, the cut-set bound can be strictly loose. The gap can be arbitrarily large for certain discrete memoryless relay channels. Always state whether the cut-set bound is known to be tight for the specific channel model.

Quick Check

Which relaying strategy is near-optimal (within 1/2 bit) for the general Gaussian relay channel?

Decode-forward

Amplify-forward

Compress-forward

Time-sharing between decode-forward and direct transmission

Cut-Set Bound

An upper bound on achievable rates in a network, obtained by considering all cuts separating the source from the destination and bounding the information flow across each cut. The information-theoretic analog of the max-flow min-cut bound in graphs.

Related: The Cut-Set Bound

Noisy Network Coding (NNC)

A multi-relay coding strategy where each relay quantizes and forwards its observation. The destination jointly decodes all quantization indices and the source message. Generalizes compress-forward to multi-relay networks.

Related: Noisy Network Coding (NNC)