The Nair-El Gamal Outer Bound
The Need for Tight Outer Bounds
Marton's inner bound tells us what rates are achievable, but how do we know whether we are leaving performance on the table? For the degraded BC, the converse was relatively straightforward (Bergmans' argument using the entropy power inequality). For the general BC, the converse is much harder because we cannot appeal to a degradation ordering.
The strongest known outer bound for the general broadcast channel is due to Nair and El Gamal (2007). It tightens the earlier UV outer bound by introducing an auxiliary random variable that captures the correlation structure between the two receivers' observations. When this outer bound matches Marton's inner bound, we have the capacity region.
Definition: The UV Outer Bound
For any achievable rate pair $(R_1, R_2)$ on the two-user BC $p(y_1, y_2 \mid x)$, there exist auxiliary random variables $(U, V)$ such that $(U, V) \to X \to (Y_1, Y_2)$ forms a Markov chain and:

$$R_1 \le I(U; Y_1), \qquad R_2 \le I(V; Y_2), \qquad R_1 + R_2 \le I(U; Y_1) + I(X; Y_2 \mid U).$$
This is essentially the converse of superposition coding: it bounds the rates by what a "layered" encoding could achieve.
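The mutual-information quantities that appear in UV-type bounds are straightforward to evaluate numerically once an auxiliary distribution is fixed. A minimal sketch in Python (the channel parameters and the joint pmf $p(u,x)$ are illustrative choices, not from the text):

```python
import numpy as np

def mutual_information(p_xy):
    """I(X; Y) in bits from a joint pmf given as a 2-D array p_xy[x, y]."""
    px = p_xy.sum(axis=1, keepdims=True)
    py = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / (px * py)[mask])))

def bsc(p):
    """Channel matrix W[x, y] of a binary symmetric channel."""
    return np.array([[1 - p, p], [p, 1 - p]])

# Illustrative setup (an assumption, not from the text): binary U and X,
# with the two receivers seeing BSC components of different quality.
W1, W2 = bsc(0.1), bsc(0.2)
p_ux = np.array([[0.4, 0.1],
                 [0.1, 0.4]])        # joint pmf p(u, x), rows indexed by u

p_uy1 = p_ux @ W1                    # p(u, y1): push p(u, x) through W1
p_uy2 = p_ux @ W2
p_x = p_ux.sum(axis=0)
p_xy2 = p_x[:, None] * W2            # p(x, y2)

I_U_Y1 = mutual_information(p_uy1)
# The Markov chain U -> X -> Y2 gives I(X; Y2 | U) = I(X; Y2) - I(U; Y2)
I_X_Y2_given_U = mutual_information(p_xy2) - mutual_information(p_uy2)
sum_rate_bound = I_U_Y1 + I_X_Y2_given_U   # I(U; Y1) + I(X; Y2 | U)
```

This evaluates the sum-rate expression for one fixed $p(u, x)$; an actual outer-bound computation would optimize over all admissible auxiliary distributions.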
The UV outer bound is not tight in general. It does not capture the gains that binning (Marton coding) can provide for non-degraded channels.
Theorem: The Nair-El Gamal Outer Bound
For any achievable rate pair $(R_1, R_2)$ on the two-user BC $p(y_1, y_2 \mid x)$, there exist auxiliary random variables $(U, V)$ with $(U, V) \to X \to (Y_1, Y_2)$ such that:

$$R_1 \le I(U; Y_1), \qquad R_2 \le I(V; Y_2),$$
$$R_1 + R_2 \le I(U; Y_1) + I(X; Y_2 \mid U),$$
$$R_1 + R_2 \le I(V; Y_2) + I(X; Y_1 \mid V).$$
More precisely, relative to the UV outer bound, the Nair-El Gamal bound introduces an additional sum-rate constraint: $R_1 + R_2 \le I(V; Y_2) + I(X; Y_1 \mid V)$.
This "sum-rate tightening" uses the fact that the total information extractable from $(Y_1, Y_2)$ jointly cannot exceed $I(X; Y_1, Y_2)$, combined with a careful single-letterization argument.
The key insight is that receiver 1's observation $Y_1$ constrains what receiver 2 can learn about the common layer represented by $U$. The term $I(X; Y_2 \mid U)$ captures how much "new" information $Y_2$ provides about $X$ beyond what $U$ already reveals. This is tighter than simply bounding the sum rate by $I(X; Y_1) + I(X; Y_2)$.
Standard converse ingredients
Start with Fano's inequality: $H(M_k \mid Y_k^n) \le n\epsilon_n$ for $k = 1, 2$, where $\epsilon_n \to 0$ as $n \to \infty$. The chain rule gives:

$$nR_k \le I(M_k; Y_k^n) + n\epsilon_n = \sum_{i=1}^{n} I(M_k; Y_{k,i} \mid Y_k^{i-1}) + n\epsilon_n.$$
Introduce time-sharing and auxiliary identification
Define the auxiliary random variables via a time-sharing argument. Let $Q$ be uniform on $\{1, \dots, n\}$ and set $U = (M_1, Y_1^{Q-1}, Q)$ and $V = (M_2, Y_{2,Q+1}^n, Q)$. Then $(U, V) \to X_Q \to (Y_{1,Q}, Y_{2,Q})$ forms a Markov chain (by memorylessness of the channel), and the individual rate bounds follow from standard manipulations.
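With the standard identification $U = (M_1, Y_1^{Q-1}, Q)$ (an assumption consistent with the usual proofs of bounds of this type), the individual bound on $R_1$ single-letterizes as follows; the symmetric chain with $V$ handles $R_2$:

```latex
\begin{aligned}
nR_1 &\le I(M_1; Y_1^n) + n\epsilon_n
      = \sum_{i=1}^{n} I(M_1; Y_{1,i} \mid Y_1^{i-1}) + n\epsilon_n \\
     &\le \sum_{i=1}^{n} I(M_1, Y_1^{i-1}; Y_{1,i}) + n\epsilon_n
      = n\, I(M_1, Y_1^{Q-1}; Y_{1,Q} \mid Q) + n\epsilon_n \\
     &\le n\, I(U; Y_{1,Q}) + n\epsilon_n .
\end{aligned}
```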
The sum-rate tightening
For the tighter sum-rate bound, we write:

$$n(R_1 + R_2) \le I(M_1; Y_1^n) + I(M_2; Y_2^n) + n\epsilon_n.$$

Now use the key inequality:

$$I(M_2; Y_2^n) \le I(M_2; Y_1^n) + I(M_2; Y_2^n \mid Y_1^n).$$

The first term is bounded by the rate at which receiver 1 could decode $M_2$ (creating a MAC-type bound), and the second term captures the "innovation" in $Y_2^n$ beyond $Y_1^n$. Single-letterizing via the auxiliary identification yields the bound.
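The single-letterization step for the sum rate typically invokes the Csiszár sum identity, which exchanges "past" terms of one output sequence against "future" terms of the other; it holds for any random variable $W$:

```latex
\sum_{i=1}^{n} I\!\left(Y_{2,i+1}^{n};\, Y_{1,i} \,\middle|\, W,\, Y_1^{i-1}\right)
= \sum_{i=1}^{n} I\!\left(Y_1^{i-1};\, Y_{2,i} \,\middle|\, W,\, Y_{2,i+1}^{n}\right).
```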
When the bound is tight
The Nair-El Gamal outer bound matches Marton's inner bound for: (a) degraded BCs (by construction), (b) the deterministic BC, (c) certain binary-input BCs where the auxiliary cardinality is small enough for explicit computation. For the general case, a gap may exist; closing this gap is an open problem.
Example: Tightness for the Binary-Input Broadcast Channel
Consider a two-user BC with binary input $\mathcal{X} = \{0, 1\}$ and arbitrary finite output alphabets $\mathcal{Y}_1, \mathcal{Y}_2$. Show that the Nair-El Gamal outer bound matches Marton's inner bound, establishing the capacity region.
Binary input simplification
With $|\mathcal{X}| = 2$, standard cardinality bounds restrict the auxiliaries $U$ and $V$ in Marton's bound to small alphabets. This makes the optimization over auxiliary distributions tractable.
Matching inner and outer bounds
Nair and El Gamal showed that for binary-input BCs, the additional sum-rate constraint in the outer bound exactly matches the Marton binning penalty $I(U; V)$. The proof uses Mrs. Gerber's Lemma and properties specific to the binary entropy function.
Result
The capacity region of any binary-input broadcast channel is the set of rate pairs $(R_1, R_2)$ satisfying

$$R_1 \le I(X; Y_1 \mid U), \qquad R_2 \le I(U; Y_2), \qquad R_1 + R_2 \le I(X; Y_1)$$

for some pmf $p(u, x)$, together with the corresponding region with the roles of the two receivers swapped. This is exactly the superposition coding region: for binary inputs, binning provides no additional benefit over superposition.
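For intuition, the superposition region is easy to trace numerically in the classic degraded special case of a pair of binary symmetric channels, where the Bergmans boundary is $R_1 = h(\alpha \star p_1) - h(p_1)$, $R_2 = 1 - h(\alpha \star p_2)$ as $\alpha$ sweeps $[0, 1/2]$. A sketch (the crossover probabilities are illustrative choices):

```python
import numpy as np

def h(p):
    """Binary entropy in bits, elementwise."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def conv(a, p):
    """Binary convolution: a * p = a(1-p) + (1-a)p."""
    return a * (1 - p) + (1 - a) * p

# Illustrative degraded binary symmetric BC: receiver 1 sees BSC(p1),
# receiver 2 sees BSC(p2) with p1 < p2, so receiver 2 is the weaker user.
p1, p2 = 0.1, 0.2
alphas = np.linspace(0.0, 0.5, 101)      # superposition "cloud" parameter
R1 = h(conv(alphas, p1)) - h(p1)         # private-layer rate to receiver 1
R2 = 1 - h(conv(alphas, p2))             # cloud-center rate to receiver 2
# Endpoints: alpha = 0 gives (0, 1 - h(p2)); alpha = 1/2 gives (1 - h(p1), 0)
```

Sweeping `alphas` traces the entire boundary of the region; the two endpoints are the single-user capacities of the component channels.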
Historical Note: The Quest for the BC Capacity Region
1972-2007. The broadcast channel capacity problem has a fascinating history. Cover introduced the problem in 1972 and conjectured that superposition coding is optimal for the general BC; this conjecture turned out to be false for channels with more than two inputs. Marton's 1979 inner bound showed that binning can strictly improve on superposition. El Gamal and van der Meulen, and later Nair and El Gamal (2007), developed increasingly tight outer bounds.
Despite nearly 50 years of effort, the capacity region of the general discrete memoryless broadcast channel remains unknown. The problem is listed among the major open problems in network information theory by El Gamal and Kim. The Gaussian MIMO case, however, was resolved by Weingarten, Steinberg, and Shamai (2006), building on the work of Caire and Shamai on dirty-paper coding.
Common Mistake: The Outer Bound Is Not a Converse for Every Channel
Mistake:
Claiming that the Nair-El Gamal outer bound provides a tight converse for all broadcast channels.
Correction:
The outer bound is the best known but is not proven tight for all channels. For channels with more than two inputs, there may be a gap between Marton's inner bound and the Nair-El Gamal outer bound. The gap, if it exists, is not yet understood.
Mrs. Gerber's Lemma
An information-theoretic inequality relating the entropy of the output of a binary symmetric channel to the entropy of its input. Formally: if $X \in \{0, 1\}$ and $Y = X \oplus Z$ with $Z \sim \mathrm{Bern}(p)$ independent of $(U, X)$, then $H(Y \mid U) \ge h(h^{-1}(H(X \mid U)) \star p)$, where $h$ is the binary entropy function and $\star$ denotes binary convolution, $a \star p = a(1-p) + (1-a)p$.
Related: Binary entropy function, Data Processing Inequality
Quick Check
For a degraded broadcast channel, the Nair-El Gamal outer bound reduces to which known result?
The superposition coding capacity region (Bergmans-Gallager)
The MAC capacity region with time-sharing
Shannon's point-to-point capacity
For a degraded BC, the Markov chain $X \to Y_1 \to Y_2$ allows the outer bound to be tightened to match superposition coding exactly, recovering the known capacity region.
Key Takeaway
The Nair-El Gamal outer bound is the tightest known general converse for the broadcast channel. It matches Marton's inner bound for degraded, deterministic, and binary-input BCs, establishing the capacity region in these cases. For the general discrete memoryless BC, the capacity region remains open: it is one of the few fundamental single-hop channels for which this is the case.