Non-Shannon Inequalities
Beyond Shannon's Inequalities
All the information inequalities we have used in this book (the chain rule, the data processing inequality, submodularity of entropy) follow from a single source: the non-negativity of conditional mutual information (or, equivalently, the non-negativity of KL divergence). These are called Shannon-type (or polymatroidal) inequalities. For three or fewer random variables, Shannon-type inequalities characterize all valid entropy relationships.
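As a quick numerical illustration (a minimal sketch assuming NumPy; the distribution is randomly drawn), the check below confirms that submodularity of entropy is nothing but $I(X;Y \mid Z) \ge 0$ written in joint entropies:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array, ignoring zeros."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Random joint distribution over (X, Y, Z), each variable binary.
rng = np.random.default_rng(0)
p = rng.random((2, 2, 2))
p /= p.sum()

H_XYZ = entropy(p.ravel())
H_XZ = entropy(p.sum(axis=1).ravel())   # marginalize out Y
H_YZ = entropy(p.sum(axis=0).ravel())   # marginalize out X
H_Z = entropy(p.sum(axis=(0, 1)))

# Submodularity on the sets {X,Z} and {Y,Z} is exactly I(X;Y|Z) >= 0:
# H(X,Z) + H(Y,Z) >= H(X,Y,Z) + H(Z).
I_XY_given_Z = H_XZ + H_YZ - H_XYZ - H_Z
print(f"I(X;Y|Z) = {I_XY_given_Z:.4f} >= 0")
```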
But for four or more random variables, there exist valid inequalities that cannot be derived from Shannon's axioms. These are non-Shannon inequalities, and their existence has profound implications for network information theory and network coding.
Definition: The Entropy Region
For random variables $X_1, \dots, X_n$, the entropy vector is the $(2^n - 1)$-dimensional vector $(H(X_A) : \emptyset \neq A \subseteq \{1, \dots, n\})$ containing the joint entropy of every non-empty subset. The entropy region $\bar{\Gamma}_n^*$ is the closure of the set of entropy vectors achievable by some joint distribution on $(X_1, \dots, X_n)$. The Shannon outer bound $\Gamma_n$ is the cone defined by all Shannon-type inequalities (elemental inequalities plus monotonicity and submodularity).
For $n \le 3$, $\bar{\Gamma}_n^* = \Gamma_n$: Shannon-type inequalities are sufficient. For $n \ge 4$, $\bar{\Gamma}_n^* \subsetneq \Gamma_n$: non-Shannon inequalities carve out additional constraints that the entropy region must satisfy.
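To make the definition concrete, here is a minimal sketch (the helper names are ours) that computes the $(2^n - 1)$-dimensional entropy vector of a joint pmf stored as an $n$-axis NumPy array, one coordinate per non-empty subset of variables:

```python
import numpy as np
from itertools import combinations

def entropy(p):
    """Shannon entropy in bits of a probability array."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def entropy_vector(p):
    """Map each non-empty subset A of axes of p to the joint entropy H(X_A)."""
    n = p.ndim
    h = {}
    for k in range(1, n + 1):
        for A in combinations(range(n), k):
            other = tuple(i for i in range(n) if i not in A)
            marginal = p.sum(axis=other) if other else p
            h[A] = entropy(marginal.ravel())
    return h  # 2**n - 1 entries

rng = np.random.default_rng(1)
p = rng.random((2, 2, 2, 2))
p /= p.sum()
h = entropy_vector(p)
print(len(h), "coordinates; H(X1,X2,X3,X4) =", round(h[(0, 1, 2, 3)], 4))
```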
Theorem: The Zhang-Yeung Inequality
For four random variables $X_1, X_2, X_3, X_4$, the following inequality holds:
$$2I(X_3; X_4) \le I(X_1; X_2) + I(X_1; X_3, X_4) + 3I(X_3; X_4 \mid X_1) + I(X_3; X_4 \mid X_2)$$
This inequality is not implied by any combination of Shannon-type inequalities. It was the first non-Shannon inequality discovered (Zhang and Yeung, 1998).
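The inequality is easy to probe empirically. The sketch below (a numerical sanity check, not a proof; helper names are ours) evaluates the Zhang-Yeung slack, right-hand side minus left-hand side, on random four-variable distributions and confirms it never goes negative:

```python
import numpy as np

def H(p, keep):
    """Joint entropy in bits of the marginal of p on the axes in `keep`."""
    other = tuple(i for i in range(p.ndim) if i not in keep)
    m = (p.sum(axis=other) if other else p).ravel()
    m = m[m > 0]
    return float(-np.sum(m * np.log2(m)))

def zy_slack(p):
    """RHS minus LHS of the Zhang-Yeung inequality for axes (X1,X2,X3,X4)."""
    I_34 = H(p, (2,)) + H(p, (3,)) - H(p, (2, 3))
    I_12 = H(p, (0,)) + H(p, (1,)) - H(p, (0, 1))
    I_1_34 = H(p, (0,)) + H(p, (2, 3)) - H(p, (0, 2, 3))
    I_34_g1 = H(p, (0, 2)) + H(p, (0, 3)) - H(p, (0, 2, 3)) - H(p, (0,))
    I_34_g2 = H(p, (1, 2)) + H(p, (1, 3)) - H(p, (1, 2, 3)) - H(p, (1,))
    return I_12 + I_1_34 + 3 * I_34_g1 + I_34_g2 - 2 * I_34

rng = np.random.default_rng(2)
worst = min(zy_slack(rng.dirichlet(np.ones(16)).reshape(2, 2, 2, 2))
            for _ in range(1000))
print(f"smallest Zhang-Yeung slack over 1000 random pmfs: {worst:.6f}")
```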
The Zhang-Yeung inequality reveals that the entropy structure of four or more random variables is richer than what Shannon's axioms capture. Intuitively, there are correlations among four variables that cannot be decomposed into pairwise and conditional pairwise relationships. This is analogous to how a tetrahedron has geometric properties not determined by its faces alone.
Why linear programming is not enough
Every Shannon-type inequality can be verified mechanically by linear programming: one writes the right-hand side minus the left-hand side as a conic combination of conditional mutual informations (which are non-negative). For the Zhang-Yeung inequality no such combination exists, so its proof needs a different device: Zhang and Yeung adjoined auxiliary random variables (a conditionally independent copy of some of the original variables), applied Shannon-type inequalities to the enlarged collection, and projected back onto the entropies of $X_1, X_2, X_3, X_4$.
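This failure can be exhibited computationally. The sketch below (assuming SciPy's `linprog`; the subset-bitmask encoding is our own) minimizes the Zhang-Yeung slack over the Shannon cone for $n = 4$, intersected with a box to keep the LP bounded. The minimum comes out strictly negative, so no conic combination of Shannon-type inequalities can produce the inequality, and the minimizer is precisely the kind of violating vector described in the verification step below.

```python
import numpy as np
from itertools import product
from scipy.optimize import linprog

N = 4
SETS = range(1, 2 ** N)                    # non-empty subsets as bitmasks
idx = {A: i for i, A in enumerate(SETS)}   # coordinate of each subset

def coef(*terms):
    """Length-15 coefficient vector from (subset_bitmask, weight) pairs."""
    v = np.zeros(len(idx))
    for A, w in terms:
        if A:                              # H(empty set) = 0: no coordinate
            v[idx[A]] += w
    return v

# Shannon cone G h >= 0: monotonicity and submodularity over all pairs.
rows = []
for A, B in product(SETS, SETS):
    if A != B and A & B == A:              # A a proper subset of B
        rows.append(coef((B, 1), (A, -1)))
    if A < B and A & B != A and A & B != B:
        rows.append(coef((A, 1), (B, 1), (A | B, -1), (A & B, -1)))
G = np.array(rows)

# Zhang-Yeung slack b.h (RHS - LHS) in joint entropies.
# Bitmasks: X1 = 1, X2 = 2, X3 = 4, X4 = 8.
b = (coef((1, 1), (2, 1), (3, -1))                    # I(X1;X2)
     + coef((1, 1), (12, 1), (13, -1))                # I(X1;X3,X4)
     + 3 * coef((5, 1), (9, 1), (13, -1), (1, -1))    # 3 I(X3;X4|X1)
     + coef((6, 1), (10, 1), (14, -1), (2, -1))       # I(X3;X4|X2)
     - 2 * coef((4, 1), (8, 1), (12, -1)))            # -2 I(X3;X4)

res = linprog(c=b, A_ub=-G, b_ub=np.zeros(len(G)), bounds=(0, 1))
print("min Zhang-Yeung slack over the Shannon cone:", res.fun)
# Strictly negative: a polymatroid vector violates Zhang-Yeung, so the
# inequality is not implied by Shannon-type inequalities.
```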
Non-Shannon verification
To verify that this is a non-Shannon inequality, one constructs a vector that satisfies all Shannon-type inequalities (a point of $\Gamma_4$) but violates the Zhang-Yeung inequality; such a vector cannot be the entropy vector of any joint distribution. Its existence proves that $\bar{\Gamma}_4^* \subsetneq \Gamma_4$.
Definition: Polymatroidal Region and Shannon-Type Inequalities
The polymatroidal region $\Gamma_n$ is the set of all vectors $h \in \mathbb{R}^{2^n - 1}$, indexed by the non-empty subsets of $\{1, \dots, n\}$ with the convention $h_\emptyset = 0$, satisfying:
- $h_\emptyset = 0$ (normalization)
- $h_A \le h_B$ for $A \subseteq B$ (monotonicity)
- $h_A + h_B \ge h_{A \cup B} + h_{A \cap B}$ for all $A, B$ (submodularity)

These are the Shannon-type inequalities. They define a polyhedral cone. An inequality is non-Shannon if it is valid for $\Gamma_n^*$ but not implied by the polymatroidal constraints.
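As a concrete companion to the definition, the sketch below (function and variable names are illustrative) checks the polymatroid axioms for a set function given as a dict keyed by frozensets:

```python
from itertools import combinations

def is_polymatroid(h, n, tol=1e-9):
    """Check the Shannon-type axioms for a set function h mapping each
    non-empty frozenset A of {0,...,n-1} to h[A], with h(empty) = 0."""
    def H(A):
        return h[A] if A else 0.0
    subsets = [frozenset(s) for k in range(1, n + 1)
               for s in combinations(range(n), k)]
    for A in subsets:
        if H(A) < -tol:                                   # non-negativity
            return False
        for B in subsets:
            if A < B and H(A) > H(B) + tol:               # monotonicity
                return False
            if H(A) + H(B) < H(A | B) + H(A & B) - tol:   # submodularity
                return False
    return True

# The rank function of the uniform matroid U_{2,3} is a polymatroid.
h = {frozenset(s): min(len(s), 2)
     for k in range(1, 4) for s in combinations(range(3), k)}
print(is_polymatroid(h, 3))   # True
```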
Implications for Network Coding and Capacity
Non-Shannon inequalities have direct implications for:
- Network coding capacity: The capacity region of a network coding problem is characterized by a projection of $\bar{\Gamma}_n^*$ onto the rate variables. Since $\bar{\Gamma}_n^* \subsetneq \Gamma_n$ for $n \ge 4$, outer bounds based on Shannon-type inequalities alone may be strictly loose. Tighter outer bounds using non-Shannon inequalities have been found for specific networks.
- Distributed source coding: The rate region for distributed lossy compression depends on the entropy structure of the sources. Non-Shannon inequalities can tighten the outer bound.
- Secret key agreement: The secret key capacity for multiple terminals involves optimizing over the entropy region. Non-Shannon constraints can change the capacity.
The difficulty is that infinitely many non-Shannon inequalities exist for $n \ge 4$, and no finite characterization of $\bar{\Gamma}_n^*$ is known for any $n \ge 4$. This means that proving tight outer bounds for multi-terminal problems is fundamentally harder than in the two-user and three-user cases.
Example: A Non-Shannon Inequality Tightens a Network Coding Bound
Consider a network coding problem with four source-sink pairs, where the Shannon outer bound on the capacity region is computed from cut constraints and Shannon-type inequalities. Show that the Zhang-Yeung inequality can strictly tighten this bound.
Setup
Consider a butterfly-like network with four sources and edges with unit capacity. The Shannon outer bound allows certain rate tuples based on cut constraints and Shannon-type inequalities on the auxiliary variables.
Apply Zhang-Yeung
Treating the random variables carried on the edges as the four variables of the Zhang-Yeung inequality yields an additional linear constraint relating the achievable rates to the edge capacities. This constraint is not implied by any cut bound or Shannon-type inequality.
Tightening
The resulting outer bound excludes rate tuples that the Shannon bound allows but that are information-theoretically infeasible. The gap is small but non-zero, demonstrating that non-Shannon inequalities are operationally relevant for network capacity.
Historical Note: The Discovery of Non-Shannon Inequalities
1998–present. For decades after Shannon's work, it was widely believed (or at least hoped) that the basic information inequalities, namely non-negativity of entropy, mutual information, and conditional mutual information, were sufficient to characterize all entropy relationships. Zhang and Yeung's 1998 discovery that this is false for $n = 4$ was a conceptual earthquake. It showed that the entropy region has a fundamentally more complex structure than the polyhedral cone defined by Shannon's axioms. Since then, infinitely many non-Shannon inequalities have been discovered (Dougherty, Freiling, and Zeger; Matúš), but a complete characterization of $\bar{\Gamma}_n^*$ for any $n \ge 4$ remains one of the deepest open problems in information theory.
Common Mistake: Shannon Inequalities Are NOT Sufficient for Four or More Variables
Mistake:
Proving an outer bound for a network with four or more auxiliary random variables using only Shannon-type inequalities and claiming it is tight.
Correction:
For problems involving four or more random variables, Shannon-type inequalities may not characterize the entropy region completely. The outer bound may be strictly loose. To claim tightness, either prove a matching inner bound, or explicitly verify that non-Shannon inequalities do not further constrain the region. In practice, for many problems of interest, Shannon-type inequalities are sufficient, but this must be verified, not assumed.
Entropy Region
The set of all achievable entropy vectors for $n$ random variables, characterizing all possible relationships among the joint entropies of all subsets.
Related: The Entropy Region
Non-Shannon Inequality
An information inequality that holds for all joint distributions but cannot be derived from the basic Shannon-type inequalities (non-negativity of mutual information, chain rule, submodularity).
Related: The Zhang-Yeung Inequality