Graph-Theoretic Formulations
Why a Graph?
The key to making index coding tractable is to reframe it as a graph problem. Two receivers are "compatible" (can be served by a single transmission) when their demands are covered by each other's side information in a sense we will make precise. Non-compatibility is an edge; compatibility is an anti-edge. The resulting graph encodes the combinatorial structure of the instance.
Colorings, cliques, and independent sets of this graph directly give us upper and lower bounds on the broadcast rate. For the MAN family, the bounds coincide and yield the rate formula of Chapter 2.
Definition: The Conflict Graph of an Index Coding Instance
Given an index coding instance with $n$ single-demand receivers, where receiver $i$ demands message $x_i$ and holds side information $S_i \subseteq \{x_1, \dots, x_n\} \setminus \{x_i\}$, the conflict graph $\mathcal{C}$ is defined by:
- $V(\mathcal{C}) = \{1, \dots, n\}$: one vertex per receiver.
- $\{i, j\} \in E(\mathcal{C})$ (an edge) iff $x_j \notin S_i$ and $x_i \notin S_j$. That is, receivers $i$ and $j$ conflict when neither has the other's desired message as side information.
Equivalently, $\{i, j\}$ is a non-edge (anti-edge) iff $x_j \in S_i$ or $x_i \in S_j$, i.e. at least one of them already knows the other's demand.
Some references use the side-information graph $G$ (directed) instead: directed edge $(i, j)$ iff $x_j \in S_i$. The undirected conflict graph here is the "symmetric complement" of $G$ and is what governs the clique-cover upper bound.
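The definition above can be checked mechanically. A minimal sketch in plain Python (the function name and the small 4-receiver instance are ours, chosen purely for illustration):

```python
from itertools import combinations

def conflict_graph(side_info):
    """Edge set of the conflict graph: receivers i and j (0-indexed;
    receiver i demands message i) conflict iff neither holds the
    other's demanded message in its side information."""
    n = len(side_info)
    return {frozenset((i, j)) for i, j in combinations(range(n), 2)
            if j not in side_info[i] and i not in side_info[j]}

# Illustrative 4-receiver instance: side_info[i] lists held message indices.
S = [{1, 3}, {0, 2}, {1, 3}, {0, 2}]
edges = conflict_graph(S)
print(sorted(tuple(sorted(e)) for e in edges))  # [(0, 2), (1, 3)]
```

Note that the conflict graph depends only on the membership pattern of demands in side-information sets, not on the message contents.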
Theorem: Chromatic Number Upper-Bounds Index Coding Rate
For any single-demand index coding instance with conflict graph $\mathcal{C}$, the optimal broadcast rate $\beta$ is upper-bounded by the fractional chromatic number: $\beta \le \chi_f(\mathcal{C}) \le \chi(\mathcal{C})$.
Partition the receivers into independent sets (color classes) of $\mathcal{C}$. Each color class is a group of receivers whose demands are satisfiable by a single XOR-coded transmission: within an independent set every pair is non-conflicting, and in the MAN instances of interest each receiver in a class in fact holds every other class member's demand in its side information.
Color the graph
Let $c : V(\mathcal{C}) \to \{1, \dots, \chi(\mathcal{C})\}$ be a proper coloring. For each color $m$, let $I_m = c^{-1}(m)$, an independent set in $\mathcal{C}$.
One XOR per color class
Within $I_m$, every pair $\{i, j\}$ has $x_j \in S_i$ or $x_i \in S_j$ (non-adjacency in $\mathcal{C}$). The server transmits $z_m = \bigoplus_{i \in I_m} x_i$. Each receiver $i \in I_m$ recovers $x_i$ by XORing out the other summands (all in its side information; pairwise non-adjacency alone gives only one direction per pair, and the mutual-knowledge strengthening needed here is a small additional argument handled by the MAN structure; see Lemma 4.1 of Bar-Yossef et al.).
Count transmissions
Total rate: $\chi(\mathcal{C})$ file units. The fractional chromatic number $\chi_f(\mathcal{C}) \le \chi(\mathcal{C})$ is achievable by time-sharing the classes of a fractional coloring.
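The proof steps translate directly into a scheme generator: greedily color the conflict graph and emit one XOR per color class. This is a sketch (names ours); greedy coloring only upper-bounds $\chi(\mathcal{C})$, so it certifies an achievable rate, not the optimum.

```python
def xor_schedule(side_info):
    """Greedy proper coloring of the conflict graph; each color class
    becomes one XOR-coded transmission. Returns the color classes."""
    n = len(side_info)
    conflict = lambda i, j: j not in side_info[i] and i not in side_info[j]
    color = {}
    for v in range(n):
        taken = {color[u] for u in range(v) if conflict(u, v)}
        color[v] = next(c for c in range(n) if c not in taken)
    classes = {}
    for v, c in color.items():
        classes.setdefault(c, []).append(v)
    return list(classes.values())  # server transmits the XOR of each class

print(xor_schedule([{1, 3}, {0, 2}, {1, 3}, {0, 2}]))  # [[0, 1], [2, 3]]
```

On the illustrative 4-receiver instance, two classes suffice, i.e. two coded transmissions instead of four uncoded ones.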
Tightness question
The bound is not always tight. For general graphs, $\beta < \chi_f(\mathcal{C})$ is possible (gaps have been constructed). For MAN conflict graphs, however, $\beta = \chi_f(\mathcal{C}) = \chi(\mathcal{C})$ (§4.4).
Conflict Graph of a MAN Instance
Visualize the MAN conflict graph for small values of the user count $K$ and cache parameter $t = KM/N$. Each node is a (user, missing-subfile) pair; edges connect node pairs that cannot share a single delivery message. The number of independent sets needed to cover the graph equals the index-coding rate (in subfile units).
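For concreteness, a small sketch that enumerates this graph for given $K$ and $t$ under distinct demands (function name ours; the vertex and edge conventions follow the definition above):

```python
from itertools import combinations

def man_conflict_graph(K, t):
    """Sketch: conflict graph of a MAN instance with K users, cache
    parameter t, and distinct demands. One vertex per (user k,
    missing subfile T) with |T| = t and k not in T; two distinct
    vertices conflict iff neither user caches the other's subfile."""
    verts = [(k, T) for k in range(K)
             for T in combinations([u for u in range(K) if u != k], t)]
    edges = {(a, b) for a, b in combinations(verts, 2)
             if a[0] not in b[1] and b[0] not in a[1]}
    return verts, edges

verts, edges = man_conflict_graph(3, 1)
print(len(verts))  # 6 vertices: K * C(K-1, t)

# The MAN delivery classes {(k, S \ {k}) : k in S}, one per (t+1)-subset
# S of users, are independent sets that cover all vertices.
for Sset in combinations(range(3), 2):
    cls = [(k, tuple(u for u in Sset if u != k)) for k in Sset]
    assert all((a, b) not in edges and (b, a) not in edges
               for a, b in combinations(cls, 2))
```

Each delivery class corresponds to one coded transmission of the MAN scheme, matching the clique-cover upper bound.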
Definition: Broadcast Rate and Minrank
The broadcast rate $\beta$ of an index coding instance is the optimal rate over all (possibly non-linear) codes. The linear broadcast rate $\beta_{\text{lin}}$ restricts to linear codes over a fixed finite field. A foundational result (Bar-Yossef et al. 2011) shows $\beta_{\text{lin}} = \operatorname{minrk}_2(G)$, where $\operatorname{minrk}_2(G)$ is the minimum rank over $\mathbb{F}_2$ of a matrix $M$ with $M_{ii} \ne 0$ for all $i$ and $M_{ij} = 0$ whenever $i \ne j$ and $(i, j) \notin E(G)$ (the side-information graph), free otherwise.
In general, $\beta \le \beta_{\text{lin}}$, and the gap can be nonzero: there are index coding instances where non-linear codes strictly outperform linear codes.
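To make the minrank definition concrete, here is a brute-force computation over $\mathbb{F}_2$ for tiny instances (names ours; exponential in the number of side-information entries, so illustrative only):

```python
from itertools import product

def rank_gf2(M):
    """Rank of a 0/1 matrix over GF(2), by Gaussian elimination."""
    M = [row[:] for row in M]
    rank = 0
    for col in range(len(M[0])):
        piv = next((r for r in range(rank, len(M)) if M[r][col]), None)
        if piv is None:
            continue
        M[rank], M[piv] = M[piv], M[rank]
        for r in range(len(M)):
            if r != rank and M[r][col]:
                M[r] = [a ^ b for a, b in zip(M[r], M[rank])]
        rank += 1
    return rank

def minrank_gf2(side_info):
    """minrk_2 of the side-information graph: minimum GF(2) rank of M
    with M[i][i] = 1, and M[i][j] (i != j) allowed to be nonzero only
    when receiver i holds message j."""
    n = len(side_info)
    free = [(i, j) for i in range(n) for j in side_info[i] if j != i]
    best = n
    for bits in product((0, 1), repeat=len(free)):
        M = [[int(i == j) for j in range(n)] for i in range(n)]
        for (i, j), b in zip(free, bits):
            M[i][j] = b
        best = min(best, rank_gf2(M))
    return best

print(minrank_gf2([{1, 3}, {0, 2}, {1, 3}, {0, 2}]))  # 2
```

Each row of a rank-$r$ fitting matrix gives one linear combination the server broadcasts; $r$ transmissions suffice, and the theorem says no linear scheme does better.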
Theorem: Independence Number Lower-Bounds Index Coding Rate
For any single-demand index coding instance with side-information graph $G$ and conflict graph $\mathcal{C}$, $\beta \ge \alpha(G)$, where $\alpha(G)$ is the independence number of (the undirected version of) the side-information graph; equivalently, $\alpha(G) = \omega(\mathcal{C})$, the clique number of the conflict graph.
Let $I$ be a maximum independent set of $G$, i.e. a maximum clique of $\mathcal{C}$. Any two receivers in $I$ are conflicting: neither holds the other's demand, so no coded transmission can serve two of them "for free". The information delivered to each receiver $i \in I$ retains its full entropy $H(x_i)$, so the broadcast must contain at least $|I|$ distinct file units of information from the server's side. Rate $\ge |I| = \alpha(G)$.
Choose an independent set
Let $I$ be a maximum independent set of $G$. By definition, for each pair $i \ne j$ in $I$, $x_j \notin S_i$ and $x_i \notin S_j$: no receiver in $I$ holds any other's demand.
Information bottleneck
Give the receivers in $I$, as a genie, all messages outside $\{x_i : i \in I\}$. Since no receiver in $I$ holds a message demanded within $I$, each side-information set $S_i$, $i \in I$, is contained in the genie messages. Receiver $i$'s decoder then recovers $x_i$ from the broadcast $z$ together with the genie messages, so $z$ plus the genie messages determine all of $\{x_i : i \in I\}$.
Union bound
Summing over the receivers in $I$: since the messages are mutually independent, $H(z) \ge H(z \mid \text{genie}) \ge H(x_I \mid \text{genie}) = \sum_{i \in I} H(x_i) = |I|$ file units of distinct demanded-message entropy. Hence rate $\ge |I| = \alpha(G)$.
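The lower bound can likewise be checked by brute force on small instances: search for the largest set of receivers no two of which hold each other's demand (an independent set of the side-information graph, i.e. a clique of the conflict graph). A sketch, names ours:

```python
from itertools import combinations

def alpha_lower_bound(side_info):
    """Size of the largest set of receivers that are pairwise mutually
    ignorant of each other's demands; the broadcast rate is at least
    this number. Exhaustive search; tiny instances only."""
    n = len(side_info)
    for r in range(n, 0, -1):
        for I in combinations(range(n), r):
            if all(j not in side_info[i] and i not in side_info[j]
                   for i, j in combinations(I, 2)):
                return r
    return 0

print(alpha_lower_bound([{1, 3}, {0, 2}, {1, 3}, {0, 2}]))  # 2
```

On the illustrative 4-receiver instance the lower bound is 2, matching the 2-transmission scheme from the coloring argument, so both bounds are tight there.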
Key Takeaway
The broadcast rate is sandwiched between $\alpha(G) = \omega(\mathcal{C})$ and $\chi(\mathcal{C})$: $\alpha(G) \le \beta \le \chi_f(\mathcal{C}) \le \chi(\mathcal{C})$. For MAN conflict graphs, the sandwich closes: $\alpha(G) = \beta = \chi(\mathcal{C})$. For general index coding, both the upper and lower bounds can be loose, and the quantities involved ($\chi$, minrank) are NP-hard to compute.
Example: An Index Coding Instance with $\beta = 2$
Consider 4 receivers with the following side information: $S_1 = \{x_2, x_4\}$, $S_2 = \{x_1, x_3\}$, $S_3 = \{x_2, x_4\}$, $S_4 = \{x_1, x_3\}$. Each receiver $i$ wants message $x_i$ with $H(x_i) = 1$ file unit. Compute the conflict graph, its chromatic number, and the optimal broadcast rate.
Conflict graph edges
Check each pair: $\{1,2\}$: $x_2 \in S_1$, so non-edge. $\{1,3\}$: $x_3 \notin S_1$; $x_1 \notin S_3$. Edge. $\{1,4\}$: $x_4 \in S_1$, non-edge. $\{2,3\}$: $x_3 \in S_2$, non-edge. $\{2,4\}$: $x_4 \notin S_2$; $x_2 \notin S_4$. Edge. $\{3,4\}$: $x_4 \in S_3$, non-edge.
Conflict graph: two edges $\{1,3\}$ and $\{2,4\}$, forming two disjoint pairs.
Chromatic number
A graph consisting of two disjoint edges is bipartite, so $\chi(\mathcal{C}) = 2$, with color classes $\{1, 2\}$ and $\{3, 4\}$.
Optimal broadcast
With $\chi(\mathcal{C}) = 2$, transmit two XORs: $x_1 \oplus x_2$ and $x_3 \oplus x_4$. Each receiver recovers its demand using its side information ($x_2 \in S_1$, $x_1 \in S_2$, $x_4 \in S_3$, $x_3 \in S_4$). Rate: $2$ file units.
Comparison with uncoded
Uncoded: $4$ transmissions. Coded: $2$. Gain: factor $2$. This instance exemplifies why coding beats unicasting even for simple graphs.
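As a sanity check, the two-XOR scheme can be simulated at bit level (side-information sets as assumed in this example, re-indexed from 0):

```python
import random

# Side information as 0-indexed message-index sets, assumed for illustration.
S = [{1, 3}, {0, 2}, {1, 3}, {0, 2}]
x = [random.randrange(2) for _ in range(4)]   # one random bit per message

z = (x[0] ^ x[1], x[2] ^ x[3])                # the two broadcast XORs

# Receiver i cancels the known summand; each bit used lies in S[i].
decoded = [z[0] ^ x[1], z[0] ^ x[0], z[1] ^ x[3], z[1] ^ x[2]]
assert decoded == x
print("all four receivers decode correctly")
```

Two broadcast bits serve four one-bit demands, realizing the factor-2 gain over unicast.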
Common Mistake: The Chromatic Number Is Not Always Tight
Mistake:
Assuming the index coding rate always equals the chromatic number.
Correction:
$\chi(\mathcal{C})$ is an upper bound on $\beta$; it is not always tight. Lubetzky and Stav (2009) constructed index coding instances in which $\beta$ is polynomially smaller than the linear optimum $\beta_{\text{lin}} = \operatorname{minrk}_2$, and hence than $\chi(\mathcal{C})$. The gap is due to non-linear codes outperforming any linear (XOR-based) scheme.
For MAN conflict graphs, linear XOR codes are optimal and the relevant chromatic, fractional chromatic, and independence numbers coincide, hence the clean rate formula of Chapter 2.