Prerequisites & Notation
Before You Begin
The Coded Caching book relies on a compact information-theoretic toolkit plus some basic combinatorics. If any item below feels unfamiliar, revisit the linked chapter before proceeding.
- Shannon entropy and mutual information for discrete random variables (Review ch01)
Self-check: Can you state the chain rule for entropy and identify when the joint-entropy bound H(X, Y) ≤ H(X) + H(Y) holds with equality?
- Broadcast channel basics — single sender, multiple receivers (Review ch15)
Self-check: Given a Gaussian BC with two receivers of different SNRs, can you sketch the capacity region?
- Basic probability and expectations over discrete distributions
Self-check: Can you compute E[X] when X is Zipf-distributed over {1, …, N}?
- Binomial coefficients and basic counting
Self-check: Can you state that (n choose k) = (n choose n−k) and sketch why the binomial coefficients for a given n sum to 2^n?
- Linear codes over GF(2) and the XOR operation
Self-check: Given bits a and b, can you recover a if you know a ⊕ b and b?
- Asymptotic notation O(·), o(·), Θ(·)
Self-check: Can you distinguish O(·) from Θ(·) in the context of scaling laws?
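Two of the self-checks above can be sketched in a few lines of code. This is an illustrative sketch, not from the book: `xor_recover` shows why knowing a ⊕ b and b suffices to recover a, and `zipf_expectation` computes the mean of a Zipf-distributed request over a finite library (the exponent `s` is an assumed parameter; `s = 1` is the classic Zipf law).

```python
def xor_recover(a_xor_b: int, b: int) -> int:
    """Recover bit a from (a XOR b) and b, since (a ^ b) ^ b == a."""
    return a_xor_b ^ b

def zipf_expectation(N: int, s: float = 1.0) -> float:
    """E[X] for X Zipf-distributed over {1, ..., N}:
    P(X = n) is proportional to n**(-s)."""
    Z = sum(n ** (-s) for n in range(1, N + 1))  # normalizing constant
    return sum(n * n ** (-s) for n in range(1, N + 1)) / Z

# Quick sanity checks.
a, b = 1, 0
assert xor_recover(a ^ b, b) == a
```

The XOR identity is exactly the mechanism coded caching exploits: a receiver that already holds b can peel it off a coded transmission to extract a.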
Notation for This Chapter
Symbols introduced in Chapter 1. The coded-caching vocabulary sticks around for the rest of the book, so it pays to internalize it now.
| Symbol | Meaning | Introduced |
|---|---|---|
| K | Number of users sharing the bottleneck link | s01 |
| N | Number of files in the library | s01 |
| M | Per-user cache size, measured in file units (0 ≤ M ≤ N) | s01 |
| M/N | Memory ratio: the x-axis of the memory-load tradeoff curve | s01 |
| R | Delivery rate (file units per channel use of the shared link) | s01 |
| p_n | Request probability for file n (popularity distribution) | s02 |
| Z_k | Cache content of user k after the placement phase | s01 |
| d | Demand vector (d_1, …, d_K), where d_k is the file requested by user k | s01 |
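As a quick way to internalize the table, here is a minimal sketch that instantiates the Chapter 1 symbols. All values are illustrative, not from the text, and the demand vector is 0-indexed for code convenience:

```python
K = 4             # number of users sharing the bottleneck link
N = 8             # number of files in the library
M = 2             # per-user cache size, in file units (0 <= M <= N)
m = M / N         # memory ratio: x-axis of the memory-load tradeoff curve
d = [3, 1, 3, 7]  # demand vector: d[k] is the file requested by user k

# Consistency checks implied by the definitions above.
assert 0 <= M <= N
assert len(d) == K and all(0 <= dk < N for dk in d)
print(f"memory ratio m = {m}")  # → memory ratio m = 0.25
```

Note that users may request the same file (here users 0 and 2 both ask for file 3); the demand vector does not need distinct entries.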