Prerequisites & Notation
Before You Begin
Chapter 2 re-expresses the three challenges of Chapter 1 in the formal language of network information theory. The mathematical prerequisites are modest but concrete: most notably entropy, mutual information, and conditional entropy for discrete sources. Readers who are fluent with Book ITA Chapters 1–3 can proceed directly; others may want to refresh before tackling Section 2.3's converse argument.
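As a quick refresher on those prerequisites, here is a minimal sketch (our illustration, not an example from the book) that computes entropy, conditional entropy, and mutual information for a toy joint distribution; the distribution `p_xy` and the helper `entropy` are hypothetical names chosen for this snippet.

```python
import math

def entropy(p):
    """Shannon entropy in bits of a distribution given as a list of probabilities."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Hypothetical joint distribution: X and Y are binary and perfectly correlated.
p_xy = {(0, 0): 0.5, (1, 1): 0.5}

# Marginals of X and Y.
p_x = [sum(v for (x, _), v in p_xy.items() if x == xv) for xv in (0, 1)]
p_y = [sum(v for (_, y), v in p_xy.items() if y == yv) for yv in (0, 1)]

h_x = entropy(p_x)                      # H(X)
h_y = entropy(p_y)                      # H(Y)
h_xy = entropy(list(p_xy.values()))     # H(X, Y)
h_x_given_y = h_xy - h_y                # H(X | Y) = H(X,Y) - H(Y)
mi = h_x + h_y - h_xy                   # I(X; Y) = H(X) + H(Y) - H(X,Y)
```

For this perfectly correlated pair, observing Y removes all uncertainty about X, so the conditional entropy is zero and the mutual information equals the full bit of H(X).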
- Cut-set bound for multi-terminal networks (review: ch21)
  - Self-check: Can you state the cut-set bound for a three-node relay network?
- Chapter 1 of this book – MapReduce, stragglers, distributed SGD (review: ch01)
  - Self-check: Can you state the computation load and communication load informally?
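For the first self-check, a standard statement of the cut-set bound for the three-node relay channel (textbook notation with source input $X_1$, relay input/output $(X_2, Y_2)$, and destination output $Y_3$; not necessarily this book's notation) is:

```latex
% Cut-set bound for the three-node relay channel:
% the capacity is limited by both the broadcast cut and the multiple-access cut.
C \le \max_{p(x_1, x_2)} \min\bigl\{\,
    I(X_1;\, Y_2, Y_3 \mid X_2),   % broadcast cut: source to {relay, destination}
    \; I(X_1, X_2;\, Y_3)          % multiple-access cut: {source, relay} to destination
\,\bigr\}
```

If you can recover both cuts and explain why each is an upper bound, you are ready for the converse arguments later in the chapter.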
Notation for This Chapter
Chapter 2 is where we formalize the information-theoretic objects that will reappear throughout the book. Keep this table close while reading Sections 2.2–2.4.
| Symbol | Meaning | Introduced |
|---|---|---|
| $X$ | Input dataset as a random variable (uniform over its support) | §2.1 |
| $X_k$ | Portion of $X$ stored at worker $k$ | §2.1 |
| $K$ | Number of workers | §2.1 |
| $Y$ | Output of the distributed computation (scalar or vector) | §2.1 |
| $M_k$ | Message transmitted by worker $k$ (output of its local encoder) | §2.1 |
| $\mu$ | Normalized per-worker storage | §2.2 |
| $L$ | Normalized total communication load (bits / intermediate-file size) | §2.2 |
| $q$ | Recovery threshold: any $q$ responses suffice | §2.2 |
| $R_{\mathrm{comp}}$, $R_{\mathrm{comm}}$ | Computation and communication rates (bits per input sample) | §2.2 |
| $H(\cdot)$ | Shannon entropy | §2.3 |
| $I(\cdot\,;\cdot)$ | Mutual information | §2.3 |
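To make the normalized quantities in the table concrete, here is a toy sketch (our assumption, not the book's worked example) using the well-known coded-MapReduce storage/communication trade-off, $L(r) = \frac{1}{r}\bigl(1 - \frac{r}{K}\bigr)$, where $r$ is the computation load (number of workers storing each file); the value `K = 10` and the helper names are hypothetical.

```python
K = 10  # number of workers (hypothetical value for illustration)

def storage(r, K):
    """Normalized per-worker storage mu when each worker holds r/K of the dataset."""
    return r / K

def comm_load(r, K):
    """Coded-MapReduce communication load L(r) = (1/r) * (1 - r/K)."""
    return (1.0 / r) * (1.0 - r / K)

# Raising the computation load r shrinks the communication load:
loads = {r: comm_load(r, K) for r in (1, 2, 5)}
```

The dictionary `loads` illustrates the trade-off the chapter formalizes: replicating each file across more workers (larger $r$, hence larger $\mu$) buys multicasting opportunities that cut the communication load $L$.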