References & Further Reading
References
- S. Li, M. A. Maddah-Ali, Q. Yu, and A. S. Avestimehr, A Fundamental Tradeoff Between Computation and Communication in Distributed Computing, IEEE Transactions on Information Theory, 2018
The paper that established the $(\mu, \Delta)$ tradeoff of this chapter. Essential reading for understanding the achievability construction and the matching cut-set converse.
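The tradeoff is compact enough to state here. A sketch in the paper's own notation, with K nodes and integer computation load r (each file mapped by r nodes); the chapter's $\mu$ presumably corresponds to the per-node storage fraction $r/K$:

```latex
% Optimal communication load of coded distributed computing
% (Li--Maddah-Ali--Yu--Avestimehr): K nodes, computation load r.
L^{*}(r) \;=\; \frac{1}{r}\left(1 - \frac{r}{K}\right),
\qquad r \in \{1, \dots, K\}
```

Scaling the redundancy r by a factor c cuts the shuffle load by roughly the same factor, which is the computation-communication tradeoff named in the title.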
- Q. Yu, M. A. Maddah-Ali, and A. S. Avestimehr, Polynomial Codes: An Optimal Design for High-Dimensional Coded Matrix Multiplication, Advances in Neural Information Processing Systems (NeurIPS), 2017. [Link]
The polynomial-code construction that reaches the information-theoretic recovery threshold for coded matrix multiplication. Chapter 5 of this book develops the result in detail; Chapter 2 uses it to illustrate the general framework.
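The construction is concrete enough to demo in a few lines. A minimal numpy sketch with toy parameters of my own choosing (p row blocks of A, q column blocks of B, N workers): each worker multiplies two encoded blocks, and any pq = 4 of the N = 5 results suffice to interpolate the product, so one straggler is tolerated.

```python
import numpy as np

rng = np.random.default_rng(0)
p = q = 2                       # row blocks of A, column blocks of B
m = 4                           # block dimension (toy size)
A = rng.standard_normal((p * m, m))
B = rng.standard_normal((m, q * m))
A_blocks = np.split(A, p, axis=0)
B_blocks = np.split(B, q, axis=1)

# Encoding: worker i evaluates A~(x) = sum_j A_j x^j and
# B~(x) = sum_k B_k x^(p k) at its own point x_i, then multiplies.
N = 5
xs = np.arange(1.0, N + 1)
results = {}
for i, x in enumerate(xs):
    At = sum(Aj * x**j for j, Aj in enumerate(A_blocks))
    Bt = sum(Bk * x**(p * k) for k, Bk in enumerate(B_blocks))
    results[i] = At @ Bt        # one small multiply per worker

# Decoding: each entry of A~(x)B~(x) is a degree-(pq-1) polynomial in x,
# so any pq responses determine the coefficients via a Vandermonde solve.
fast = [0, 2, 3, 4]             # pretend worker 1 straggled
V = np.vander(xs[fast], p * q, increasing=True)
stacked = np.stack([results[i] for i in fast]).reshape(p * q, -1)
coeffs = np.linalg.solve(V, stacked).reshape(p * q, m, m)

# The coefficient of x^(j + p k) is exactly the block product A_j B_k.
C = np.block([[coeffs[j + p * k] for k in range(q)] for j in range(p)])
assert np.allclose(C, A @ B)
```

The key design choice is the exponent pattern (x^j for A, x^(pk) for B), which makes the pq block products land on pq distinct monomials, giving the recovery threshold the annotation refers to.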
- R. Ahlswede, N. Cai, S.-Y. R. Li, and R. W. Yeung, Network Information Flow, IEEE Transactions on Information Theory, 2000
The foundational paper on network coding. Introduces the butterfly network and demonstrates that coding at intermediate nodes can exceed the rates of routing alone — the conceptual ancestor of coded distributed computing.
- T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley-Interscience, 2nd ed., 2006
Standard reference for the entropy, mutual information, and cut-set tools used in Sections 2.1–2.4.
- A. El Gamal and Y.-H. Kim, Network Information Theory, Cambridge University Press, 2011
Complete treatment of multi-terminal coding theorems, including the cut-set bound in its most general form.
- M. Goldenbaum, H. Boche, and S. Stańczak, Harnessing Interference for Analog Function Computation in Wireless Sensor Networks, IEEE Transactions on Signal Processing, 2013
Reference for over-the-air computation, the paradigm that Chapter 16 builds on. Demonstrates that the superposition property of the multiple-access channel lets a receiver compute aggregates directly in the analog domain, short-circuiting the digital communicate-then-sum pipeline.
- S. Li and A. S. Avestimehr, Coded Computing: Mitigating Fundamental Bottlenecks in Large-Scale Distributed Computing and Machine Learning, Foundations and Trends in Communications and Information Theory, 2020
The definitive survey of coded computing at the time of writing. Chapters 2 and 5 of this book share notation and framing with this survey.
- R. Tandon, Q. Lei, A. G. Dimakis, and N. Karampatziakis, Gradient Coding: Avoiding Stragglers in Distributed Learning, Proceedings of the International Conference on Machine Learning (ICML), 2017. [Link]
Introduces gradient coding as a coded-computing technique for distributed gradient descent. Chapter 6 of this book develops the result within the Chapter 2 framework.
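The simplest scheme from the paper, fractional repetition, fits in a short sketch (my illustration with toy parameters, not the paper's code): workers are split into groups of s+1 that all send the same partial sum, so any s stragglers still leave one responder per group.

```python
import numpy as np

rng = np.random.default_rng(1)
n, s = 6, 2                     # workers, stragglers tolerated
g = s + 1                       # group size; requires (s+1) | n
d = 4                           # gradient dimension (toy size)
grads = rng.standard_normal((n, d))   # one partial gradient per partition

# Encoding (fractional repetition): every worker in group j computes and
# sends the same message, the sum of group j's s+1 partial gradients.
def worker_msg(i):
    j = i // g
    return grads[j * g:(j + 1) * g].sum(axis=0)

# Decoding: any n - s responses include at least one worker from every
# group, so pick one survivor per group and add their messages.
stragglers = {1, 2}             # both in group 0; still decodable
chosen = {}
for i in (w for w in range(n) if w not in stragglers):
    chosen.setdefault(i // g, worker_msg(i))
full_grad = sum(chosen.values())
assert np.allclose(full_grad, grads.sum(axis=0))
```

The replication factor s+1 is exactly the per-worker computation overhead paid for straggler tolerance, mirroring the storage-for-communication exchange in the Chapter 2 framework.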
- A. G. Dimakis, P. B. Godfrey, Y. Wu, M. J. Wainwright, and K. Ramchandran, Network Coding for Distributed Storage Systems, IEEE Transactions on Information Theory, 2010
Connects distributed computing to distributed storage. Many of the storage/communication tradeoff tools in Part II of this book have their origins in the distributed-storage literature.
- M. A. Maddah-Ali and U. Niesen, Fundamental Limits of Caching, IEEE Transactions on Information Theory, 2014
The fundamental-limits-of-caching result whose achievability construction (centralized coded caching) directly mirrors the coded-shuffling construction of §2.3.
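For reference, the centralized achievable rate from that paper, with K users, a library of N files, and a cache of M files per user:

```latex
% Maddah-Ali--Niesen centralized coded-caching rate
% (for M a multiple of N/K):
R(M) \;=\; K\left(1 - \frac{M}{N}\right)\cdot\frac{1}{1 + KM/N}
```

The first factor is the local caching gain; the multiplicative $1/(1+KM/N)$ global gain comes from the same coded-multicast opportunity that the coded-shuffling construction of §2.3 exploits.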
- P. Kairouz, H. B. McMahan, et al., Advances and Open Problems in Federated Learning, Foundations and Trends in Machine Learning, 2021
Comprehensive survey of the federated-learning setting in which the $(\mu, \Delta, K)$ framework of this chapter is routinely augmented with privacy and Byzantine parameters.
Further Reading
Resources for readers who want to go deeper into the network information theory that underlies this book.
Cut-set bounds at full generality
El Gamal & Kim, *Network Information Theory*, Ch. 4
The cleanest modern treatment of the cut-set theorem and its specializations. Worth studying before Chapter 13, where PIR converse arguments depend on more delicate cut choices.
The connection between coded computing and network coding
R. Dougherty, C. Freiling, and K. Zeger, 'Insufficiency of Linear Coding in Network Information Flow,' IEEE T-IT, 2005
Shows that linear codes are insufficient for general network information flow, exhibiting networks whose capacity no linear code achieves. Relevant context for understanding why polynomial codes (linear over an extension field) nonetheless work for coded matrix multiplication.
Information-theoretic security foundations
Book ITA, Chapter 20 (Secrecy Capacity)
Sets up the entropy-based privacy guarantees that Chapter 10 will require. Readers planning to focus on Part III should work through the ITA secrecy chapter first.
Multi-terminal source coding for federated data
Book ITA, Chapter 7 (Distributed Source Coding)
Slepian–Wolf coding and its variants are the source-coding dual of the channel-coding view developed here. Part V of this book occasionally combines the two (e.g., joint source-channel coding for analog FL).