Chapter 18 Summary
Key Points
1. Non-linear coded computing (§18.1): polynomial approximation gives recovery-threshold guarantees for smooth functions, but breaks down for non-smooth ones such as ReLU. Learned coded computing handles arbitrary functions empirically but lacks a theoretical foundation; information-theoretic optimality remains open.
2. Three-axis frontier (§18.2): the joint privacy-robustness-DP setting has a known lower bound, achievable via serial composition at a cost; jointly optimal schemes remain open. CommIT research extends ByzSecAgg with DP.
3. Decentralized FL (§18.3): D-SGD over graphs matches the centralized asymptotic rate, with graph-dependent constants. CCESA-style sparsification, AirComp-over-gossip, and graph-aware coded computing are open directions.
4. Modern ML (§18.4): coded attention, coded MoE, and federated RAG bridge the book's framework to transformer-era workloads; quantum PIR offers strict rate advantages in some regimes. All are active research directions.
5. The golden thread persists: privacy, robustness, and communication efficiency remain fundamentally coupled across all five parts. The book closes by mapping the open frontier, not by wrapping it up.
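Point 1's contrast between smooth and non-smooth targets can be checked numerically. The sketch below uses a plain least-squares polynomial fit as a stand-in for the chapter's approximation schemes; the degree, interval, and grid are arbitrary choices, not values from the text.

```python
import numpy as np

# Illustration of the smooth vs. non-smooth contrast (degree and interval
# are arbitrary choices, not values from the chapter).
xs = np.linspace(-1.0, 1.0, 201)
deg = 8

def poly_fit_max_error(f, xs, deg):
    """Max absolute error of a least-squares polynomial fit of f on xs."""
    coeffs = np.polyfit(xs, f(xs), deg)
    return np.max(np.abs(np.polyval(coeffs, xs) - f(xs)))

smooth_err = poly_fit_max_error(np.exp, xs, deg)                      # smooth target
relu_err = poly_fit_max_error(lambda x: np.maximum(x, 0.0), xs, deg)  # ReLU

print(f"smooth: {smooth_err:.2e}, ReLU: {relu_err:.2e}")
```

The smooth fit converges rapidly with the degree, while ReLU's kink leaves an error that shrinks only polynomially, which is the failure mode the summary points to.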
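The serial-composition route in point 2 can be sketched as a two-stage pipeline. This is an illustrative construction, not the chapter's scheme: the function names, field size, quantization scale, and noise parameters are all assumptions. Each client first clips and adds Gaussian DP noise locally, then additively secret-shares the noised update over a prime field, so any single server sees only a uniform-looking share.

```python
import numpy as np

rng = np.random.default_rng(0)
P = 2**31 - 1      # prime field size (illustrative choice)
SCALE = 2**16      # fixed-point quantization scale (illustrative choice)

def dp_then_share(update, clip=1.0, sigma=0.5, n_servers=3):
    """Clip + Gaussian DP noise, then additive secret sharing mod P."""
    g = update * min(1.0, clip / (np.linalg.norm(update) + 1e-12))  # clip
    g = g + rng.normal(0.0, sigma * clip, size=g.shape)             # local DP noise
    q = np.round(g * SCALE).astype(np.int64) % P                    # quantize to field
    shares = [rng.integers(0, P, size=q.shape) for _ in range(n_servers - 1)]
    shares.append((q - sum(shares)) % P)                            # shares sum to q mod P
    return shares

def reconstruct(shares):
    """Sum the shares and map back from the field to the reals."""
    q = sum(shares) % P
    q = np.where(q > P // 2, q - P, q)   # recover signed values
    return q / SCALE

shares = dp_then_share(np.array([0.3, -0.7]))
print(reconstruct(shares))   # the clipped, DP-noised update
```

The "at a cost" caveat shows up here concretely: the two stages compose their overheads (noise for DP plus sharing for privacy/robustness) rather than sharing them, which is exactly what a jointly optimal scheme would improve.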
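Point 3's D-SGD iteration fits in a few lines. The sketch below is a toy instance, not the chapter's algorithm: the ring topology, step size, and quadratic local losses are illustrative assumptions. Each node mixes with its neighbours via a doubly stochastic matrix W, then takes a local gradient step.

```python
import numpy as np

# Toy D-SGD: 5 nodes on a ring, doubly stochastic mixing matrix W,
# local quadratic losses f_i(x) = 0.5 * (x - b_i)^2 (all illustrative).
n, T, eta = 5, 400, 0.05
b = np.arange(n, dtype=float)            # local optima; global optimum is mean(b)

W = np.zeros((n, n))
for i in range(n):                        # each node averages self + two neighbours
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

x = np.zeros(n)
for _ in range(T):
    x = W @ x - eta * (x - b)             # gossip step, then local gradient step

print(x)   # every node lands near mean(b) = 2.0, up to a step-size residual
```

The graph-dependent constants from the summary appear here as the spectral gap of W: a sparser or worse-connected graph slows consensus, while the asymptotic rate matches the centralized baseline.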
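A minimal instance of the "coded attention" idea in point 4, under the assumption that only the linear score computation Q @ K.T is coded (shapes, split, and evaluation points below are illustrative): because the map is linear in Q, Q's row-blocks can be encoded with a degree-1 polynomial (MDS) code, so any 2 of N worker results reconstruct the full score matrix, tolerating N - 2 stragglers.

```python
import numpy as np

# Toy coded computation of attention scores Q @ K.T. Q is split into two
# row-blocks and encoded as Q(x) = Q0 + x*Q1; worker i computes
# Q(x_i) @ K.T, a degree-1 polynomial in x, so ANY two finished workers
# suffice to decode (all shapes and points are illustrative).
rng = np.random.default_rng(1)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((6, 8))
Q0, Q1 = Q[:2], Q[2:]

N = 4
xs = np.array([1.0, 2.0, 3.0, 4.0])                 # distinct evaluation points
results = {i: (Q0 + xs[i] * Q1) @ K.T for i in range(N)}

# Suppose only workers 1 and 3 respond: interpolate
# P(x) = Q0 @ K.T + x * (Q1 @ K.T) from the two evaluations.
i, j = 1, 3
P1 = (results[j] - results[i]) / (xs[j] - xs[i])    # slope:     Q1 @ K.T
P0 = results[i] - xs[i] * P1                        # intercept: Q0 @ K.T
scores = np.vstack([P0, P1])
print(np.allclose(scores, Q @ K.T))
```

This only covers the linear part of attention; the softmax is non-linear, which is precisely why point 1's smooth/non-smooth gap makes fully coded attention an open problem rather than a solved one.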