Part 6: Modern Extensions and Research Frontiers

Chapter 28: Information-Theoretic Approaches to Machine Learning

Research · ~150 min

Learning Objectives

  • Formulate the information bottleneck as a lossy source coding problem and solve it via the Blahut–Arimoto algorithm
  • Derive mutual information bounds on generalization error and connect them to rate-distortion theory
  • Characterize the communication complexity of distributed statistical estimation and federated learning
  • Analyze the computation capacity of the multiple access channel for over-the-air aggregation
  • Connect information-theoretic tools (entropy, divergence, mutual information) to learning-theoretic concepts (generalization, compression, sample complexity)
  • Evaluate the fundamental limits of federated learning under communication constraints
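The first objective names the Blahut–Arimoto algorithm for lossy source coding. As a concrete reference point, here is a minimal sketch of Blahut–Arimoto for the rate-distortion function R(D): alternately update the conditional Q(y|x) and the reproduction marginal q(y) at a fixed Lagrange multiplier beta. The function name, parameterization, and the binary-Hamming example are illustrative choices, not taken from the chapter itself.

```python
import numpy as np

def blahut_arimoto(p_x, dist, beta, n_iter=200):
    """Blahut-Arimoto iteration for the rate-distortion function R(D).

    p_x  : source distribution over X, shape (nx,)
    dist : distortion matrix d(x, y), shape (nx, ny)
    beta : Lagrange multiplier trading rate against distortion
    Returns (rate_in_bits, expected_distortion) at this beta.
    """
    nx, ny = dist.shape
    q_y = np.full(ny, 1.0 / ny)  # initial reproduction marginal q(y)
    for _ in range(n_iter):
        # Conditional update: Q(y|x) proportional to q(y) * exp(-beta * d(x, y))
        Q = q_y[None, :] * np.exp(-beta * dist)
        Q /= Q.sum(axis=1, keepdims=True)
        # Marginal update: q(y) = sum_x p(x) Q(y|x)
        q_y = p_x @ Q
    # Rate = I(X;Y) in bits; distortion = E[d(X, Y)] under p(x) Q(y|x)
    log_ratio = np.where(Q > 0, np.log2(Q / q_y[None, :]), 0.0)
    rate = float(np.sum(p_x[:, None] * Q * log_ratio))
    d_avg = float(np.sum(p_x[:, None] * Q * dist))
    return rate, d_avg

# Sanity check: fair binary source with Hamming distortion, where the
# closed form R(D) = 1 - H_b(D) is known for D < 1/2.
p_x = np.array([0.5, 0.5])
dist = np.array([[0.0, 1.0],
                 [1.0, 0.0]])
rate, d_avg = blahut_arimoto(p_x, dist, beta=3.0)
```

Sweeping beta traces out the full R(D) curve; the same alternating-minimization structure, with q(y) replaced by a cluster marginal and an extra relevance term, underlies the information bottleneck iterations discussed in this chapter.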
