Part 1: Foundations: Entropy, Divergence, and Typicality
Chapter 1: Information Measures for Discrete Random Variables
Foundational · ~210 min
Learning Objectives
- Define entropy, joint entropy, conditional entropy, and mutual information for discrete random variables
- State and prove the information inequality via Jensen's inequality
- Compute KL divergence and understand its role as the mother of all information inequalities (see the numerical sketch after this list)
- Apply the data processing inequality and Fano's inequality in converse arguments
- Characterize maximum entropy distributions under moment constraints
- Recognize convexity/concavity properties that make capacity optimization tractable
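As a numerical anchor for the first three objectives, here is a minimal sketch (assuming NumPy; the joint pmf and function names are illustrative and not taken from the chapter) that computes entropy, joint and conditional entropy, mutual information, and KL divergence for a small discrete pair, and checks the identity I(X;Y) = D(p_XY || p_X p_Y):

```python
import numpy as np

# Illustrative joint pmf of (X, Y) over a 2x2 alphabet; rows index X, columns index Y.
p_xy = np.array([[0.4, 0.1],
                 [0.2, 0.3]])

p_x = p_xy.sum(axis=1)  # marginal pmf of X
p_y = p_xy.sum(axis=0)  # marginal pmf of Y

def entropy(p):
    """Shannon entropy in bits; 0 * log 0 is treated as 0."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def kl_divergence(p, q):
    """D(p || q) in bits; assumes q > 0 wherever p > 0."""
    mask = p > 0
    return float((p[mask] * np.log2(p[mask] / q[mask])).sum())

H_X = entropy(p_x)
H_Y = entropy(p_y)
H_XY = entropy(p_xy.ravel())      # joint entropy H(X, Y)
H_X_given_Y = H_XY - H_Y          # chain rule: H(X|Y) = H(X,Y) - H(Y)
I_XY = H_X + H_Y - H_XY           # mutual information I(X;Y)

# I(X;Y) equals the KL divergence between the joint pmf and the product of marginals.
I_via_kl = kl_divergence(p_xy.ravel(), np.outer(p_x, p_y).ravel())

print(f"H(X)={H_X:.4f}  H(Y)={H_Y:.4f}  H(X,Y)={H_XY:.4f}")
print(f"H(X|Y)={H_X_given_Y:.4f}  I(X;Y)={I_XY:.4f}  via KL: {I_via_kl:.4f}")
```

All quantities use base-2 logarithms, so the results are in bits; the two computations of I(X;Y) should agree, which is one concrete instance of the information inequality D(p || q) >= 0 with equality iff p = q.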