Prerequisites & Notation
Before You Begin
This chapter builds on classical information theory (entropy, rate-distortion, channel coding) and the information bottleneck framework from Chapter 28. We also draw on the joint source-channel coding theorem from Chapter 19. Familiarity with basic neural network concepts (autoencoders, end-to-end training) is helpful but not required.
- Rate-distortion theory (Review ita/ch06)
  - Self-check: Can you state the rate-distortion function and explain the achievability proof?
- Channel coding theorem (Review ita/ch09)
  - Self-check: Can you state the channel coding theorem for a DMC?
- Joint source-channel coding (separation theorem) (Review ita/ch19)
  - Self-check: Can you state Shannon's separation theorem and explain when it is optimal?
- Information bottleneck (Review ita/ch28)
  - Self-check: Can you write the IB Lagrangian and explain the relevance-compression tradeoff?
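For quick reference, the definitions behind these self-checks, stated in the standard form used in the earlier chapters (a refresher sketch, not a substitute for the full treatments there):

```latex
% Rate-distortion function (ch06): minimum rate achieving expected distortion D
R(D) = \min_{p(\hat{x}\mid x)\,:\,\mathbb{E}[d(X,\hat{X})]\le D} I(X;\hat{X})

% Channel capacity of a DMC (ch09)
C = \max_{p(x)} I(X;Y)

% Separation theorem (ch19): with matched source/channel rates, distortion D
% is achievable if R(D) < C, and unachievable if R(D) > C

% Information bottleneck Lagrangian (ch28): compress X into T while
% retaining information about the relevance variable Y
\mathcal{L}_{\mathrm{IB}} = I(X;T) - \beta\, I(T;Y)
```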
Notation for This Chapter
We introduce notation for semantic communication that extends the standard information-theoretic framework.
| Symbol | Meaning | Introduced |
|---|---|---|
| $R(D)$ | Rate-distortion function | ch06 |
| $C$ | Channel capacity | ch09 |
| $U$ | Utility function (task performance metric) | s01 |
| $R(U)$ | Rate-utility function: minimum rate to achieve utility level $U$ | s01 |
| $d_s$ | Semantic distortion measure | s03 |
| $I(X;Y)$ | Mutual information | ch01 |
| $H(X)$ | Shannon entropy | ch01 |
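The last two rows are computable directly from a joint distribution. A minimal sketch (assuming NumPy; the function names here are illustrative, not from the book):

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(X) in bits of a pmf p; zero-probability entries are skipped."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mutual_information(pxy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint pmf given as a matrix pxy."""
    px = pxy.sum(axis=1)   # marginal of X (rows)
    py = pxy.sum(axis=0)   # marginal of Y (columns)
    return entropy(px) + entropy(py) - entropy(pxy.ravel())

# Example: X ~ Uniform{0,1} through a binary symmetric channel with flip prob 0.1
pxy = np.array([[0.45, 0.05],
                [0.05, 0.45]])
print(round(entropy(pxy.sum(axis=1)), 3))   # H(X) = 1.0 bit
print(round(mutual_information(pxy), 3))    # I(X;Y) = 1 - H2(0.1) ≈ 0.531 bits
```

The same identities underlie $R(D)$ and $C$, which optimize $I(X;\hat{X})$ and $I(X;Y)$ over conditional and input distributions respectively.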