Prerequisites & Notation

Before You Begin

This chapter builds on classical information theory (entropy, rate-distortion, channel coding) and the information bottleneck framework from Chapter 28. We also draw on the joint source-channel coding theorem from Chapter 19. Familiarity with basic neural network concepts (autoencoders, end-to-end training) is helpful but not required.

  • Rate-distortion theory (Review ita/ch06)

    Self-check: Can you state the rate-distortion function and explain the achievability proof? (Restated below.)

  • Channel coding theorem (Review ita/ch09)

    Self-check: Can you state the channel coding theorem for a DMC? (Restated below.)

  • Joint source-channel coding (separation theorem) (Review ita/ch19)

    Self-check: Can you state Shannon's separation theorem and explain when it is optimal? (Restated below.)

  • Information bottleneck (Review ita/ch28)

    Self-check: Can you write the IB Lagrangian and explain the relevance-compression tradeoff? (Restated below.)
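
For quick reference, the standard statements of these results follow (notation matches the table below up to naming). The rate-distortion function for a source $S$ with reproduction $\hat{S}$ and distortion measure $d$ is

$$R(D) = \min_{p(\hat{s} \mid s)\,:\, \mathbb{E}[d(S, \hat{S})] \le D} I(S; \hat{S}),$$

the minimum rate at which the source can be described so that a decoder can reconstruct it within expected distortion $D$.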
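
The channel coding theorem for a discrete memoryless channel $p(y \mid x)$ states that the capacity

$$C = \max_{p(x)} I(X; Y)$$

is the supremum of achievable rates: every rate below $C$ admits codes with vanishing error probability, and no rate above $C$ does.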
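
Shannon's separation theorem: a source with rate-distortion function $R(D)$ can be transmitted over a channel of capacity $C$ with distortion $D$ if $R(D) < C$, and cannot if $R(D) > C$, so separate source and channel coding is asymptotically optimal in the point-to-point, stationary ergodic setting. (Separation can fail outside this setting, e.g., in multi-terminal networks or at finite blocklength.)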
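
The information bottleneck compresses $X$ into a representation $T$ that stays informative about a relevance variable $Y$, by minimizing over encoders $p(t \mid x)$ the Lagrangian

$$\mathcal{L}_{\text{IB}} = I(X; T) - \beta\, I(T; Y),$$

where $\beta \ge 0$ trades compression ($I(X; T)$ small) against relevance ($I(T; Y)$ large). (Sign and normalization conventions vary across texts; the form in ch28 may differ by an equivalent rearrangement.)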

Notation for This Chapter

We introduce notation for semantic communication, extending the standard information-theoretic framework.

| Symbol | Meaning | Introduced |
|--------|---------|------------|
| $R$ | Rate-distortion function | ch06 |
| $C$ | Channel capacity | ch09 |
| $U(\hat{S}, G)$ | Utility function (task performance metric) | s01 |
| $R_U(u)$ | Rate-utility function: minimum rate to achieve utility level $u$ | s01 |
| $d_{\text{sem}}$ | Semantic distortion measure | s03 |
| $I$ | Mutual information | ch01 |
| $H$ | Shannon entropy | ch01 |
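
The rate-utility function above is described only in words; as a minimal sketch, assuming it mirrors the rate-distortion template with the constraint direction flipped (utility is to be kept high rather than distortion low), one natural formalization is

$$R_U(u) = \min_{p(\hat{s} \mid s)\,:\, \mathbb{E}[U(\hat{S}, G)] \ge u} I(S; \hat{S}),$$

the minimum rate needed so that the reconstruction $\hat{S}$ achieves expected task utility at least $u$ for goal $G$. The exact definition in s01 may differ, e.g., in how the goal $G$ enters the minimization.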