Prerequisites & Notation
Before You Begin
This chapter extends information measures to continuous random variables. We assume mastery of the discrete case from Chapter 1 and basic familiarity with continuous probability.
- Entropy, mutual information, KL divergence (Chapter 1)
Self-check: Can you state the information inequality and its consequences?
- Probability density functions, expectation, variance
Self-check: Can you compute $E[X]$ and $\mathrm{Var}(X)$ for a continuous RV $X$ with PDF $f(x)$?
- Gaussian distribution: PDF, moments, moment-generating function
Self-check: Can you write the PDF of $X \sim \mathcal{N}(\mu, \sigma^2)$ and compute its variance?
- Multivariate Gaussian: joint PDF, covariance matrix, conditional distributions
Self-check: Can you write the joint PDF of $\mathbf{X} \sim \mathcal{N}(\boldsymbol{\mu}, K)$ and state the conditional distribution formula?
- Integration by parts, change of variables
Self-check: Can you apply these techniques to evaluate Gaussian integrals such as $\int_{-\infty}^{\infty} e^{-x^2}\,dx$?
- Determinants and positive definite matrices
Self-check: Can you compute the determinant of a covariance matrix?
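If you want to verify the expectation and variance self-checks concretely, a minimal numerical sketch is below. The values $\mu = 1.5$ and $\sigma = 2$ are arbitrary example parameters, not from the text; the quadrature helper is a generic midpoint rule, not any particular library routine.

```python
import math

# Illustrative self-check: verify E[X] and Var(X) of a Gaussian N(mu, sigma^2)
# by direct numerical integration against the known answers mu and sigma^2.
# mu and sigma below are arbitrary example values.

def gaussian_pdf(x, mu, sigma):
    """PDF of N(mu, sigma^2)."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def integrate(f, a, b, n=100_000):
    """Midpoint-rule quadrature of f on [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

mu, sigma = 1.5, 2.0
lo, hi = mu - 10 * sigma, mu + 10 * sigma  # tails beyond 10 sigma are negligible

mean = integrate(lambda x: x * gaussian_pdf(x, mu, sigma), lo, hi)
var = integrate(lambda x: (x - mean) ** 2 * gaussian_pdf(x, mu, sigma), lo, hi)
print(mean, var)  # ≈ mu = 1.5 and sigma^2 = 4.0
```

The same pattern (integrate the density against a test function) is how all the continuous expectations in this chapter are computed in principle.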
Notation for This Chapter
Symbols introduced in this chapter. See Chapter 1 for the discrete information measures that carry over.
| Symbol | Meaning | Introduced |
|---|---|---|
| $f(x)$ | Probability density function (PDF) of continuous RV $X$ | s01 |
| $h(X)$ | Differential entropy: $h(X) = -\int f(x) \log f(x)\,dx$ | s01 |
| $h(X \mid Y)$ | Conditional differential entropy | s01 |
| $\mathcal{N}(\mu, \sigma^2)$ | Gaussian distribution with mean $\mu$ and variance $\sigma^2$ | s02 |
| $K$ | Covariance matrix | s03 |
| $N(X)$ | Entropy power: $N(X) = \frac{1}{2\pi e}\, e^{2h(X)}$ | s04 |
| $X^{\Delta}$ | Quantized version of $X$ with bin width $\Delta$ | s05 |
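As a quick sanity check on the notation, the sketch below (assuming entropy is measured in nats, i.e. natural logarithms) computes the differential entropy of a Gaussian numerically and compares it with the closed form $\tfrac{1}{2}\ln(2\pi e \sigma^2)$; the entropy power $N(X) = \frac{1}{2\pi e} e^{2h(X)}$ then recovers $\sigma^2$, since the Gaussian achieves its own entropy power. The value $\sigma = 3$ is an arbitrary example.

```python
import math

# Check: h(X) = -∫ f ln f dx equals ½ ln(2πe σ²) for X ~ N(0, σ²), in nats,
# and the entropy power N(X) = e^{2h(X)} / (2πe) equals σ² for a Gaussian.

def gaussian_pdf(x, mu, sigma):
    """PDF of N(mu, sigma^2)."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def differential_entropy(pdf, a, b, n=200_000):
    """h(X) = -∫ f(x) ln f(x) dx via midpoint quadrature (result in nats)."""
    step = (b - a) / n
    total = 0.0
    for i in range(n):
        fx = pdf(a + (i + 0.5) * step)
        if fx > 0:  # 0 * log 0 is taken as 0
            total -= fx * math.log(fx) * step
    return total

sigma = 3.0
h_num = differential_entropy(lambda x: gaussian_pdf(x, 0.0, sigma), -12 * sigma, 12 * sigma)
h_closed = 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)
entropy_power = math.exp(2 * h_num) / (2 * math.pi * math.e)
print(h_num, h_closed, entropy_power)  # entropy power ≈ sigma² = 9 for a Gaussian
```

If the chapter measures entropy in bits instead, replace `math.log` with `math.log2` and use $N(X) = \frac{1}{2\pi e} 2^{2h(X)}$; the entropy-power identity for the Gaussian holds either way.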