Prerequisites & Notation
Before You Begin
This chapter treats equalization as an inference problem. Readers should be comfortable with the material below; if any item feels unsteady, revisit the linked chapter before proceeding.
- Maximum-likelihood estimation and detection under Gaussian noise (Review ch02)
Self-check: Can you derive the log-likelihood of a Gaussian observation and reduce ML detection to a quadratic form?
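If the reduction to a quadratic form feels unfamiliar, it can be sketched in a few lines of Python. The 4-PAM alphabet and the observation value below are illustrative assumptions, not symbols from this chapter:

```python
import numpy as np

# Hypothetical 4-PAM alphabet (an illustrative assumption).
alphabet = np.array([-3.0, -1.0, 1.0, 3.0])

def ml_detect(y, alphabet):
    """For y = x + w with Gaussian w, log p(y|x) = -(y - x)^2 / (2 sigma^2) + const,
    so maximizing the likelihood over the alphabet is minimizing squared error."""
    return alphabet[np.argmin((y - alphabet) ** 2)]

symbol = ml_detect(1.2, alphabet)  # picks the nearest symbol, 1.0
```

The constant and the factor $1/(2\sigma^2)$ drop out of the argmax, which is exactly why ML detection in AWGN is a minimum-distance rule.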
- Wiener filtering and the orthogonality principle (Review ch09)
Self-check: Can you state the Wiener–Hopf equations and solve them in the frequency domain?
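A frequency-domain sanity check can be sketched as follows. The 3-tap channel and noise level are made-up illustrative values, and a unit-power white input is assumed so the input PSD is 1:

```python
import numpy as np

# Illustrative 3-tap channel and noise PSD (assumed values).
h = np.array([1.0, 0.5, 0.2])
N0 = 0.1
Nfft = 64

H = np.fft.fft(h, Nfft)  # channel response on a uniform frequency grid
# Frequency-domain Wiener-Hopf solution for a unit-power white input:
#   W(e^{jw}) = H*(e^{jw}) / (|H(e^{jw})|^2 + N0)
W = np.conj(H) / (np.abs(H) ** 2 + N0)
```

Each frequency bin is solved independently, which is why the frequency-domain form is so much easier than inverting the time-domain autocorrelation matrix.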
- Discrete-time linear systems: convolution, $z$-transform, DTFT
Self-check: Can you compute $H(e^{j\omega})$ given a causal FIR with taps $h_0, \dots, h_L$?
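The self-check amounts to evaluating a finite sum; a minimal sketch, with hypothetical tap values:

```python
import numpy as np

def dtft(h, w):
    """Evaluate H(e^{jw}) = sum_k h[k] e^{-jwk} for a causal FIR h at frequencies w."""
    k = np.arange(len(h))
    return np.exp(-1j * np.outer(w, k)) @ h

h = np.array([1.0, -0.4, 0.1])   # hypothetical taps h_0, h_1, h_2
w = np.linspace(0.0, np.pi, 5)
H = dtft(h, w)
```

At $\omega = 0$ the DTFT is simply the sum of the taps, and at $\omega = \pi$ it is the alternating sum, which makes a quick correctness check.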
- Jointly Gaussian random vectors and conditional expectation (Review ch03)
Self-check: Can you write $\mathbb{E}[x \mid y]$ in closed form when $x$ and $y$ are jointly Gaussian?
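For jointly Gaussian scalars the closed form is linear in the observation. A minimal sketch, with an illustrative (assumed) joint covariance:

```python
import numpy as np

# Illustrative joint second-order statistics for scalar (x, y):
# covariance matrix [[Sxx, Sxy], [Sxy, Syy]].
mu_x, mu_y = 0.0, 0.0
Sxx, Sxy, Syy = 2.0, 0.8, 1.0

def cond_mean(y):
    """E[x | y] = mu_x + (Sxy / Syy) * (y - mu_y) for jointly Gaussian scalars."""
    return mu_x + (Sxy / Syy) * (y - mu_y)

estimate = cond_mean(1.0)  # -> 0.8 for these illustrative statistics
```

The vector case replaces the ratio with $\Sigma_{xy}\Sigma_{yy}^{-1}$; this conditional mean is exactly the MMSE estimator that the MMSE equalizer of this chapter builds on.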
- Dynamic programming on a graph / shortest-path thinking
Self-check: Do you recognize Bellman's principle of optimality as the engine behind Viterbi?
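Bellman's principle on a toy two-state trellis can be sketched in plain Python. The edge costs are made up for illustration; in Viterbi decoding they would be branch metrics:

```python
# cost[t][i][j]: edge cost from state i at stage t to state j at stage t+1
# (illustrative numbers).
cost = [
    [[1.0, 4.0], [2.0, 0.5]],
    [[0.2, 3.0], [1.0, 1.0]],
]

def shortest_path(cost, n_states=2):
    d = [0.0] * n_states  # best cumulative metric ending in each state
    for stage in cost:
        # Bellman: best path to j = min over predecessors i of (best-to-i + edge)
        d = [min(d[i] + stage[i][j] for i in range(n_states))
             for j in range(n_states)]
    return min(d)
```

The per-stage `min` over predecessors is the whole trick: the best path into a state extends the best path into some predecessor, so the search is linear in the trellis length rather than exponential.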
Notation for This Chapter
Symbols introduced or used heavily in this chapter. The channel impulse response is denoted $h_0, \dots, h_L$ and the channel memory is $L$ (one fewer than the number of taps). We write $M$ for the size of the modulation alphabet.
| Symbol | Meaning | Introduced |
|---|---|---|
| $h_k$, $L$ | Discrete-time channel impulse response taps and channel memory (so $L+1$ taps total) | s01 |
| $\mathbf{h}$ | Column vector of channel taps | s01 |
| $H(z)$ or $H(e^{j\omega})$ | $z$-transform or DTFT of the channel impulse response | s01 |
| $x[n]$, $y[n]$, $w[n]$ | Transmitted symbols, received samples, and AWGN samples | s01 |
| $\mathcal{A}$, $M$ | Modulation alphabet and its size | s01 |
| $\hat{\mathbf{x}}_{\mathrm{ML}}$ | Maximum-likelihood sequence estimate | s01 |
| $s_n$ | Trellis state at time $n$; $s_n = (x[n-1], \dots, x[n-L])$ | s02 |
| $\Gamma_n(s)$ | Viterbi path metric: minimum cumulative squared error reaching state $s$ at time $n$ | s02 |
| $W_{\mathrm{ZF}}(e^{j\omega})$, $W_{\mathrm{MMSE}}(e^{j\omega})$ | Frequency responses of the ZF and MMSE linear equalizers | s03 |
| $N_0$, $\mathrm{SNR}$ | Noise power spectral density and input signal-to-noise ratio | s03 |
| $f_k$, $b_k$ | Feedforward and feedback filter taps of the DFE | s04 |
| $\Delta$ | Decision delay of the equalizer (in symbols) | s04 |