# Prerequisites & Notation
## Before You Begin
This chapter depends on the signal-space geometry and error-probability machinery of Chapter 1 plus two staples from classical coding theory (convolutional codes and the Viterbi algorithm). If any item feels unfamiliar, revisit the linked material before proceeding.
- Chapter 1 – signal-space view, coding gain, pairwise error probability (review Ch. 1)
Self-check: Can you write the pairwise error probability for an AWGN constellation in terms of the Euclidean distance between two points?
- Convolutional codes: shift-register structure, generator polynomials, trellis representation
Self-check: Given a rate-$1/2$ convolutional code with specified generator polynomials $g_1$ and $g_2$, can you draw the encoder, list the states, and compute the free Hamming distance?
- Viterbi algorithm: forward path metrics, survivor paths, traceback
Self-check: Can you run Viterbi by hand on a trellis with 4 states for 4 time steps, tracking all survivor paths?
- Standard constellations: BPSK, QPSK, 8-PSK, 16-QAM – coordinates and minimum distance
Self-check: Can you compute $d_{\min}$ for 8-PSK (uniform on the unit circle) and for 16-QAM (square, unit average energy)?
- Q-function asymptotics and the AWGN error exponent
Self-check: At high SNR, how does the error probability of a coded scheme with an asymptotic gain of $\gamma$ dB compare to the uncoded baseline?
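The constellation-geometry self-check above can be verified numerically. The sketch below (illustrative; the function name is my own) computes the minimum distance by brute force for 8-PSK on the unit circle and for 16-QAM normalized to unit average symbol energy.

```python
import itertools
import math

def min_distance(points):
    """Minimum Euclidean distance over all distinct point pairs."""
    return min(math.dist(p, q) for p, q in itertools.combinations(points, 2))

# 8-PSK: 8 points uniformly spaced on the unit circle.
psk8 = [(math.cos(2 * math.pi * k / 8), math.sin(2 * math.pi * k / 8))
        for k in range(8)]

# 16-QAM: square grid {-3,-1,1,3}^2, scaled to unit average symbol energy.
raw = [(i, q) for i in (-3, -1, 1, 3) for q in (-3, -1, 1, 3)]
avg_energy = sum(x * x + y * y for x, y in raw) / len(raw)  # = 10
scale = 1 / math.sqrt(avg_energy)
qam16 = [(x * scale, y * scale) for x, y in raw]

print(round(min_distance(psk8), 4))   # 2*sin(pi/8) ≈ 0.7654
print(round(min_distance(qam16), 4))  # 2/sqrt(10) ≈ 0.6325
```

Note how normalization matters: 8-PSK gets its energy normalization for free on the unit circle, while 16-QAM must be rescaled before the two minimum distances are comparable.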
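The Viterbi self-check can likewise be rehearsed in code. This is a minimal hard-decision sketch, assuming for illustration the classic 4-state rate-1/2 code with octal generators (7, 5) (the chapter's own generators may differ); it keeps one survivor path per state with Hamming branch metrics.

```python
# Rate-1/2, memory-2 convolutional code, generators g1 = 7 (111), g2 = 5 (101).
# State = (previous input bit, input bit before that); 4 states total.

def encode(bits):
    """Encode an input bit sequence into (c1, c2) output pairs."""
    s = (0, 0)
    out = []
    for b in bits:
        out.append((b ^ s[0] ^ s[1], b ^ s[1]))  # g1 = 1+D+D^2, g2 = 1+D^2
        s = (b, s[0])
    return out

def viterbi(received):
    """Hard-decision Viterbi: Hamming branch metrics, one survivor per state."""
    INF = float("inf")
    states = [(a, b) for a in (0, 1) for b in (0, 1)]
    metric = {st: (0 if st == (0, 0) else INF) for st in states}  # start in 00
    survivor = {st: [] for st in states}
    for r in received:
        new_metric = {st: INF for st in states}
        new_survivor = {}
        for st in states:
            if metric[st] == INF:
                continue
            for b in (0, 1):                     # extend each survivor by one branch
                out = (b ^ st[0] ^ st[1], b ^ st[1])
                nxt = (b, st[0])
                m = metric[st] + (out[0] ^ r[0]) + (out[1] ^ r[1])
                if m < new_metric[nxt]:          # keep the better path into nxt
                    new_metric[nxt] = m
                    new_survivor[nxt] = survivor[st] + [b]
        metric, survivor = new_metric, new_survivor
    best = min(states, key=lambda st: metric[st])  # traceback from best end state
    return survivor[best]

bits = [1, 0, 1, 1]
rx = encode(bits)
rx[0] = (rx[0][0], rx[0][1] ^ 1)  # flip one channel bit
print(viterbi(rx) == bits)        # True: the single error is corrected
```

Running it by hand on the same trellis, as the self-check asks, amounts to filling in the `metric` table one column per time step and keeping the smaller of the two metrics entering each state.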
## Notation for This Chapter
Symbols introduced in this chapter. See also the master Global Notation Table in the front matter.
| Symbol | Meaning | Introduced |
|---|---|---|
| $\mathcal{A}$ | Signal constellation (finite subset of $\mathbb{R}^2$ or $\mathbb{C}$) | s01 |
| $M$ | Constellation size, $M = \lvert \mathcal{A} \rvert$ | s01 |
| $\Delta_i^2$ | Minimum squared Euclidean distance within any subset at partition level $i$ (intra-subset MSED) | s01 |
| $\mathcal{A}_i$ | A subset (coset) at partition level $i$ | s01 |
| $\sigma$ | Trellis state | s03 |
| $S$ | Number of trellis states, $S = 2^{\nu}$ for convolutional memory $\nu$ | s03 |
| $\nu$ | Convolutional-encoder memory (number of delay cells) | s03 |
| $K$ | Constraint length of a convolutional code, $K = \nu + 1$ | s03 |
| $g_1, g_2$ | Generator polynomials of a rate-$1/2$ convolutional code (octal notation) | s03 |
| $d_H$ | Hamming distance between two binary sequences | s03 |
| $d_{\mathrm{free}}$ | Free Euclidean distance of a TCM scheme (minimum Euclidean distance between any two distinct code paths through the trellis) | s03 |
| $d_{\mathrm{free}}^2$ | Squared free Euclidean distance | s03 |
| $d_{\min,\mathrm{unc}}$ | Minimum distance of the uncoded constellation at the same spectral efficiency (baseline) | s05 |
| $\gamma$ | Asymptotic coding gain: $\gamma = 10 \log_{10}\!\bigl(d_{\mathrm{free}}^2 / d_{\min,\mathrm{unc}}^2\bigr)$ dB | s05 |
| $m$ | Number of information bits per modulation symbol | s02 |
| $\tilde{m}$ | Number of coded bits per symbol used to select a subset, $\tilde{m} \le m$ | s02 |
| $P_e$ | Error probability (symbol or bit, context-dependent) | s05 |
| $E_s$ | Average symbol energy | s01 |
| $N_0$ | One-sided noise power spectral density | s05 |
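As a worked instance of the asymptotic-coding-gain entry, consider (as an assumed illustration, not this chapter's specific design) the classic 4-state 8-PSK TCM scheme measured against uncoded QPSK at the same 2 bit/symbol efficiency and unit average symbol energy: the free distance is set by the parallel transitions, giving $d_{\mathrm{free}}^2 = 4$, while QPSK has $d_{\min,\mathrm{unc}}^2 = 2$.

```python
import math

# 4-state 8-PSK TCM vs. uncoded QPSK, both at unit average symbol energy.
d_free_sq = 4.0   # parallel-transition squared distance of the TCM scheme
d_unc_sq = 2.0    # QPSK minimum squared Euclidean distance
gamma_db = 10 * math.log10(d_free_sq / d_unc_sq)
print(round(gamma_db, 2))  # 3.01 dB asymptotic coding gain
```

Because both constellations are normalized to the same energy, the gain reduces to a pure distance ratio; the familiar 3 dB figure is just $10 \log_{10} 2$.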