Prerequisites & Notation

Before You Begin

This chapter depends on the signal-space geometry and error-probability machinery of Chapter 1 plus two staples from classical coding theory (convolutional codes and the Viterbi algorithm). If any item feels unfamiliar, revisit the linked material before proceeding.

  • Chapter 1: Signal-space view, coding gain, pairwise error probability (Review ch01)

    Self-check: Can you write the pairwise error probability for an AWGN constellation in terms of the Euclidean distance between two points?
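    For a quick self-test of the formula, here is a minimal numeric sketch (assuming the usual convention that the one-sided noise PSD is $N_0$, i.e. variance $N_0/2$ per dimension, so the PEP is $Q\!\left(d/\sqrt{2N_0}\right)$; the helper names are our own):

    ```python
    from math import erfc, sqrt

    def Q(x: float) -> float:
        """Gaussian tail function Q(x) = P(N(0,1) > x)."""
        return 0.5 * erfc(x / sqrt(2.0))

    def pairwise_error_prob(d: float, N0: float) -> float:
        """PEP between two signal points at Euclidean distance d in AWGN
        with one-sided noise PSD N0 (variance N0/2 per dimension):
        P(x -> x') = Q(d / sqrt(2 * N0))."""
        return Q(d / sqrt(2.0 * N0))

    # Sanity check: BPSK with Es = 1 has d = 2, giving Q(sqrt(2 Es / N0)).
    print(pairwise_error_prob(2.0, 0.5))  # Q(2), approximately 0.0228
    ```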

  • Convolutional codes: shift-register structure, generator polynomials, trellis representation

    Self-check: Given a rate-1/2 convolutional code with $K=3$ and $(g_0, g_1) = (7, 5)_{\rm oct}$, can you draw the encoder, list the states, and compute the free Hamming distance?
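    A short Dijkstra-style search over the trellis can confirm the free Hamming distance; this is a sketch, with our own state and shift-register conventions:

    ```python
    from heapq import heappush, heappop

    G0, G1 = 0o7, 0o5  # generator polynomials (7, 5) octal; K = 3, nu = 2

    def step(state, bit):
        """One trellis transition. state = (newest past bit, older past bit);
        returns (next_state, Hamming weight of the two output bits)."""
        reg = (bit << 2) | state                    # register contents: b, s1, s0
        w = bin(reg & G0).count("1") % 2 + bin(reg & G1).count("1") % 2
        return (bit << 1) | (state >> 1), w

    def free_distance():
        """Minimum output weight over paths that diverge from the all-zero
        state and later remerge with it (shortest path on the trellis)."""
        ns, w = step(0, 1)                          # the diverging branch
        pq, best = [(w, ns)], {}
        while pq:
            w, s = heappop(pq)
            if s == 0:
                return w                            # remerged with the zero path
            if best.get(s, float("inf")) <= w:
                continue
            best[s] = w
            for b in (0, 1):
                ns, dw = step(s, b)
                heappush(pq, (w + dw, ns))

    print(free_distance())  # -> 5
    ```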

  • Viterbi algorithm: forward path metrics, survivor paths, traceback

    Self-check: Can you run Viterbi by hand on a trellis with 4 states for 4 time steps, tracking all survivor paths?
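    To check a by-hand run, the following sketch decodes four noiseless steps on the 4-state trellis of the $(7,5)_{\rm oct}$ code from the previous self-check (hard-decision Hamming branch metrics; the function name and conventions are our own):

    ```python
    def viterbi(received_pairs):
        """Hard-decision Viterbi decoding for the rate-1/2 (7,5) code.
        received_pairs: list of (bit, bit) tuples; returns decoded input bits."""
        INF = float("inf")
        metrics = [0] + [INF] * 3           # path metrics; start in state 0
        paths = [[] for _ in range(4)]      # survivor input sequences
        for r in received_pairs:
            new_m, new_p = [INF] * 4, [None] * 4
            for s in range(4):
                if metrics[s] == INF:
                    continue
                for b in (0, 1):
                    reg = (b << 2) | s      # register contents: b, s1, s0
                    out = (bin(reg & 0o7).count("1") % 2,
                           bin(reg & 0o5).count("1") % 2)
                    ns = (b << 1) | (s >> 1)
                    m = metrics[s] + (out[0] != r[0]) + (out[1] != r[1])
                    if m < new_m[ns]:       # keep the better survivor
                        new_m[ns], new_p[ns] = m, paths[s] + [b]
            metrics, paths = new_m, new_p
        best = min(range(4), key=lambda s: metrics[s])
        return paths[best]

    # Noiseless encoding of input bits 1,0,1,1 decodes back exactly.
    rx = [(1, 1), (1, 0), (0, 0), (0, 1)]
    print(viterbi(rx))  # -> [1, 0, 1, 1]
    ```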

  • Standard constellations: BPSK, QPSK, 8-PSK, 16-QAM (coordinates and minimum distance)

    Self-check: Can you compute $d_{\min}^2$ for 8-PSK (uniform on the unit circle) and for 16-QAM (square, unit average energy)?
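    A brute-force check of both answers ($2 - \sqrt{2} \approx 0.586$ for 8-PSK, $0.4$ for energy-normalized 16-QAM); the helper names are our own:

    ```python
    from itertools import combinations
    from math import cos, sin, pi

    def dmin_sq(points):
        """Minimum squared Euclidean distance of a constellation."""
        return min(abs(a - b) ** 2 for a, b in combinations(points, 2))

    # 8-PSK on the unit circle: dmin^2 = 2 - 2 cos(pi/4) = 2 - sqrt(2)
    psk8 = [complex(cos(2 * pi * k / 8), sin(2 * pi * k / 8)) for k in range(8)]

    # 16-QAM on {-3,-1,1,3}^2, scaled to unit average energy (Es = 10 before
    # scaling), so dmin^2 = (2 / sqrt(10))^2 = 0.4
    raw = [complex(i, q) for i in (-3, -1, 1, 3) for q in (-3, -1, 1, 3)]
    es = sum(abs(p) ** 2 for p in raw) / len(raw)   # = 10
    qam16 = [p / es ** 0.5 for p in raw]

    print(dmin_sq(psk8), dmin_sq(qam16))  # approximately 0.586 and 0.4
    ```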

  • Q-function asymptotics and the AWGN error exponent

    Self-check: At high SNR, does a coding scheme with $d_{\rm free}^2 = 3\, d_{\rm uncoded}^2$ gain $10\log_{10} 3 \approx 4.77$ dB over the uncoded baseline?
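    Asymptotically, yes: the gain in dB is $10\log_{10}(d_{\rm free}^2/d_{\rm uncoded}^2)$, although at finite SNR the error coefficient (path multiplicity) reduces the realized gain. The arithmetic as a one-liner (the function name is our own):

    ```python
    from math import log10

    def asymptotic_coding_gain_db(d_free_sq, d_uncoded_sq):
        """Asymptotic coding gain gamma_c in dB: 10 log10(d_free^2 / d_uncoded^2)."""
        return 10 * log10(d_free_sq / d_uncoded_sq)

    print(round(asymptotic_coding_gain_db(3.0, 1.0), 2))  # -> 4.77
    ```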

Notation for This Chapter

Symbols introduced in this chapter. See also the master Global Notation Table in the front matter.

| Symbol | Meaning | Introduced |
| --- | --- | --- |
| $\mathcal{X}$ | Signal constellation (finite subset of $\mathbb{C}$ or $\mathbb{R}^2$) | s01 |
| $M$ | Constellation size, $M = \lvert\mathcal{X}\rvert$ | s01 |
| $\Delta_i^2$ | Minimum squared Euclidean distance within any subset at partition level $i$ (intra-subset MSED) | s01 |
| $D_i$ | A subset (coset) at partition level $i$ | s01 |
| $s$ | Trellis state | s03 |
| $N_s$ | Number of trellis states, $N_s = 2^\nu$ for convolutional memory $\nu$ | s03 |
| $\nu$ | Convolutional-encoder memory (number of delay cells) | s03 |
| $K$ | Constraint length of a convolutional code, $K = \nu + 1$ | s03 |
| $g_0, g_1$ | Generator polynomials of a rate-1/2 convolutional code (octal notation) | s03 |
| $d_H$ | Hamming distance between two binary sequences | s03 |
| $d_{\rm free}$ | Free Euclidean distance of a TCM scheme (minimum Euclidean distance between any two distinct code paths through the trellis) | s03 |
| $d_{\rm free}^2$ | Squared free Euclidean distance | s03 |
| $d_{\rm uncoded}$ | Minimum distance of the uncoded constellation at the same spectral efficiency (baseline) | s05 |
| $\gamma_c$ | Asymptotic coding gain, $\gamma_c = d_{\rm free}^2 / d_{\rm uncoded}^2$ | s05 |
| $m$ | Number of information bits per modulation symbol | s02 |
| $\tilde{m}$ | Number of coded bits per symbol used to select a subset, $\tilde{m} \le m$ | s02 |
| $P_e$ | Error probability (symbol or bit, context-dependent) | s05 |
| $E_s$ | Average symbol energy | s01 |
| $N_0$ | One-sided noise power spectral density | s05 |