Prerequisites

Before You Begin

This chapter builds on linear algebra (Chapter 1), probability and random processes (Chapter 2), digital modulation (Chapter 8), detection and estimation theory (Chapter 9), and information theory (Chapter 11). Generator and parity-check matrices require matrix algebra from Chapter 1. Performance analysis uses the Q-function and error probability framework from Chapter 9. Coding bounds and capacity-approaching arguments rely on the channel capacity concepts from Chapter 11.

  • Matrix operations over finite fields (binary arithmetic) (review Chapter 1)

    Self-check: Can you perform matrix-vector multiplication over $\mathrm{GF}(2)$ (i.e., with modulo-2 addition and multiplication)? A GF(2) encoding sketch follows this list.

  • Probability, Bayes' rule, and random variables (review Chapter 2)

    Self-check: Can you compute posterior probabilities using Bayes' rule and work with log-likelihood ratios? An LLR sketch follows this list.

  • Digital modulation and BER in AWGN (review Chapter 8)

    Self-check: Can you compute the uncoded BER for BPSK as $P_b = Q(\sqrt{2E_b/N_0})$? A BER sketch follows this list.

  • ML and MAP detection (review Chapter 9)

    Self-check: Can you derive the ML decision rule for binary hypothesis testing and compute the resulting error probability? A detection sketch follows this list.

  • Channel capacity and the Shannon limit (review Chapter 11)

    Self-check: Can you state the channel capacity of the binary-input AWGN channel and explain what it means for a code to be "capacity-approaching"? A capacity sketch follows this list.
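To make the GF(2) self-check concrete, here is a minimal NumPy sketch of encoding over $\mathrm{GF}(2)$. The generator matrix (a systematic form of the (7, 4) Hamming code) and the variable names are illustrative choices, not taken from the chapter.

```python
import numpy as np

# Illustrative generator matrix: systematic (7, 4) Hamming code.
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

u = np.array([1, 0, 1, 1])   # information bits
c = (u @ G) % 2              # GF(2): ordinary multiply, then reduce mod 2
print(c)                     # prints [1 0 1 1 0 1 0]
```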
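For the Bayes/LLR self-check, one standard identity is that a BPSK observation $y = x + n$ in Gaussian noise with variance $\sigma^2$ has channel LLR $2y/\sigma^2$; the sketch below combines this with a prior term via Bayes' rule. The bit mapping ($u=0 \mapsto x=+1$) and the function name are assumptions for illustration; sign conventions vary between texts.

```python
import numpy as np

def posterior_llr(y, sigma2, p0=0.5):
    """L(u|y) = log[P(u=0|y) / P(u=1|y)] via Bayes' rule."""
    prior = np.log(p0 / (1 - p0))   # a priori LLR
    channel = 2.0 * y / sigma2      # channel LLR for y = x + n, n ~ N(0, sigma2)
    return prior + channel

print(posterior_llr(y=0.8, sigma2=0.5))   # positive -> u = 0 more likely
```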
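The uncoded BPSK BER from the self-check can be evaluated directly. This sketch assumes SciPy is available and uses the identity $Q(x) = \tfrac{1}{2}\,\mathrm{erfc}(x/\sqrt{2})$; the $E_b/N_0$ grid is an arbitrary choice.

```python
import numpy as np
from scipy.special import erfc

def q_func(x):
    """Gaussian Q-function via the complementary error function."""
    return 0.5 * erfc(x / np.sqrt(2))

ebn0_db = np.arange(0, 11, 2)
ebn0 = 10 ** (ebn0_db / 10)
pb = q_func(np.sqrt(2 * ebn0))   # uncoded BPSK in AWGN
for d, p in zip(ebn0_db, pb):
    print(f"Eb/N0 = {d:2d} dB  ->  Pb = {p:.3e}")
```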
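For the ML detection self-check: with equiprobable antipodal signals $\pm A$ in AWGN, the ML rule reduces to a sign test on $y$, with error probability $Q(A/\sigma)$. The sketch below checks this by Monte Carlo; the amplitude, noise level, and sample count are arbitrary illustration values.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
A, sigma, n = 1.0, 0.7, 200_000
bits = rng.integers(0, 2, n)
x = A * (1 - 2 * bits)                    # map bit 0 -> +A, bit 1 -> -A
y = x + sigma * rng.standard_normal(n)
decisions = (y < 0).astype(int)           # ML rule for equal priors: sign of y
print("simulated Pe:", np.mean(decisions != bits))
print("theory Q(A/sigma):", norm.sf(A / sigma))
```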
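For the capacity self-check, the binary-input AWGN capacity has no closed form, but for a symmetric channel with uniform inputs it can be estimated by Monte Carlo as $C = 1 - \mathbb{E}[\log_2(1 + e^{-L})]$ with $L = 2y/\sigma^2$, conditioned on $x = +1$ being sent. The function name and sample size below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def bi_awgn_capacity(es_n0_db, num=500_000):
    """Monte Carlo estimate of BI-AWGN capacity in bits/channel use."""
    sigma2 = 1.0 / (2 * 10 ** (es_n0_db / 10))            # Es = 1, sigma^2 = N0/2
    y = 1.0 + np.sqrt(sigma2) * rng.standard_normal(num)  # condition on x = +1
    L = 2 * y / sigma2                                    # channel LLR
    # log2(1 + exp(-L)) computed stably via logaddexp
    return 1.0 - np.mean(np.logaddexp(0.0, -L)) / np.log(2)

for snr in (-2.8, 0.0, 4.0):
    print(f"Es/N0 = {snr:+.1f} dB -> C ~ {bi_awgn_capacity(snr):.3f} bit/use")
```

A code of rate $R_c$ is capacity-approaching when it operates reliably near the SNR where $C = R_c$; for rate-1/2 BPSK this threshold is roughly $E_b/N_0 \approx 0.19$ dB, which the sketch reproduces at $E_s/N_0 \approx -2.8$ dB.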

Chapter 12 Notation

Key symbols introduced or heavily used in this chapter.

| Symbol | Meaning | Introduced |
| --- | --- | --- |
| $n$ | Block code length (number of coded bits) | §12.1 |
| $k$ | Number of information bits per block | §12.1 |
| $R_c = k/n$ | Code rate | §12.1 |
| $d_{\min}$ | Minimum Hamming distance of a code | §12.1 |
| $t$ | Error-correcting capability, $t = \lfloor (d_{\min}-1)/2 \rfloor$ | §12.1 |
| $\mathbf{G}$ | Generator matrix ($k \times n$) | §12.1 |
| $\mathbf{H}$ | Parity-check matrix ($(n-k) \times n$) | §12.1 |
| $K$ | Constraint length of a convolutional code | §12.2 |
| $d_{\text{free}}$ | Free distance of a convolutional code | §12.2 |
| $L(u)$ | Log-likelihood ratio for bit $u$ | §12.3 |
| $I_A, I_E$ | A priori and extrinsic mutual information (EXIT chart) | §12.3 |
| $\gamma_c$ | Coding gain (dB) | §12.1 |
| $C_{\text{BICM}}$ | BICM capacity (bits/channel use) | §12.6 |
| $N_{\text{iter}}$ | Number of decoding iterations | §12.3 |
| $D$ | Interleaving depth | §12.8 |
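As a quick illustration of the block-code notation above, the sketch below evaluates $R_c$, $t$, and the common asymptotic soft-decision coding-gain approximation $\gamma_c \approx 10\log_{10}(R_c\, d_{\min})$ for a hypothetical (7, 4) code with $d_{\min} = 3$; the numbers serve only to exercise the definitions.

```python
import math

# Hypothetical (7, 4) block code with d_min = 3 (the classic Hamming code).
n, k, d_min = 7, 4, 3
Rc = k / n                              # code rate R_c = k/n
t = (d_min - 1) // 2                    # guaranteed correctable errors per block
gamma_c = 10 * math.log10(Rc * d_min)   # asymptotic coding gain, soft decision
print(f"R_c = {Rc:.3f}, t = {t}, gamma_c ~ {gamma_c:.2f} dB")
```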