Overview of Practical Channel Codes

From Existence to Construction

Shannon proved in 1948 that capacity-achieving codes exist, but his random coding argument gave no hint of how to build them. The search for practical capacity-approaching codes became the central challenge of coding theory for the next 50 years. Three families eventually emerged, each with a different key idea:

  • Turbo codes (1993): two simple convolutional codes connected by an interleaver, decoded iteratively.
  • LDPC codes (1960/1996): sparse parity-check matrices enabling efficient message-passing decoding.
  • Polar codes (2009): channel polarization transforms N copies of a mediocre channel into nearly-perfect and nearly-useless channels.
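The polarization idea in the last bullet can be made concrete on the binary erasure channel, where each synthetic channel is again a BEC. A minimal sketch (the BEC recursion is standard; the code itself is our illustration):

```python
# A minimal sketch of channel polarization, assuming the binary
# erasure channel BEC(eps), where each synthetic channel is again a
# BEC. One step turns two copies of BEC(eps) into a degraded channel
# BEC(2*eps - eps**2) and an upgraded channel BEC(eps**2).

def polarize(erasure_probs):
    """Apply one polarization step to every channel in the list."""
    out = []
    for eps in erasure_probs:
        out.append(2 * eps - eps * eps)  # "minus" channel: worse
        out.append(eps * eps)            # "plus" channel: better
    return out

channels = [0.5]          # one copy of BEC(0.5), capacity 0.5
for _ in range(3):        # three steps -> N = 8 synthetic channels
    channels = polarize(channels)

# Total capacity is preserved (the erasure probabilities still average
# 0.5), but the channels split toward nearly-perfect and nearly-useless.
print(sorted(round(e, 3) for e in channels))
```

A real polar code construction ranks the synthetic channels by reliability and assigns the information bits to the best ones, freezing the rest.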

This section provides an information-theoretic perspective on each family; the construction details and decoding algorithms are covered in Book telecom, Ch. 12.

Definition: Turbo Codes

A turbo code consists of two (or more) constituent convolutional encoders connected in parallel through a random interleaver. The information bits are encoded by both encoders: the first sees them in their original order, the second sees them after interleaving.

The key innovation is iterative decoding: each constituent decoder produces soft information (log-likelihood ratios) about the information bits, which is passed as a priori information to the other decoder. The decoders iterate, with each iteration refining the estimates, until convergence.

Turbo code

A parallel concatenated convolutional code with an interleaver, decoded iteratively. The first family of codes to approach within 0.5 dB of the Shannon limit for moderate block lengths.

Related: LDPC code, Polar code

Information-Theoretic View of Turbo Codes

From an information-theoretic perspective, the turbo code succeeds because the interleaver ensures that the two constituent codes see "nearly independent" versions of the information. Iterative decoding approximates optimal (MAP) decoding: at each iteration, the extrinsic information from one decoder plays the role of the prior for the other. As the block length grows, the interleaver makes the constituent codes "more independent," and the iterative decoder approaches MAP performance.

The density evolution analysis (introduced for LDPC codes but applicable here) tracks the distribution of log-likelihood ratios through iterations, predicting the SNR threshold below which the decoder fails to converge.

Historical Note: The Turbo Revolution

When Berrou, Glavieux, and Thitimajshima presented turbo codes at the 1993 ICC conference in Geneva, claiming performance within 0.5 dB of the Shannon limit, the audience was deeply skeptical. Many suspected an error in the simulation. It took several months of independent verification before the community accepted the result.

The impact was enormous: turbo codes were adopted in 3G (UMTS/CDMA2000), deep-space communications (CCSDS), and DVB satellite standards. They also catalyzed the rediscovery of LDPC codes and eventually the invention of polar codes, all sharing the common thread of iterative processing and soft information exchange.

Definition: Low-Density Parity-Check (LDPC) Codes

An LDPC code is a linear block code defined by a sparse parity-check matrix H, meaning that each row and column of H has a small number of ones relative to the block length.

A (d_v, d_c)-regular LDPC code has exactly d_v ones per column (variable node degree) and d_c ones per row (check node degree). The code rate is approximately 1 - d_v/d_c for large block lengths.

Decoding is by belief propagation (message passing) on the Tanner graph: variable nodes and check nodes exchange soft messages iteratively until convergence.
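On the binary erasure channel, belief propagation takes its simplest form, "peeling": any check with exactly one erased bit determines that bit by parity. A minimal sketch, using a small illustrative (non-standard) parity-check matrix:

```python
# A minimal sketch of message passing on a Tanner graph, specialized
# to the binary erasure channel, where belief propagation reduces to
# "peeling": any check with exactly one erased bit recovers it by
# parity. The small parity-check matrix H below is illustrative only.

H = [
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 1, 0, 0, 1],
]

def peel(bits):
    """bits: list of 0/1 values with None marking erasures."""
    bits = list(bits)
    progress = True
    while progress:
        progress = False
        for row in H:
            erased = [j for j, h in enumerate(row) if h and bits[j] is None]
            if len(erased) == 1:                 # one unknown in this check
                j = erased[0]
                known = sum(bits[k] for k, h in enumerate(row)
                            if h and k != j)
                bits[j] = known % 2              # parity fills the gap
                progress = True
    return bits

# Transmit the valid codeword 110011 and erase two bits:
print(peel([None, 1, None, 0, 1, 1]))   # recovers [1, 1, 0, 0, 1, 1]
```

Real LDPC decoders run the same iterative schedule with soft log-likelihood-ratio messages instead of hard erasures.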

LDPC code

Low-Density Parity-Check code: a linear code with a sparse parity-check matrix, decoded by belief propagation. Invented by Gallager (1962), rediscovered by MacKay and Neal (1996). Capacity-approaching for many channel models.

Related: Turbo code, Polar code

Theorem: LDPC Codes Achieve Capacity

For the binary-input AWGN channel, there exist sequences of irregular LDPC codes decoded by belief propagation whose thresholds approach the Shannon limit as the maximum variable node degree grows. Specifically, for any ε > 0, there exists an LDPC code ensemble with belief propagation threshold within ε of the Shannon limit.

The key tool is density evolution: tracking the probability density of the messages passed in belief propagation as a function of iteration number and SNR. The threshold is the smallest SNR above which the message error probability is driven to zero as the iterations proceed. By optimizing the degree distribution (the fraction of variable nodes with each degree), the threshold can be pushed arbitrarily close to capacity.
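On the binary erasure channel the message "density" collapses to a single erasure probability, so density evolution for a (d_v, d_c)-regular ensemble is a one-line recursion. A sketch that estimates the BP threshold numerically (the bisection scaffolding is our own illustration):

```python
# A sketch of density evolution for a (d_v, d_c)-regular LDPC ensemble
# on the binary erasure channel: the message density is a single
# erasure probability with a one-line update. The bisection search
# for the threshold is illustrative scaffolding, not a standard API.

def converges(eps, dv=3, dc=6, iters=2000):
    """True if belief propagation drives the erasure rate to ~zero."""
    x = eps
    for _ in range(iters):
        x = eps * (1 - (1 - x) ** (dc - 1)) ** (dv - 1)
    return x < 1e-9

def bp_threshold(dv=3, dc=6):
    """Largest channel erasure rate the decoder tolerates (bisection)."""
    lo, hi = 0.0, 1.0
    for _ in range(40):
        mid = (lo + hi) / 2
        if converges(mid, dv, dc):
            lo = mid
        else:
            hi = mid
    return lo

# The (3,6)-regular ensemble has rate 1/2; the BEC Shannon limit is
# eps = 0.5, and the BP threshold is known to be about 0.4294.
print(round(bp_threshold(), 4))
```

For general channels the same idea tracks full log-likelihood-ratio densities rather than a single scalar, but the threshold concept is identical.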

Comparison of Capacity-Approaching Code Families

| Property | Turbo Codes | LDPC Codes | Polar Codes |
| --- | --- | --- | --- |
| Year introduced | 1993 | 1962 (rediscovered 1996) | 2009 |
| Key idea | Parallel concatenation + interleaver | Sparse parity-check + belief propagation | Channel polarization + successive cancellation |
| Capacity achieving? | Conjectured (no proof) | Yes (irregular, BP) | Yes (provably, SC) |
| Decoding | Iterative (BCJR) | Iterative (BP) | Successive cancellation (SC/SCL) |
| Decoding complexity | O(Kn) per iteration | O(n d_v) per iteration | O(n log n) for SC |
| Error floor | Present (low-weight codewords) | Present (trapping sets) | No error floor |
| Standards | 3G, LTE, CCSDS | 5G NR (data), Wi-Fi, DVB | 5G NR (control) |
| Gap to capacity (practical) | ~0.3-0.5 dB | ~0.1-0.3 dB | ~0.2-0.5 dB (SCL) |
⚠️ Engineering Note

Channel Coding in 5G NR

5G NR uses two different coding families:

  • LDPC codes for data channels (PDSCH/PUSCH): two base graphs optimized for different rate ranges (BG1 for rates ≥ 1/4, BG2 for rates < 1/4). Block lengths up to 8448 bits.
  • Polar codes for control channels (PDCCH/PUCCH): block lengths up to 1024 bits with CRC-aided successive cancellation list (CA-SCL) decoding.

The choice reflects the different requirements: data channels need high throughput (favoring the parallelizable structure of LDPC), while control channels need reliability at short block lengths (where polar codes with list decoding excel).

Practical Constraints
  • LDPC base graph 1: max info bits 8448, rates 1/3 to 8/9
  • LDPC base graph 2: max info bits 3840, rates 1/5 to 2/3
  • Polar codes: max block length 1024, CA-SCL with list size 8

📋 Ref: 3GPP TS 38.212

BER Performance of Turbo, LDPC, and Polar Codes

Compare the bit error rate (BER) of turbo, LDPC, and polar codes as a function of E_b/N_0 for a fixed code rate and block length. Observe how close each comes to the Shannon limit.

Parameters: code rate 0.5, block length 1024.
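As a reference point for such a comparison, the Shannon limit in E_b/N_0 can be computed directly from the capacity of the real AWGN channel. A sketch assuming unconstrained (Gaussian) inputs; the binary-input limit is slightly higher, especially at high rates:

```python
# A sketch of the Shannon limit on Eb/N0 for a rate-R code, assuming
# the real-valued AWGN channel with unconstrained (Gaussian) inputs:
# from C = 0.5 * log2(1 + 2*R*Eb/N0) >= R, solve for the minimum Eb/N0.
import math

def shannon_limit_db(R):
    """Minimum Eb/N0 in dB for reliable communication at code rate R."""
    ebn0 = (2 ** (2 * R) - 1) / (2 * R)
    return 10 * math.log10(ebn0)

for R in (1/5, 1/3, 1/2, 8/9):
    print(f"rate {R:.3f}: Shannon limit = {shannon_limit_db(R):+.2f} dB")

# Rate 1/2 gives exactly 0 dB, so a rate-1/2 code "0.5 dB from the
# Shannon limit" operates near Eb/N0 = 0.5 dB.
```

This is the vertical reference line usually drawn on rate-1/2 BER plots; the gap between it and each waterfall curve is the "gap to capacity" quoted in the table above.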

Common Mistake: 'Capacity Achieving' Has Caveats

Mistake:

Assuming that a "capacity-achieving" code family performs near capacity at any block length and with practical decoding.

Correction:

"Capacity-achieving" means that performance approaches capacity as block length n → ∞ with optimal (or specified) decoding. At finite block lengths, there is always a gap. Turbo codes are only conjectured to achieve capacity. LDPC codes achieve capacity under belief propagation (suboptimal) decoding. Polar codes provably achieve capacity under successive cancellation but need list decoding (CA-SCL) for competitive finite-length performance.

Quick Check

Which coding family is used for data channels (PDSCH) in 5G NR?

Turbo codes

LDPC codes

Polar codes

Convolutional codes