Overview of Practical Channel Codes
From Existence to Construction
Shannon proved in 1948 that capacity-achieving codes exist, but his random coding argument gave no hint of how to build them. The search for practical capacity-approaching codes became the central challenge of coding theory for the next 50 years. Three families eventually emerged, each with a different key idea:
- Turbo codes (1993): two simple convolutional codes connected by an interleaver, decoded iteratively.
- LDPC codes (1960/1996): sparse parity-check matrices enabling efficient message-passing decoding.
- Polar codes (2009): channel polarization transforms copies of a mediocre channel into nearly-perfect and nearly-useless channels.
This section provides an information-theoretic perspective on each family; the construction details and decoding algorithms are covered in Book telecom, Ch. 12.
Definition: Turbo Codes
Turbo Codes
A turbo code consists of two (or more) constituent convolutional encoders connected in parallel through a random interleaver. The information bits are encoded by both encoders: the first sees them in their original order, the second sees them after interleaving.
The key innovation is iterative decoding: each constituent decoder produces soft information (log-likelihood ratios) about the information bits, which is passed as a priori information to the other decoder. The decoders iterate, with each iteration refining the estimates, until convergence.
Turbo code
A parallel concatenated convolutional code with an interleaver, decoded iteratively. The first family of codes to come within about 0.5 dB of the Shannon limit at moderate block lengths.
Related: LDPC code, Polar code
Information-Theoretic View of Turbo Codes
From an information-theoretic perspective, the turbo code succeeds because the interleaver ensures that the two constituent codes see "nearly independent" versions of the information. Iterative decoding approximates optimal (MAP) decoding: at each iteration, the extrinsic information from one decoder plays the role of the prior for the other. As the block length grows, the interleaver makes the constituent codes "more independent," and the iterative decoder approaches MAP performance.
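To make the extrinsic-information exchange concrete, here is a toy iterative decoder (a sketch, not the actual BCJR-based turbo decoder) for a tiny 2x2 product code whose rows and columns are single parity checks. The row and column decoders play the roles of the two constituent decoders, passing extrinsic LLRs back and forth as priors:

```python
import math

def spc_extrinsic(llrs):
    """Extrinsic LLRs for a single-parity-check (even parity) code:
    L_ext(i) = 2*atanh( prod_{j != i} tanh(L_j / 2) )."""
    out = []
    for i in range(len(llrs)):
        p = 1.0
        for j, l in enumerate(llrs):
            if j != i:
                p *= math.tanh(l / 2.0)
        p = max(min(p, 0.999999), -0.999999)   # keep atanh in its domain
        out.append(2.0 * math.atanh(p))
    return out

# All-zero codeword sent (LLR > 0 means "bit = 0"); one info bit badly received.
ch_info = [[-1.0, 4.0], [4.0, 4.0]]      # channel LLRs of the 2x2 information bits
ch_row, ch_col = [4.0, 4.0], [4.0, 4.0]  # channel LLRs of row/column parity bits

ext_row = [[0.0, 0.0], [0.0, 0.0]]       # extrinsic output of the row decoder
ext_col = [[0.0, 0.0], [0.0, 0.0]]       # extrinsic output of the column decoder

for _ in range(3):                       # "turbo" iterations
    for i in range(2):                   # row decoder, prior = column extrinsic
        e = spc_extrinsic([ch_info[i][0] + ext_col[i][0],
                           ch_info[i][1] + ext_col[i][1], ch_row[i]])
        ext_row[i] = e[:2]
    for j in range(2):                   # column decoder, prior = row extrinsic
        e = spc_extrinsic([ch_info[0][j] + ext_row[0][j],
                           ch_info[1][j] + ext_row[1][j], ch_col[j]])
        ext_col[0][j], ext_col[1][j] = e[0], e[1]

post = [[ch_info[i][j] + ext_row[i][j] + ext_col[i][j]
         for j in range(2)] for i in range(2)]
decoded = [[0 if L > 0 else 1 for L in row] for row in post]
print(decoded)   # the corrupted bit is corrected: [[0, 0], [0, 0]]
```

After a few iterations the negative channel LLR on the corrupted bit is outweighed by positive extrinsic information from both decoders. Note that each decoder's output for a bit excludes that bit's own input, which is exactly what makes the exchanged information "extrinsic" and usable as a prior.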
The density evolution analysis (introduced for LDPC codes but applicable here) tracks the distribution of log-likelihood ratios through iterations, predicting the SNR threshold below which the decoder fails to converge.
Historical Note: The Turbo Revolution
When Berrou, Glavieux, and Thitimajshima presented turbo codes at the 1993 ICC conference in Geneva, claiming performance within 0.5 dB of the Shannon limit, the audience was deeply skeptical. Many suspected an error in the simulation. It took several months of independent verification before the community accepted the result.
The impact was enormous: turbo codes were adopted in 3G (UMTS/CDMA2000), deep-space communications (CCSDS), and DVB satellite standards. They also catalyzed the rediscovery of LDPC codes and eventually the invention of polar codes, all sharing the common thread of iterative processing and soft information exchange.
Definition: Low-Density Parity-Check (LDPC) Codes
Low-Density Parity-Check (LDPC) Codes
An LDPC code is a linear block code defined by a sparse parity-check matrix H, meaning that each row and column of H has a small number of ones relative to the block length.
A (d_v, d_c)-regular LDPC code has exactly d_v ones per column (variable node degree) and d_c ones per row (check node degree). The code rate is approximately 1 - d_v/d_c for large block lengths.
Decoding is by belief propagation (message passing) on the Tanner graph: variable nodes and check nodes exchange soft messages iteratively until convergence.
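On the binary erasure channel, belief propagation reduces to a simple "peeling" procedure: repeatedly find a check node with exactly one erased neighbor and solve the parity equation for it. A minimal sketch on a hypothetical 3-check, 6-variable Tanner graph (not a code from any standard):

```python
# Each check is the list of variable-node indices it connects to.
checks = [[0, 1, 3], [1, 2, 4], [0, 2, 5]]

def peel_decode(checks, y):
    """Erasure decoding by peeling: y entries are 0, 1, or None (erased).
    Solve any check with exactly one erased neighbor until stuck."""
    y = list(y)                          # work on a copy
    progress = True
    while progress:
        progress = False
        for check in checks:
            erased = [i for i in check if y[i] is None]
            if len(erased) == 1:         # parity determines the erased bit
                y[erased[0]] = sum(y[i] for i in check if y[i] is not None) % 2
                progress = True
    return y

# [1, 1, 0, 0, 1, 1] satisfies all three parity checks; erase positions 1 and 4.
received = [1, None, 0, 0, None, 1]
print(peel_decode(checks, received))     # recovers [1, 1, 0, 0, 1, 1]
```

This is exactly message passing specialized to erasures; on general channels the variable and check nodes exchange real-valued LLR messages instead of hard values.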
LDPC code
Low-Density Parity-Check code: a linear code with a sparse parity-check matrix, decoded by belief propagation. Invented by Gallager (1962), rediscovered by MacKay and Neal (1996). Capacity-approaching for many channel models.
Related: Turbo code, Polar code
Theorem: LDPC Codes Achieve Capacity
For the binary-input AWGN channel, there exist sequences of irregular LDPC codes decoded by belief propagation whose thresholds approach the Shannon limit as the maximum variable node degree grows. Specifically, for any ε > 0, there exists an LDPC code ensemble whose belief propagation threshold is within ε of the Shannon limit.
The key tool is density evolution: tracking the probability density of the messages passed in belief propagation as a function of iteration number and SNR. The threshold is the minimum SNR at which the message densities converge to the correct codeword. By optimizing the degree distribution (the fraction of variable nodes with each degree), the threshold can be pushed arbitrarily close to capacity.
Density evolution
Under the assumption that the Tanner graph is locally tree-like (valid for large block lengths), the messages in belief propagation are independent. The density of messages at iteration ℓ is a deterministic function of the density at iteration ℓ - 1 and the channel SNR.
Threshold computation
The threshold is the infimum of SNR values for which density evolution converges to the point mass at +∞ (correct decoding). For optimized irregular degree distributions, the gap between the threshold and the Shannon limit vanishes as the maximum degree tends to infinity.
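On the binary erasure channel, density evolution has a closed form: the erasure probability of the messages evolves as x → ε(1 - (1 - x)^(d_c - 1))^(d_v - 1), and the threshold (in erasure probability ε rather than SNR) can be found by bisection. A minimal sketch for the (3,6)-regular ensemble:

```python
def de_converges(eps, dv=3, dc=6, iters=10000, tol=1e-6):
    """BEC density evolution: x -> eps * (1 - (1-x)^(dc-1))^(dv-1).
    Returns True if the message erasure fraction is driven to zero."""
    x = eps
    for _ in range(iters):
        x = eps * (1.0 - (1.0 - x) ** (dc - 1)) ** (dv - 1)
        if x < tol:
            return True
    return False

# Bisect for the BP threshold of the (3,6)-regular ensemble.
lo, hi = 0.3, 0.5        # de_converges(0.3) holds, de_converges(0.5) fails
for _ in range(40):
    mid = (lo + hi) / 2
    if de_converges(mid):
        lo = mid
    else:
        hi = mid
print(f"(3,6) BP threshold ~ {lo:.4f}")   # known value ~ 0.4294; capacity is 0.5
```

The gap between the computed threshold (about 0.429) and the channel capacity limit (0.5) is what degree-distribution optimization shrinks; for AWGN the same bisection runs over SNR with numerically tracked densities instead of a scalar recursion.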
Comparison of Capacity-Approaching Code Families
| Property | Turbo Codes | LDPC Codes | Polar Codes |
|---|---|---|---|
| Year introduced | 1993 | 1962 (rediscovered 1996) | 2009 |
| Key idea | Parallel concatenation + interleaver | Sparse parity-check + belief propagation | Channel polarization + successive cancellation |
| Capacity achieving? | Conjectured (no proof) | Yes (irregular, BP) | Yes (provably, SC) |
| Decoding | Iterative (BCJR) | Iterative (BP) | Successive cancellation (SC/SCL) |
| Decoding complexity | O(n) per iteration | O(n) per iteration | O(n log n) for SC |
| Error floor | Present (low-weight codewords) | Present (trapping sets) | No error floor |
| Standards | 3G, LTE, CCSDS | 5G NR (data), Wi-Fi, DVB | 5G NR (control) |
| Gap to capacity (practical) | ~0.3-0.5 dB | ~0.1-0.3 dB | ~0.2-0.5 dB (SCL) |
Channel Coding in 5G NR
5G NR uses two different coding families:
- LDPC codes for data channels (PDSCH/PUSCH): two base graphs optimized for different rate ranges (BG1 for rates from 1/3 to 8/9, BG2 for rates from 1/5 to 2/3). Information block lengths up to 8448 bits.
- Polar codes for control channels (PDCCH/PUCCH): block lengths up to 1024 bits with CRC-aided successive cancellation list (CA-SCL) decoding.
The choice reflects the different requirements: data channels need high throughput (favoring the parallelizable structure of LDPC), while control channels need reliability at short block lengths (where polar codes with list decoding excel).
- LDPC base graph 1: max info bits 8448, rates 1/3 to 8/9
- LDPC base graph 2: max info bits 3840, rates 1/5 to 2/3
- Polar codes: max block length 1024, CA-SCL with list size 8
BER Performance of Turbo, LDPC, and Polar Codes
Compare the bit error rate (BER) of turbo, LDPC, and polar codes as a function of E_b/N_0 for a fixed code rate and block length. Observe how close each comes to the Shannon limit.
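The reference point for such comparisons is the Shannon limit on E_b/N_0 at each code rate. For the real AWGN channel with unconstrained input, setting the rate equal to capacity gives E_b/N_0 = (2^(2R) - 1)/(2R). A quick sketch:

```python
import math

def shannon_limit_db(R):
    """Minimum Eb/N0 (in dB) for reliable rate-R transmission over the real
    AWGN channel with unconstrained input: Eb/N0 = (2^(2R) - 1) / (2R)."""
    ebn0 = (2.0 ** (2.0 * R) - 1.0) / (2.0 * R)
    return 10.0 * math.log10(ebn0)

for R in (1 / 3, 1 / 2, 8 / 9):
    print(f"rate {R:.3f}: Shannon limit {shannon_limit_db(R):+.2f} dB")
# rate 1/2 gives exactly 0 dB; as R -> 0 the limit tends to ln 2 ~ -1.59 dB,
# the "ultimate" Shannon limit.
```

With BPSK-constrained inputs the limit is slightly higher (roughly 0.19 dB at rate 1/2), which is the curve the 0.5 dB turbo-code claim is measured against.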
Common Mistake: 'Capacity Achieving' Has Caveats
Mistake:
Assuming that a "capacity-achieving" code family performs near capacity at any block length and with practical decoding.
Correction:
"Capacity-achieving" means that performance approaches capacity as block length with optimal (or specified) decoding. At finite block lengths, there is always a gap. Turbo codes are only conjectured to achieve capacity. LDPC codes achieve capacity under belief propagation (suboptimal) decoding. Polar codes provably achieve capacity under successive cancellation but need list decoding (CA-SCL) for competitive finite-length performance.
Quick Check
Which coding family is used for data channels (PDSCH) in 5G NR?
Turbo codes
LDPC codes
Polar codes
Convolutional codes
5G NR uses LDPC codes for data channels and polar codes for control channels. This replaced the turbo codes used in 4G LTE for data. The switch was driven by LDPC's better throughput-per-area at high code rates and block lengths, and the ease of parallelizing the decoder in hardware.