Prerequisites & Notation

Before You Begin

This chapter builds directly on belief propagation from Chapter 18 and on classical linear MMSE estimation from earlier FSI chapters. The turbo principle is the practical incarnation of message passing on loopy factor graphs: iterative exchange of extrinsic information between soft-input soft-output (SISO) blocks. Readers should be comfortable with the following.

  • Factor graphs and the sum-product algorithm (review Chapter 18)

    Self-check: Can you write the variable-to-factor and factor-to-variable update rules and describe why BP is exact on trees?

  • Log-likelihood ratios and soft-input soft-output decoding (review Chapter 18)

    Self-check: Can you express a posterior probability as a sigmoid of an LLR and explain the distinction between a-priori, a-posteriori, and extrinsic information?

  • Linear MMSE estimation

    Self-check: Given $\mathbf{y} = \mathbf{H}\mathbf{x} + \mathbf{w}$ with known second-order statistics, can you write the LMMSE estimator and its MSE matrix?

  • Convolutional codes and the BCJR algorithm

    Self-check: Can you describe the forward-backward recursion on a trellis and obtain bitwise a-posteriori LLRs?

  • ISI channel model and MMSE equalization

    Self-check: Can you derive the block-form LMMSE equalizer for a finite-length ISI channel?

  • Mutual information between a binary input and an LLR

    Self-check: Given a symmetric Gaussian LLR model, can you compute $I(X; L) = J(\sigma)$?
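The LMMSE self-check can be verified numerically. The sketch below uses arbitrary illustrative matrices (the names `H`, `Cx`, `Cw` are not from the text) and confirms that the standard estimator form and its error-covariance matrix agree with the information-form expression via the matrix inversion lemma.

```python
import numpy as np

# Illustrative model y = H x + w with zero-mean x and w and known
# second-order statistics (values chosen arbitrarily for the sketch).
H = np.array([[1.0, 0.5],
              [0.2, 1.0],
              [0.0, 0.3]])
Cx = np.eye(2)            # E[x x^T]
Cw = 0.1 * np.eye(3)      # E[w w^T]

# LMMSE estimator matrix:  x_hat = W y,  W = Cx H^T (H Cx H^T + Cw)^{-1}
W = Cx @ H.T @ np.linalg.inv(H @ Cx @ H.T + Cw)

# MSE (error-covariance) matrix:  Ce = Cx - W H Cx
Ce = Cx - W @ H @ Cx

# Equivalent information form:  Ce = (Cx^{-1} + H^T Cw^{-1} H)^{-1}
Ce_alt = np.linalg.inv(np.linalg.inv(Cx) + H.T @ np.linalg.inv(Cw) @ H)
```

The two expressions for the MSE matrix coincide by the matrix inversion lemma, which is a convenient numerical sanity check when deriving the block-form equalizer later in the chapter.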
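For the last item, $J(\sigma)$ has no closed form but is easy to estimate by Monte Carlo under the symmetric (consistent) Gaussian LLR model, where $L \mid X = x \sim \mathcal{N}(x\,\sigma^2/2,\ \sigma^2)$. The function name `J_mc` and the sample size are illustrative choices, not part of the text:

```python
import numpy as np

def J_mc(sigma, n=200_000, seed=0):
    """Monte-Carlo estimate of J(sigma): mutual information between an
    equiprobable binary symbol X in {-1,+1} and its LLR L observed
    through a symmetric Gaussian channel of LLR std. deviation sigma.
    A self-check sketch, not a library routine."""
    rng = np.random.default_rng(seed)
    x = rng.choice([-1.0, 1.0], size=n)                 # equiprobable symbol
    L = (sigma**2 / 2.0) * x + sigma * rng.standard_normal(n)
    # I(X; L) = 1 - E[log2(1 + exp(-x L))]; logaddexp avoids overflow
    return 1.0 - np.mean(np.logaddexp(0.0, -x * L)) / np.log(2.0)
```

The estimate reproduces the expected limits: $J(0) = 0$ (the LLR carries no information) and $J(\sigma) \to 1$ as $\sigma \to \infty$, with $J$ increasing monotonically in between.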

Notation for This Chapter

Symbols introduced and used in Chapter 19. The mutual-information axis variables $I_A$ and $I_E$ are the workhorses of EXIT analysis.

| Symbol | Meaning | Introduced |
| --- | --- | --- |
| $L_A$ | A-priori log-likelihood ratio input to a SISO block | §19.1 |
| $L_E$ | Extrinsic log-likelihood ratio output of a SISO block | §19.1 |
| $L_D$ | A-posteriori (decision) log-likelihood ratio, $L_D = L_A + L_E + L_{ch}$ | §19.1 |
| $I_A, I_E$ | Mutual information between the coded bit and the a-priori / extrinsic LLRs | §19.1 |
| $T_{\text{dec}}(I_A)$ | EXIT transfer function of a soft-input soft-output decoder | §19.1 |
| $T_{\text{eq}}(I_A)$ | EXIT transfer function of a soft-input soft-output equalizer/detector | §19.2 |
| $J(\sigma)$ | Mutual information between an equiprobable binary symbol and its LLR observed through a symmetric Gaussian channel of LLR std. deviation $\sigma$ | §19.1 |
| $\bar{x}_k$ | Soft (posterior-mean) symbol estimate of data symbol $x_k$ | §19.2 |
| $v_k$ | Residual variance of the soft symbol estimate, $v_k = \mathbb{E}[|x_k - \bar{x}_k|^2]$ | §19.2 |
| $\boldsymbol{\Pi}$ | Interleaver permutation matrix | §19.1 |
| $q^\star(x)$ | Projected (Gaussian) message produced by the EP moment-matching step | §19.4 |
| $\text{Proj}_{\mathcal{F}}[\cdot]$ | Projection onto the exponential family $\mathcal{F}$ via KL minimization | §19.4 |
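For BPSK ($x_k \in \{-1, +1\}$), the soft symbol statistics $\bar{x}_k$ and $v_k$ in the table have the well-known closed forms $\bar{x}_k = \tanh(L_k/2)$ and $v_k = 1 - \bar{x}_k^2$, where $L_k$ is the a-priori LLR of $x_k$. A minimal sketch (the function name `soft_symbol` is illustrative):

```python
import numpy as np

def soft_symbol(L):
    """Posterior mean and residual variance of a BPSK symbol x in {-1,+1}
    given its a-priori LLR L = log P(x=+1)/P(x=-1).  Sketch of the
    xbar_k / v_k definitions used by the soft equalizer."""
    xbar = np.tanh(L / 2.0)   # E[x | L]
    v = 1.0 - xbar**2         # E[|x - xbar|^2 | L]  (since x^2 = 1 for BPSK)
    return xbar, v
```

An uninformative LLR ($L = 0$) gives $\bar{x} = 0$ and maximum variance $v = 1$, while a confident LLR drives $v \to 0$; these $v_k$ are exactly what the LMMSE equalizer consumes as prior variances in the turbo loop.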