Turbo Equalization

From Turbo Codes to Turbo Equalization

An ISI channel is, algebraically, a kind of code: it maps the input symbol sequence to a noisy output sequence through a trellis determined by the channel taps. A channel code does the same thing. Douillard and co-authors (1995) observed that if both of these operations are treated as soft-input/soft-output (SISO) blocks, the turbo principle applies: a SISO equalizer and a SISO decoder can exchange extrinsic information through a bit interleaver.

The result is turbo equalization: a receiver architecture that performs joint detection and decoding by message passing on the factor graph formed by the ISI trellis and the code's Tanner graph, joined by an interleaver. The gain over separately equalizing and decoding is typically several decibels on severely dispersive channels.

Factor Graph for Turbo Equalization

The ISI channel imposes constraints on consecutive received samples (diamond factors); the code imposes constraints on groups of bits (square parity factors). An interleaver $\boldsymbol{\Pi}$ connects the two subgraphs, and extrinsic LLRs flow back and forth through it.

Definition:

Linear ISI Channel Model

With transmitted symbol sequence $\{x_k\}_{k=1}^K$ drawn from a finite-alphabet constellation $\mathcal{X}$ (for simplicity $\mathcal{X} = \{+1,-1\}$) and channel impulse response $\mathbf{h} = (h_0, h_1, \ldots, h_{L-1})$, the received samples are
$$y_k = \sum_{\ell=0}^{L-1} h_\ell \, x_{k-\ell} + w_k, \qquad w_k \sim \mathcal{CN}(0, \sigma^2).$$
In block form, $\mathbf{y} = \mathbf{H}\mathbf{x} + \mathbf{w}$, where $\mathbf{H}$ is a banded Toeplitz matrix.
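This block model is easy to simulate. The sketch below (NumPy; the helper name `isi_channel` and the use of the Proakis-C taps are purely illustrative) builds the banded Toeplitz $\mathbf{H}$ explicitly:

```python
import numpy as np

def isi_channel(x, h, sigma2, rng):
    """Pass BPSK symbols x through an ISI channel with taps h,
    returning y = H x + w with complex AWGN of variance sigma2."""
    K, L = len(x), len(h)
    # Banded Toeplitz convolution matrix: H[k, k - l] = h[l]
    H = np.zeros((K, K), dtype=complex)
    for l, hl in enumerate(h):
        H += hl * np.eye(K, k=-l)
    w = np.sqrt(sigma2 / 2) * (rng.standard_normal(K) + 1j * rng.standard_normal(K))
    return H @ x + w, H

rng = np.random.default_rng(0)
x = 2.0 * rng.integers(0, 2, size=8) - 1.0          # BPSK in {+1, -1}
h = np.array([0.227, 0.460, 0.688, 0.460, 0.227])   # Proakis-C taps
y, H = isi_channel(x, h, sigma2=0.1, rng=rng)
```

Note that this sketch truncates the channel tail: samples beyond index $K-1$ are dropped, which matches the square block model $\mathbf{y} = \mathbf{H}\mathbf{x} + \mathbf{w}$.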

Definition:

MAP (BCJR) Equalizer

The MAP (Bahl–Cocke–Jelinek–Raviv) equalizer computes, for each symbol $x_k$, the a-posteriori LLR
$$L_D[k] = \log \frac{\Pr(x_k = +1 \mid \mathbf{y}, L_A)}{\Pr(x_k = -1 \mid \mathbf{y}, L_A)}$$
on the ISI trellis with $2^{L-1}$ states. It is the sum-product algorithm on the chain factor graph of the ISI channel. Extrinsic LLRs are obtained by subtracting the a-priori input: $L_E[k] = L_D[k] - L_A[k]$.

BCJR has complexity $\mathcal{O}(K \cdot 2^{L-1} \cdot |\mathcal{X}|)$ per iteration. For short delay spreads ($L \leq 5$, binary symbols) this is feasible, but for massive MIMO, frequency-selective OFDM tails, or high-order constellations the exponential state count forces a linear approximation: the MMSE-PIC equalizer.
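For intuition, here is a minimal probability-domain BCJR sketch for BPSK over a real-valued ISI channel (the function name, the uniform treatment of edge states, and real AWGN are simplifying assumptions; production code works in the log domain for numerical stability):

```python
import numpy as np
from itertools import product

def bcjr_equalize(y, h, sigma2, La):
    """Symbol-by-symbol MAP (BCJR) equalization of BPSK over an ISI channel
    with taps h and real AWGN of variance sigma2. Returns a-posteriori
    LLRs L_D; the extrinsic output is L_D - La."""
    K, L = len(y), len(h)
    S = 2 ** (L - 1)                      # states = previous L-1 symbols
    sym = lambda bit: 1.0 - 2.0 * bit     # bit 0 -> +1, bit 1 -> -1
    # Branch metric gamma[k, s, b]: prior(b) * likelihood of y[k]
    gamma = np.zeros((K, S, 2))
    for k, s, b in product(range(K), range(S), range(2)):
        past = [(s >> i) & 1 for i in range(L - 1)]   # LSB = most recent
        xs = [sym(b)] + [sym(p) for p in past]
        out = sum(hl * xl for hl, xl in zip(h, xs))
        prior = 1.0 / (1.0 + np.exp(-sym(b) * La[k]))
        gamma[k, s, b] = prior * np.exp(-(y[k] - out) ** 2 / (2 * sigma2))
    nxt = lambda s, b: ((s << 1) | b) & (S - 1)       # trellis transition
    alpha = np.full(S, 1.0 / S)
    A = np.zeros((K, S))
    for k in range(K):                                # forward recursion
        A[k] = alpha
        new = np.zeros(S)
        for s, b in product(range(S), range(2)):
            new[nxt(s, b)] += alpha[s] * gamma[k, s, b]
        alpha = new / new.sum()
    beta = np.full(S, 1.0 / S)
    Ld = np.zeros(K)
    for k in range(K - 1, -1, -1):                    # backward recursion
        p = np.zeros(2)
        for s, b in product(range(S), range(2)):
            p[b] += A[k, s] * gamma[k, s, b] * beta[nxt(s, b)]
        Ld[k] = np.log(p[0] / p[1])                   # LLR for +1 vs -1
        new = np.zeros(S)
        for s, b in product(range(S), range(2)):
            new[s] += gamma[k, s, b] * beta[nxt(s, b)]
        beta = new / new.sum()
    return Ld
```

The per-step normalization of $\alpha$ and $\beta$ plays the role of the usual log-domain scaling and keeps the recursion from underflowing.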

Definition:

MMSE-PIC Equalizer with Soft Interference Cancellation

Let $\bar{\mathbf{x}} = \mathbb{E}[\mathbf{x} \mid L_A]$ be the vector of soft symbol estimates computed from the a-priori LLRs, and let $\mathbf{V} = \operatorname{diag}(v_1, \ldots, v_K)$ with $v_k = \operatorname{Var}(x_k \mid L_A[k]) = 1 - \bar{x}_k^2$ for BPSK.

To estimate xkx_k, cancel all other interferers with their soft estimates and apply an LMMSE filter to the residual:

$$\tilde{\mathbf{y}}_k = \mathbf{y} - \mathbf{H}\bar{\mathbf{x}} + \mathbf{h}_k \bar{x}_k,$$

$$\hat{x}_k = \mathbf{f}_k^H \tilde{\mathbf{y}}_k, \qquad \mathbf{f}_k = \big(\mathbf{H}\mathbf{V}_k\mathbf{H}^H + \sigma^2\mathbf{I}\big)^{-1}\mathbf{h}_k = \big(\mathbf{H}\mathbf{V}\mathbf{H}^H + \sigma^2\mathbf{I} + (1 - v_k)\,\mathbf{h}_k\mathbf{h}_k^H\big)^{-1}\mathbf{h}_k,$$

where $\mathbf{h}_k$ is column $k$ of $\mathbf{H}$ and $\mathbf{V}_k$ is $\mathbf{V}$ with its $k$-th entry replaced by $1$ (full variance on the target symbol). The extrinsic LLR is then formed under a Gaussian assumption on $\hat{x}_k \mid x_k$.

The filter $\mathbf{f}_k$ depends on $\bar{\mathbf{x}}$ through $\mathbf{V}$, so in principle it must be recomputed for every symbol in every iteration. The low-complexity variant freezes $\mathbf{V} \approx \bar{v}\mathbf{I}$ at the ensemble-average variance, so only one matrix inverse is needed per iteration; each $\mathbf{f}_k$ then follows by a rank-one (Sherman–Morrison) update.
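The scalar-variance variant can be sketched as follows (the helper name `mmse_pic_scalar` is an assumption; the Sherman–Morrison update restores full variance on the target symbol so that a single matrix inverse serves all $K$ filters):

```python
import numpy as np

def mmse_pic_scalar(y, H, sigma2, La):
    """One scalar-variance MMSE-PIC pass for BPSK: soft interference
    cancellation with a single matrix inverse shared by all symbols.
    Returns the extrinsic LLRs L_E."""
    K = len(y)
    xbar = np.tanh(La / 2.0)             # soft symbols E[x_k | La]
    vbar = float(np.mean(1.0 - xbar ** 2))   # ensemble-average variance
    # Common covariance with the frozen scalar variance vbar
    Sigma_inv = np.linalg.inv(vbar * (H @ H.conj().T) + sigma2 * np.eye(K))
    Le = np.zeros(K)
    for k in range(K):
        hk = H[:, k]
        # Sherman-Morrison rank-one update: target keeps full variance 1
        num = Sigma_inv @ hk
        fk = num / (1.0 + (1.0 - vbar) * np.real(hk.conj() @ num))
        ytil = y - H @ xbar + hk * xbar[k]   # cancel all other symbols
        xhat = np.real(fk.conj() @ ytil)
        mu = np.real(fk.conj() @ hk)         # bias of the estimate
        nu = mu * (1.0 - mu)                 # residual variance
        Le[k] = 2.0 * mu * xhat / max(nu, 1e-12)
    return Le
```

With $L_A = \mathbf{0}$ this collapses to plain LMMSE equalization, which is exactly the iteration-0 behavior of the turbo loop.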

Theorem: MMSE-PIC Endpoints Are LMMSE and Matched Filtering

Consider the MMSE-PIC equalizer with scalar variance $\bar{v}$ acting on the ISI channel. The limiting behavior is:

  • When $\bar{v} = 1$ (no a-priori information, $L_A = 0$), MMSE-PIC reduces to the classical linear MMSE equalizer $\mathbf{f}_k = (\mathbf{H}\mathbf{H}^H + \sigma^2\mathbf{I})^{-1}\mathbf{h}_k$.
  • When $\bar{v} = 0$ (perfect a-priori information, $|L_A| \to \infty$), all ISI is cancelled and MMSE-PIC reduces to matched filtering on the residual: $\mathbf{f}_k = \mathbf{h}_k/(\sigma^2 + \|\mathbf{h}_k\|^2) \propto \mathbf{h}_k$. Therefore MMSE-PIC interpolates between LMMSE (iteration 0) and the matched-filter bound (convergence).

The a-priori variance $\bar{v}$ controls how much the equalizer trusts the soft decisions. At $\bar{v}=1$ the decisions are useless and MMSE-PIC does not subtract anything, collapsing to LMMSE. At $\bar{v}=0$ the decisions are perfect and all interference disappears.
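Both endpoints are easy to check numerically. The sketch below uses a generic (non-Toeplitz) channel matrix and verifies that the $\mathbf{V}_k$-form PIC filter collapses to LMMSE at $\bar{v}=1$ and to the scaled matched filter $\mathbf{h}_k/(\sigma^2+\|\mathbf{h}_k\|^2)$ at $\bar{v}=0$ (all names and parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
K = 8
H = rng.standard_normal((K, K)) * 0.3 + np.eye(K)   # generic channel matrix
sigma2 = 0.5
k = 3
hk = H[:, k]

def pic_filter(vbar):
    # Sigma_k = vbar*H H^T + sigma2*I + (1 - vbar) h_k h_k^T  (V_k form,
    # i.e. every interferer at variance vbar, the target at variance 1)
    Sigma = vbar * (H @ H.T) + sigma2 * np.eye(K) + (1 - vbar) * np.outer(hk, hk)
    return np.linalg.solve(Sigma, hk)

# vbar = 1: classical LMMSE filter
f_lmmse = np.linalg.solve(H @ H.T + sigma2 * np.eye(K), hk)

# vbar = 0: matched filter h_k / (sigma2 + ||h_k||^2), proportional to h_k
f_mf = hk / (sigma2 + hk @ hk)
```

The $\bar{v}=0$ result follows from the Sherman–Morrison identity applied to $\sigma^2\mathbf{I} + \mathbf{h}_k\mathbf{h}_k^H$; the scaling differs from $\mathbf{h}_k/\sigma^2$ only by a positive constant, which the extrinsic-LLR normalization absorbs.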

Turbo Equalization Receiver

Complexity: $\mathcal{O}(T_{\max}(K L^2 + \text{DEC}))$ with the scalar-variance MMSE-PIC variant
Input: Received samples $\mathbf{y}$; channel matrix $\mathbf{H}$; noise variance $\sigma^2$; interleaver $\boldsymbol{\Pi}$; SISO code decoder $\text{DEC}(\cdot)$; iteration count $T_{\max}$.
Output: Decoded bits $\hat{\mathbf{b}}$.
1. Initialize $\mathbf{L}_A^{\text{eq}} \leftarrow \mathbf{0}$.
2. for $t = 1, \ldots, T_{\max}$ do
3.   Compute soft symbols $\bar{x}_k \leftarrow \tanh(L_A^{\text{eq}}[k]/2)$ and variances $v_k \leftarrow 1 - \bar{x}_k^2$.
4.   Run MMSE-PIC: form the residual $\tilde{\mathbf{y}}_k = \mathbf{y} - \mathbf{H}\bar{\mathbf{x}} + \mathbf{h}_k\bar{x}_k$ and filter it to obtain $\hat{x}_k$, the bias $\mu_k = \mathbf{f}_k^H\mathbf{h}_k$, and the residual variance $\nu_k = \mu_k(1-\mu_k)$.
5.   Form extrinsic LLRs $L_E^{\text{eq}}[k] = \frac{2\mu_k}{\nu_k}\,\hat{x}_k$.
6.   Deinterleave: $\mathbf{L}_A^{\text{dec}} \leftarrow \boldsymbol{\Pi}^{-1}\mathbf{L}_E^{\text{eq}}$.
7.   Run the SISO decoder: $\mathbf{L}_D^{\text{dec}} \leftarrow \text{DEC}(\mathbf{L}_A^{\text{dec}})$.
8.   Compute decoder extrinsics: $\mathbf{L}_E^{\text{dec}} = \mathbf{L}_D^{\text{dec}} - \mathbf{L}_A^{\text{dec}}$.
9.   Interleave: $\mathbf{L}_A^{\text{eq}} \leftarrow \boldsymbol{\Pi}\,\mathbf{L}_E^{\text{dec}}$.
10. end for
11. return hard decisions on $\mathbf{L}_D^{\text{dec}}$.
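The loop above can be run end to end with a rate-$1/2$ repetition code standing in for the SISO decoder, so the whole receiver stays self-contained (the interleaver, Proakis-B taps, and SNR below are illustrative assumptions; a convolutional BCJR decoder would replace `decode` in practice):

```python
import numpy as np

rng = np.random.default_rng(4)
N = 32                                    # info bits
perm = rng.permutation(2 * N)             # random interleaver Pi
b = rng.integers(0, 2, N)
c = np.concatenate([b, b])                # rate-1/2 repetition code (stand-in)
x = 1.0 - 2.0 * c[perm]                   # interleave, map bit 0 -> +1
K = len(x)
h = np.array([0.407, 0.815, 0.407])       # Proakis-B taps
H = sum(hl * np.eye(K, k=-l) for l, hl in enumerate(h))
sigma2 = 0.01
y = H @ x + np.sqrt(sigma2) * rng.standard_normal(K)

def equalize(La):
    """Scalar-variance MMSE-PIC with extrinsic output (one inverse/iter)."""
    xbar = np.tanh(La / 2)
    vbar = float(np.mean(1 - xbar ** 2))
    Si = np.linalg.inv(vbar * (H @ H.T) + sigma2 * np.eye(K))
    Le = np.zeros(K)
    for k in range(K):
        hk = H[:, k]
        fk = Si @ hk / (1 + (1 - vbar) * (hk @ Si @ hk))
        xhat = fk @ (y - H @ xbar + hk * xbar[k])
        mu = fk @ hk
        Le[k] = 2 * mu * xhat / max(mu * (1 - mu), 1e-12)
    return Le

def decode(La):
    """SISO decoder for the repetition code: each copy's extrinsic is
    the other copy's LLR; the a-posteriori LLR is their sum."""
    Le = np.concatenate([La[N:], La[:N]])
    return La + Le, Le

La_eq = np.zeros(K)
for t in range(5):                        # turbo iterations
    Le_eq = equalize(La_eq)
    La_dec = np.empty(K)
    La_dec[perm] = Le_eq                  # deinterleave
    Ld_dec, Le_dec = decode(La_dec)
    La_eq = Le_dec[perm]                  # interleave
b_hat = (Ld_dec[:N] < 0).astype(int)      # hard decisions on info bits
```

Only extrinsic LLRs cross the interleaver in either direction; feeding back the full a-posteriori LLRs would re-circulate the same information and cause the iterations to become overconfident.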

EXIT Curves: MMSE-PIC Equalizer and Convolutional Decoder

EXIT functions for an MMSE-PIC equalizer on an ISI channel (the curve varies with $\sigma^2$) and for a SISO convolutional decoder. Observe how the equalizer curve shifts with channel SNR and how the opening of the tunnel between the two curves determines the turbo-equalization threshold.


Proakis benchmark channels (A: mild, B: moderate, C: severe)

Example: Turbo Equalization on the Proakis-C Channel

Consider the notorious Proakis-C channel $\mathbf{h} = [0.227, 0.460, 0.688, 0.460, 0.227]^\mathsf{T}$, which has deep spectral nulls. A rate-$1/2$ convolutional code (memory $2$, octal $(5,7)$) is employed with a pseudo-random interleaver of length $2048$. Linear MMSE equalization alone is known to fail catastrophically here. At what iteration does turbo equalization approach the matched-filter bound?

Turbo Equalization BER per Iteration on Proakis-C

Stacked BER curves showing how successive turbo-equalization iterations recover the loss caused by the ISI spectral nulls.
Iteration $0$ is LMMSE + decoding; iteration $\infty$ approaches the matched-filter bound (an AWGN link at the same $\|\mathbf{h}\|^2/\sigma^2$).

Common Mistake: Target Symbol Variance Must Stay at $1$

Mistake:

When constructing $\mathbf{V}_k$ for the MMSE-PIC filter at symbol $k$, replacing the $k$-th diagonal entry with the soft variance $v_k$ instead of with $1$ makes the filter attempt to cancel the target itself.

Correction:

The target symbol $x_k$ is what we want to estimate, so its variance in the filter design must remain full ($=1$ for BPSK). Only the interferers $x_j$, $j \neq k$, get their soft variances $v_j$. Equivalently, exclude $\bar{x}_k$ from the cancellation signal (adding $\mathbf{h}_k\bar{x}_k$ back into the residual) but keep $v_k = 1$ in the filter covariance.
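Concretely, the correct and incorrect constructions differ in exactly one diagonal entry (the variance values below are illustrative):

```python
import numpy as np

v = np.array([0.9, 0.2, 0.5, 0.05])  # soft variances v_j = 1 - xbar_j**2
k = 2                                # target symbol index

Vk = np.diag(v.copy())
Vk[k, k] = 1.0        # correct: target symbol keeps full variance 1

Vk_wrong = np.diag(v) # wrong: V_k[k, k] = v_k tells the filter the target
                      # is mostly known, so it tries to cancel it too
```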

Common Mistake: Noise Variance Estimation Drift

Mistake:

Using a noise-variance estimate from the LMMSE iteration throughout the turbo loop causes a growing mismatch as soft-decision quality improves, producing overconfident LLRs at late iterations.

Correction:

Track the residual variance empirically: after cancellation, $\hat{\sigma}^2_t = \|\mathbf{y} - \mathbf{H}\bar{\mathbf{x}}\|^2/K - \text{MMSE term}$. Alternatively, adopt the scalar-variance approximation $\bar{v}_t = 1 - \bar{\bar{x}}_t^2$, where $\bar{\bar{x}}_t^2$ is the empirical mean of the $\bar{x}_k^2$.

🔧 Engineering Note

Turbo Equalization in Underwater Acoustic Links

Underwater acoustic channels routinely exhibit delay spreads of tens to hundreds of symbols due to multipath reflections between the surface and seabed. Classical LMMSE or DFE equalization fails, but turbo equalization closes most of the gap to the matched-filter bound. Stojanovic's group at MIT and WHOI demonstrated turbo-equalized coherent communication at 10–30 kbps over several kilometres, with delay spreads of $L \approx 50$ symbols. Because the trellis-based BCJR equalizer would need $2^{L-1}$ states, only the MMSE-PIC form is tractable.

Practical Constraints
  • Doppler tracking must run inside the turbo loop

  • Channel estimation is adaptive and jointly iterated with detection

🎓 CommIT Contribution (2004)

Soft Interference Cancellation for BICM-ID with Higher-Order Constellations

G. Caire and R. R. Müller, IEEE Trans. Information Theory, vol. 50, no. 9

Caire and Müller derived the asymptotically optimal received power profile for iterative soft interference cancellation in multi-user settings. The analysis showed that for iterative MMSE-PIC to converge near-capacity on CDMA and ISI channels, the power assignment to different users or subcarriers must satisfy a specific equalization condition that matches the decoder's EXIT curve. This insight has been applied to power allocation in turbo-equalized OFDM and in multi-user iterative MIMO detection.

Tags: mmse-pic · exit-chart · power-allocation

MMSE-PIC

Linear MMSE equalization combined with parallel soft interference cancellation. Each symbol is estimated by subtracting the soft estimates of all other symbols from the received vector, then applying an LMMSE filter on the residual. Used as the SISO equalizer block in turbo equalization.

Soft Symbol

The posterior mean of a modulated symbol given the a-priori LLRs of its constituent bits. For BPSK, $\bar{x}_k = \tanh(L_A[k]/2)$. For higher-order modulations, $\bar{x}_k = \sum_{s \in \mathcal{X}} s \cdot \Pr(x_k = s \mid L_A)$.
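As a quick sketch (helper names assumed), the BPSK shortcut and the general posterior-mean formula agree:

```python
import numpy as np

def soft_symbol_bpsk(La):
    """Posterior mean of x in {+1, -1} given La = log P(x=+1)/P(x=-1)."""
    return np.tanh(La / 2.0)

def soft_symbol(constellation, probs):
    """General posterior mean: sum over s of s * Pr(x = s)."""
    return np.dot(constellation, probs)

La = np.array([0.0, 2.0, -4.0])
xbar = soft_symbol_bpsk(La)   # 0 LLR -> 0 soft symbol (total uncertainty)
```

For $L_A = 2$, $\Pr(x = +1) = 1/(1+e^{-2})$, and the general formula $(+1)\Pr(+1) + (-1)\Pr(-1) = 2\Pr(+1) - 1$ reduces exactly to $\tanh(L_A/2)$.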

Related: MMSE-PIC

Why This Matters: Turbo Equalization for Insufficient OFDM Cyclic Prefix

When an OFDM symbol is transmitted with a cyclic prefix shorter than the channel delay spread, inter-carrier interference (ICI) and inter-symbol interference (ISI) appear and LMMSE equalization per subcarrier fails. Turbo equalization with an MMSE-PIC equalizer on the residual ICI+ISI matrix recovers most of the loss, enabling robust reception in dense urban multipath or repeater-extended cellular cells.

See full treatment in Chapter 14