Chapter 18 Summary

Key Points

1. Sum-product (belief propagation) is a local message-passing algorithm on a factor graph. Variable-to-factor messages multiply the incoming messages from all other factors (the extrinsic rule); factor-to-variable messages sum out all variables other than the target, weighted by the incoming variable messages.

2. On trees, sum-product terminates after two passes (leaves to root and back) and returns exact marginals. On loopy graphs, iterating yields a principled approximation (loopy BP) whose fixed points are stationary points of the Bethe free energy.

3. For LDPC decoding, log-domain BP alternates between check-node updates (soft-XOR via $\phi(x) = -\log\tanh(x/2)$, or min-sum) and variable-node updates (the channel LLR plus the sum of incoming check LLRs). The normalized min-sum approximation is standard in hardware.

4. Density evolution tracks the asymptotic distribution of BP messages on locally tree-like random Tanner graphs. On the BEC it reduces to the scalar recursion $x_{\ell+1} = \epsilon\,\lambda(1 - \rho(1 - x_\ell))$, which pinpoints the BP threshold $\epsilon^*$ exactly.

5. Max-product replaces $\sum$ with $\max$ to compute the joint MAP configuration $\arg\max_{\mathbf{x}} p(\mathbf{x})$. The Viterbi algorithm is max-product on a trellis. On loopy graphs, max-product fixed points are only locally optimal.

6. Use sum-product when bit-level MMSE performance matters (LDPC decoding); use max-product when sequence-level MAP performance matters (convolutional decoding).

7. Gaussian BP (GaBP) specializes sum-product to jointly Gaussian models. Messages are Gaussian information-form pairs $(h, J)$, and the updates are scalar per edge. Whenever GaBP converges (including on loopy graphs), the computed means are exact; only the variances are approximate.

8. GaBP is the algorithmic backbone for iterative MIMO detection, distributed least squares, and large-scale sparse linear-system solving.
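
Points 1 and 2 can be checked on a toy model. The sketch below runs sum-product on a short chain of discrete variables with pairwise factors; two passes (forward and backward) suffice, and the resulting marginals are exact because a chain is a tree. The function name and factor values are illustrative, not from the chapter.

```python
import numpy as np

def chain_marginals(psis):
    """Sum-product on a chain x0 - x1 - ... - xn with pairwise factors
    psis[i][a, b] = psi(x_i = a, x_{i+1} = b). Two passes, exact on this tree."""
    # Forward pass: message arriving at each variable from the left.
    fwd = [np.ones(psis[0].shape[0])]
    for p in psis:
        fwd.append(fwd[-1] @ p)          # sum out the left neighbor
    # Backward pass: message arriving at each variable from the right.
    bwd = [np.ones(psis[-1].shape[1])]
    for p in reversed(psis):
        bwd.append(p @ bwd[-1])          # sum out the right neighbor
    bwd.reverse()
    # Belief at each variable: product of the two incoming messages, normalized.
    margs = [f * b for f, b in zip(fwd, bwd)]
    return [m / m.sum() for m in margs]
```

Comparing the output against brute-force enumeration of the joint distribution confirms the exact-on-trees claim.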
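
The normalized min-sum decoder of point 3 can be sketched in a few lines: the check-node update keeps the exclude-self sign product and the exclude-self minimum magnitude (scaled by a normalization factor), and the variable-node update sums the channel LLR with all incoming check messages. The parity-check matrix, the normalization factor `alpha = 0.8`, and the function name are illustrative assumptions.

```python
import numpy as np

def min_sum_decode(H, llr, iters=20, alpha=0.8):
    """Normalized min-sum BP on the Tanner graph of parity-check matrix H.
    llr: channel LLRs (positive favors bit 0); alpha: normalization factor."""
    H = np.asarray(H)
    c2v = np.zeros(H.shape)                        # check-to-variable messages
    rows = [np.flatnonzero(r) for r in H]          # variables in each check
    for _ in range(iters):
        total = llr + c2v.sum(axis=0)              # variable-node update
        for i, nb in enumerate(rows):
            ext = total[nb] - c2v[i, nb]           # extrinsic var-to-check LLRs
            mag = np.abs(ext)
            order = np.argsort(mag)
            mins = np.full(len(nb), mag[order[0]])
            mins[order[0]] = mag[order[1]]         # exclude-self minimum
            sgn = np.prod(np.sign(ext)) * np.sign(ext)   # exclude-self sign product
            c2v[i, nb] = alpha * sgn * mins        # check-node (min-sum) update
    return ((llr + c2v.sum(axis=0)) < 0).astype(int)
```

On a small example (the (7,4) Hamming parity-check matrix with one weakly flipped bit of an all-zero codeword), the decoder recovers the codeword within a few iterations.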
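
The BEC recursion of point 4 is simple enough to run directly. The sketch below iterates $x_{\ell+1} = \epsilon\,\lambda(1-\rho(1-x_\ell))$ and bisects for the largest $\epsilon$ at which the erasure probability is driven to zero; for the regular (3,6) ensemble ($\lambda(x) = x^2$, $\rho(x) = x^5$) this recovers the well-known threshold $\epsilon^* \approx 0.4294$. Iteration counts and tolerances are illustrative choices.

```python
def de_converges(eps, lam, rho, iters=2000, tol=1e-12):
    # BEC density evolution: x_{l+1} = eps * lambda(1 - rho(1 - x_l)).
    x = eps                          # iteration-0 erasure probability
    for _ in range(iters):
        x = eps * lam(1.0 - rho(1.0 - x))
        if x < tol:                  # erasure probability driven to zero
            return True
    return False

def bp_threshold(lam, rho, steps=40):
    # Bisect for the largest channel erasure rate at which DE converges.
    lo, hi = 0.0, 1.0
    for _ in range(steps):
        mid = 0.5 * (lo + hi)
        if de_converges(mid, lam, rho):
            lo = mid
        else:
            hi = mid
    return lo

# Regular (3,6) ensemble: lambda(x) = x^2, rho(x) = x^5.
eps_star = bp_threshold(lambda x: x ** 2, lambda x: x ** 5)
```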
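
Point 5's sum-to-max substitution is visible in a compact Viterbi sketch: the forward pass keeps a max (not a sum) over predecessors plus back-pointers, and a backtrack recovers the argmax path. The HMM parameterization (log transition, emission, and initial arrays) is an illustrative choice.

```python
import numpy as np

def viterbi(log_trans, log_emit, log_init, obs):
    """Max-product on a trellis: replace the sum over predecessors with a max,
    keep back-pointers, and backtrack the jointly MAP state sequence."""
    delta = log_init + log_emit[:, obs[0]]   # best log-score ending in each state
    back = []
    for o in obs[1:]:
        scores = delta[:, None] + log_trans  # scores[i, j]: best path into j via i
        back.append(scores.argmax(axis=0))   # back-pointer per destination state
        delta = scores.max(axis=0) + log_emit[:, o]
    path = [int(delta.argmax())]             # backtrack from the best final state
    for bp in reversed(back):
        path.append(int(bp[path[-1]]))
    return path[::-1]
```

For short sequences the result can be verified against brute-force enumeration of all state paths.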
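
Points 7 and 8 can be illustrated by a minimal GaBP sketch for solving $Ax = b$ (the model $p(x) \propto \exp(-\tfrac12 x^\top A x + b^\top x)$): each directed edge carries a scalar precision/mean pair, updated with the extrinsic rule. Run on a small loopy, diagonally dominant system, the converged means match the direct solve, while the returned variances are only approximate. Function name and iteration count are illustrative.

```python
import numpy as np

def gabp(A, b, iters=200):
    """Gaussian BP for p(x) ~ exp(-x'Ax/2 + b'x), i.e. solving Ax = b.
    Each directed edge (i -> j) carries a scalar information-form message,
    stored as a precision P[i, j] and mean mu[i, j]."""
    n = len(b)
    P = np.zeros((n, n))
    mu = np.zeros((n, n))
    nbrs = [np.flatnonzero((A[i] != 0) & (np.arange(n) != i)) for i in range(n)]
    for _ in range(iters):
        Pn, mun = np.zeros((n, n)), np.zeros((n, n))
        for i in range(n):
            for j in nbrs[i]:
                # Belief at i excluding the message from j (extrinsic rule).
                Pij = A[i, i] + sum(P[k, i] for k in nbrs[i] if k != j)
                hij = b[i] + sum(P[k, i] * mu[k, i] for k in nbrs[i] if k != j)
                # Marginalize x_i through the pairwise term -A_ij * x_i * x_j.
                Pn[i, j] = -A[i, j] ** 2 / Pij
                mun[i, j] = hij / A[i, j]
        P, mu = Pn, mun
    Ptot = A.diagonal() + P.sum(axis=0)
    means = (b + (P * mu).sum(axis=0)) / Ptot
    return means, 1.0 / Ptot   # on loopy graphs the variances are approximate
```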

Looking Ahead

Chapter 19 develops the turbo principle and iterative receivers, where two (or more) BP-style decoders exchange extrinsic information through an interleaver. Chapter 20 extends BP to approximate message passing (AMP), a variant tailored to dense random matrices and high-dimensional compressed sensing, where it achieves the Donoho-Tanner phase transition (Chapter 16) with $O(MN)$ per-iteration complexity.