Prerequisites & Notation

Before You Begin

This chapter builds the second canonical coded-modulation construction, multilevel coding (MLC), and its natural companion receiver, multistage decoding (MSD). The reader is expected to be comfortable with the tools established in Chapters 1 and 2, and with the information-theoretic language of mutual information and the chain rule.

  • Signal-space constellations, $d_{\min}$, and the coding-gain criterion (Review ch01)

    Self-check: Can you compute $d_{\min}^2$ for 8-PSK at unit $E_s$, and state the coding-gain formula $\gamma_c = d_{\min}^2 / d_{\rm uncoded}^2$?
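
A minimal numerical sketch of both parts (the coded $d_{\min}^2 = 4$ in the coding-gain example is a hypothetical value, the subset distance reached after two 8-PSK partition steps, measured against uncoded QPSK):

```python
import numpy as np

# Squared minimum distance of 8-PSK at unit symbol energy:
# d_min^2 = 2 - 2*cos(2*pi/8) ~ 0.586.
pts = np.exp(2j * np.pi * np.arange(8) / 8)        # unit-energy 8-PSK
d2 = np.abs(pts[:, None] - pts[None, :]) ** 2      # all pairwise squared distances
print(d2[d2 > 1e-12].min())                        # ~0.5858

# Coding gain gamma_c = d_min^2(coded) / d_min^2(uncoded), here for a
# hypothetical coded d_min^2 = 4 over uncoded QPSK (d_min^2 = 2):
gamma_c = 4.0 / 2.0
print(10 * np.log10(gamma_c))                      # 3.01 dB
```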

  • Ungerboeck set partitioning and trellis-coded modulation (Review ch02)

    Self-check: Can you draw the set-partitioning tree for 8-PSK down to three levels and read off the intra-subset minimum distance at each level?
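
To verify the tree numerically instead of drawing it, here is a minimal sketch assuming the natural Ungerboeck partition of 8-PSK, in which each split keeps every other point:

```python
import numpy as np
from itertools import combinations

pts = np.exp(2j * np.pi * np.arange(8) / 8)   # unit-energy 8-PSK, points indexed 0..7

def min_d2(idx):
    """Smallest squared Euclidean distance within a subset of point indices."""
    return min(abs(pts[a] - pts[b]) ** 2 for a, b in combinations(idx, 2))

# Each partition step keeps every other point, doubling the angular spacing;
# print a representative subset at each of the three tree levels.
for depth in range(3):
    print(depth, round(min_d2(range(0, 8, 2 ** depth)), 4))   # 0.5858, 2.0, 4.0
```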

  • Mutual information, the chain rule, and conditional mutual information (Review ch02)

    Self-check: Can you state the chain rule $I(Y; B_0, B_1, \ldots, B_{L-1}) = \sum_{i=0}^{L-1} I(Y; B_i \mid B_0, \ldots, B_{i-1})$ and explain why each term is non-negative?
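
For the second half of the question, one standard identity suffices: each chain-rule term is an average Kullback–Leibler divergence, hence non-negative. In LaTeX:

```latex
I(Y; B_i \mid B_0, \ldots, B_{i-1})
  = \mathbb{E}\!\left[ D\!\left( p(y \mid B_0,\ldots,B_i) \,\middle\|\, p(y \mid B_0,\ldots,B_{i-1}) \right) \right] \ge 0 .
```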

  • AWGN channel capacity and the binary-input AWGN channel (Review ch09)

    Self-check: Can you write the BI-AWGN capacity integral $C = 1 - \int p(y)\,\log_2\!\left(1 + e^{-2y/\sigma^2}\right) dy$ and explain why it is strictly below $\log_2(1 + \mathrm{SNR})$?
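
Because the channel is symmetric, the integral can be evaluated as an expectation over $Y$ given $X = +1$. A minimal Monte Carlo sketch, assuming unit symbol energy so that SNR $= 1/\sigma^2$:

```python
import numpy as np

# BI-AWGN capacity C = 1 - E[ log2(1 + exp(-2Y/sigma^2)) ] with Y ~ N(+1, sigma^2),
# versus the unconstrained-input benchmark log2(1 + SNR).
rng = np.random.default_rng(0)
snr_db = 0.0
sigma2 = 10 ** (-snr_db / 10)                            # SNR = 1/sigma^2 at unit energy
y = 1.0 + rng.normal(0.0, np.sqrt(sigma2), 1_000_000)    # received samples given x = +1
c_bi = 1.0 - np.mean(np.log2(1.0 + np.exp(-2.0 * y / sigma2)))
print(c_bi)                                              # ~0.49 bit/use at 0 dB
print(np.log2(1.0 + 1.0 / sigma2))                       # 1.0 bit: the gap is visible
```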

  • Binary channel coding: convolutional codes, LDPC, and the Shannon gap (Review ch11)

    Self-check: Can you sketch the Shannon limit for a BI-AWGN channel at rate $1/2$, and explain what "capacity-approaching" means for a modern LDPC code?
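
As a sketch of the first part: on the real AWGN channel with noise variance $N_0/2$, setting $\tfrac{1}{2}\log_2(1 + 2R\,E_b/N_0) = R$ gives the limit computed below; restricting to binary inputs raises it only slightly (to roughly $0.19$ dB at rate $1/2$), and a capacity-approaching LDPC code operates within a fraction of a dB of that threshold.

```python
import numpy as np

# Shannon limit at rate R on the real AWGN channel:
# solve (1/2) * log2(1 + 2*R*EbN0) = R for Eb/N0.
R = 0.5
ebn0 = (2 ** (2 * R) - 1) / (2 * R)
print(10 * np.log10(ebn0))   # 0.00 dB at R = 1/2
```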

  • The concept of coset decoding and parallel transitions in a trellis (Review ch02)

    Self-check: Given an $M$-PSK constellation partitioned into subsets of size $M/2$, can you describe how the TCM trellis handles the $M/2$ parallel transitions between each pair of connected states?

Notation for This Chapter

Symbols specific to the multilevel coding and multistage decoding framework. See the chapter-opener notation table of Book CM for the shared symbols (constellation $\mathcal{X}$, energy $E_s$, noise density $N_0$, etc.).

| Symbol | Meaning | Introduced |
|---|---|---|
| $L = \log_2 M$ | Number of partition levels (and number of binary codes in MLC) for an $M$-ary constellation | s01 |
| $b_i \in \{0,1\}$ | Label bit at level $i$, $0 \le i \le L-1$ (level 0 is the most-significant, coarsest partition) | s01 |
| $\mu : \{0,1\}^L \to \mathcal{X}$ | Partition-based labelling map from the binary label $(b_0, \ldots, b_{L-1})$ to the constellation point | s01 |
| $\mathcal{A}_i^{(b_0,\ldots,b_{i-1})}$ | Coset at level $i$ indexed by the decoded history $(b_0, \ldots, b_{i-1})$ | s01 |
| $C_i$ | Capacity of the $i$-th binary sub-channel: $C_i = I(Y; B_i \mid B_0, \ldots, B_{i-1})$ | s02 |
| $R_i$ | Rate of the binary code used at level $i$; the capacity rule sets $R_i = C_i$ | s02 |
| $C_{\rm CM}$ | Coded-modulation capacity $I(Y; X)$ of the constellation $\mathcal{X}$ under uniform inputs | s04 |
| $C_{\rm BICM}$ | Bit-interleaved coded modulation capacity $\sum_i I(Y; B_i)$ (unconditional sum) | s04 |
| $C_{\rm MLC/MSD}$ | Achievable rate of MLC with multistage decoding; equals $C_{\rm CM}$ | s04 |
| $\eta$ | Spectral efficiency, $\eta = \sum_i R_i$ bits per 2D symbol | s02 |
| $I(Y; X)$ | Mutual information between the channel output $Y$ and input $X$ | s02 |
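
To make the capacity rule concrete, the sketch below estimates $C_{\rm CM}$ and the level capacities for a toy two-level MLC over unit-energy 4-ASK. The natural labeling (index $= 2b_0 + b_1$) and the noise level are illustrative assumptions, not choices fixed by the chapter; by the chain rule, $C_1 = C_{\rm CM} - C_0$.

```python
import numpy as np

# Monte Carlo estimate of C_CM = I(Y;X), C_0 = I(Y;B0), and (via the chain
# rule) C_1 = I(Y;B1|B0) = C_CM - C_0, for unit-energy 4-ASK in AWGN.
rng = np.random.default_rng(1)
x_pts = np.array([-3.0, -1.0, 1.0, 3.0]) / np.sqrt(5.0)   # E[x^2] = 1
sigma = 0.5
n = 200_000

idx = rng.integers(0, 4, n)                 # uniform symbols; label index = 2*b0 + b1
y = x_pts[idx] + rng.normal(0.0, sigma, n)

# Likelihoods p(y|x) up to a common constant (it cancels in every ratio below).
lik = np.exp(-(y[:, None] - x_pts[None, :]) ** 2 / (2 * sigma**2))
p_y = lik.mean(axis=1)                      # p(y) under the uniform prior
b0 = idx // 2                               # level-0 bit: left half vs right half
p_y_b0 = np.where(b0 == 0, lik[:, :2].mean(axis=1), lik[:, 2:].mean(axis=1))

c_cm = np.mean(np.log2(lik[np.arange(n), idx] / p_y))   # I(Y;X)
c0 = np.mean(np.log2(p_y_b0 / p_y))                     # I(Y;B0)
print(c_cm, c0, c_cm - c0)   # the capacity rule sets R_0 = C_0, R_1 = C_1
```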