Prerequisites & Notation
Before You Begin
This chapter takes the BICM capacity formula of Chapter 5 and the PEP analysis of Chapter 6 and places them on a rigorous information-theoretic footing through the lens of mismatched decoding. The reader should be comfortable with the parallel-binary-channel decomposition of BICM, the Gray and set-partition labellings, and the pairwise-error-probability / union bound machinery. A working knowledge of Gallager's random-coding exponent for binary-input channels is also assumed — we review only what is needed but do not reprove the classical results.
- BICM capacity formula (Review ch05)
Self-check: Can you state why $C_{\mathrm{bicm}} = \sum_{i=1}^{m} I(B_i; Y)$ is the achievable rate under the BICM product bit metric, and why it is bounded above by the coded-modulation capacity $C_{\mathrm{cm}} = I(X; Y)$? Can you write down the gap $C_{\mathrm{cm}} - C_{\mathrm{bicm}}$ as a sum of conditional mutual informations?
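A Monte-Carlo sketch of the two rates in this self-check (the unit-energy 4-PAM constellation, Gray labelling, SNR, and sample size below are our illustrative choices, not the chapter's):

```python
import numpy as np

rng = np.random.default_rng(0)

# Unit-energy 4-PAM with Gray labelling (illustrative choice)
const = np.array([-3.0, -1.0, 1.0, 3.0]) / np.sqrt(5.0)
labels = np.array([[0, 0], [0, 1], [1, 1], [1, 0]])   # row i = bits of const[i]
m, n, snr_db = 2, 200_000, 6.0
sigma = np.sqrt(10.0 ** (-snr_db / 10.0))             # noise std for Es = 1

idx = rng.integers(0, 4, n)
y = const[idx] + sigma * rng.standard_normal(n)
L = np.exp(-(y[:, None] - const[None, :]) ** 2 / (2 * sigma**2))  # ∝ p(y|x)

# Coded-modulation rate: I(X;Y) = E[ log2( p(y|x) / mean_x' p(y|x') ) ]
I_cm = np.mean(np.log2(L[np.arange(n), idx] / L.mean(axis=1)))

# BICM rate: sum_i I(B_i;Y), with p(y|b_i=b) averaged over each bit subset
I_bicm = 0.0
for i in range(m):
    pb = np.stack([L[:, labels[:, i] == b].mean(axis=1) for b in (0, 1)])
    I_bicm += np.mean(np.log2(pb[labels[idx, i], np.arange(n)] / L.mean(axis=1)))

print(f"I_cm ≈ {I_cm:.3f} bits/symbol, I_bicm ≈ {I_bicm:.3f} bits/symbol")
```

With Gray labelling the two estimates come out close, with the BICM sum slightly below the coded-modulation rate, as the chain-rule argument predicts.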
- BICM PEP and union bound on fading channels (Review ch06)
Self-check: Can you write the union bound on codeword error probability in terms of pairwise error probabilities, and identify the BICM diversity order from the product of free distance and minimum distinct-bit count?
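A numeric sketch of the diversity-order part of this self-check, using the Chernoff-bounded PEP of BPSK on fully interleaved Rayleigh fading; the distance spectrum is hypothetical, chosen only to make the slope visible:

```python
import numpy as np

def pep_chernoff(d, snr):
    # Chernoff bound on the pairwise error probability of BPSK over fully
    # interleaved Rayleigh fading at Hamming distance d:
    #   P2(d) <= (1 / (1 + snr))**d  -->  diversity order d.
    return (1.0 / (1.0 + snr)) ** d

# Union bound over a hypothetical distance spectrum {d: A_d}, free distance 5
spectrum = {5: 1, 6: 4, 7: 12}

def union_bound(snr):
    return sum(A * pep_chernoff(d, snr) for d, A in spectrum.items())

snrs = 10.0 ** (np.array([10.0, 20.0]) / 10.0)        # 10 dB and 20 dB
slope = np.log10(union_bound(snrs[0]) / union_bound(snrs[1]))  # per SNR decade
print(f"union-bound slope ≈ {slope:.2f} decades per decade of SNR")
```

At high SNR the free-distance term dominates, so the bound falls by roughly five decades per decade of SNR, matching the diversity order.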
- Gallager's random-coding exponent for binary-input channels (Review ch13)
Self-check: Can you state the form $E_r(R) = \max_{\rho \in [0,1]} \bigl[ E_0(\rho) - \rho R \bigr]$ and identify the channel-dependent function $E_0(\rho)$ for a given input distribution $P_X$?
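A minimal sketch of $E_0$ and $E_r$ for a BSC under the standard Gallager definitions reviewed in ch13; the crossover probability and rate are illustrative:

```python
import numpy as np

def E0(rho, p):
    # Gallager function of a BSC(p) with the uniform input distribution:
    #   E0(rho) = -log2 sum_y [ sum_x Q(x) p(y|x)^{1/(1+rho)} ]^{1+rho}
    inner = 0.5 * (p ** (1.0 / (1.0 + rho)) + (1.0 - p) ** (1.0 / (1.0 + rho)))
    return -np.log2(2.0 * inner ** (1.0 + rho))

def Er(R, p, grid=np.linspace(0.0, 1.0, 1001)):
    # Random-coding exponent: Er(R) = max_{rho in [0,1]} [E0(rho) - rho*R]
    return np.max(E0(grid, p) - grid * R)

p = 0.05
R0 = E0(1.0, p)                      # cutoff rate = E0(1)
print(f"R0 = {R0:.4f} bits, Er(0.3) = {Er(0.3, p):.4f}")
```

The grid maximisation stands in for the closed-form optimising $\rho$; at $\rho = 1$ the function reduces to the cutoff rate $1 - \log_2\bigl(1 + 2\sqrt{p(1-p)}\bigr)$.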
- Mismatched decoding and generalised mutual information (GMI) (Review ch14)
Self-check: Can you state the GMI formula and explain when it coincides with the mutual information (matched case) versus when it lies strictly below the capacity?
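A small numeric sketch of the GMI on a toy two-input DMC; the channel matrix, the mismatched metric, and the grid over $s$ are our assumptions for illustration:

```python
import numpy as np

def gmi(P, W, q, s):
    # Generalised mutual information at scaling s for input distribution P,
    # channel W[x, y], and decoding metric q[x, y]:
    #   I_gmi(s) = E[ log2( q(X,Y)^s / sum_x' P(x') q(x',Y)^s ) ]
    joint = P[:, None] * W                        # P(x) W(y|x)
    denom = (P[:, None] * q ** s).sum(axis=0)     # sum_x' P(x') q(x',y)^s
    return (joint * np.log2(q ** s / denom[None, :])).sum()

P = np.array([0.5, 0.5])
W = np.array([[0.9, 0.1], [0.2, 0.8]])            # toy binary-input DMC
q_mis = np.array([[0.8, 0.2], [0.3, 0.7]])        # mismatched decoding metric

I_matched = gmi(P, W, W, 1.0)                     # matched metric q = W, s = 1
I_mis = max(gmi(P, W, q_mis, s) for s in np.linspace(0.1, 5.0, 200))
print(f"matched GMI = {I_matched:.4f}, mismatched GMI = {I_mis:.4f}")
```

With the matched metric at $s = 1$ the GMI equals the mutual information; with the mismatched metric, even after optimising $s$, it stays at or below it.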
- Cutoff rate and its operational meaning (Review ch13)
Self-check: Can you state Gallager's cutoff-rate theorem and explain why, for sequential and list decoders, $R_0$ is the practical rate limit while capacity is the theoretical one?
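A quick numeric check of the cutoff-rate identities on a BSC (the crossover probability is our illustrative choice):

```python
import numpy as np

# For a BSC(p) with uniform inputs, Gallager's identity links the cutoff rate,
# the Gallager function at rho = 1, and the Bhattacharyya parameter:
#   R0 = E0(1) = 1 - log2(1 + B),   with B = 2 sqrt(p (1 - p)).
p = 0.1                                            # illustrative crossover
B = 2.0 * np.sqrt(p * (1.0 - p))
E0_at_1 = -np.log2(2.0 * (0.5 * (np.sqrt(p) + np.sqrt(1.0 - p))) ** 2)
print(f"E0(1) = {E0_at_1:.6f},  1 - log2(1+B) = {1 - np.log2(1 + B):.6f}")
```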
- Bhattacharyya parameter and Chernoff exponent on BI-channels (Review ch13)
Self-check: Can you compute the Bhattacharyya parameter for a BI-AWGN channel and identify it as $B = e^{-E_s/N_0}$?
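A numeric sketch of this self-check, integrating the Bhattacharyya integrand for antipodal signalling on real AWGN; the $E_s/N_0$ value and grid are illustrative:

```python
import numpy as np

def bhattacharyya_biawgn(es_n0):
    # B = ∫ sqrt(p(y|+1) p(y|-1)) dy for antipodal signalling with Es = 1
    # over real AWGN with noise variance N0/2; expected: B = exp(-Es/N0).
    y = np.linspace(-15.0, 15.0, 60001)
    sigma2 = 1.0 / (2.0 * es_n0)                  # sigma^2 = N0/2 when Es = 1
    pdf = lambda mu: (np.exp(-(y - mu) ** 2 / (2.0 * sigma2))
                      / np.sqrt(2.0 * np.pi * sigma2))
    return np.sum(np.sqrt(pdf(1.0) * pdf(-1.0))) * (y[1] - y[0])

es_n0 = 10.0 ** (3.0 / 10.0)                      # Es/N0 = 3 dB
B_num = bhattacharyya_biawgn(es_n0)
print(f"B ≈ {B_num:.6f},  exp(-Es/N0) = {np.exp(-es_n0):.6f}")
```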
Notation for This Chapter
Symbols specific to the capacity / error-exponent analysis of BICM. The Chapter 5–6 BICM notation (constellation, labelling, bit positions, per-bit channels, LLRs, and labelling subsets) continues to apply and is not repeated here.
| Symbol | Meaning | Introduced |
|---|---|---|
| $q(x,y)$ | Decoding metric (generic). Matched case $q(x,y) = p(y \mid x)$; BICM product metric $q(x,y) = \prod_{i=1}^{m} p_i\bigl(y \mid b_i(x)\bigr)$ | s02 |
| $s$ | Decoder scaling parameter. The mismatched rate depends on $s > 0$; the optimal $s$ maximises the GMI | s02 |
| $I^{\mathrm{gmi}}(s)$ | Generalised mutual information at scaling $s$; the BICM achievable rate is $\sup_{s>0} I^{\mathrm{gmi}}(s)$ | s03 |
| $E_0(\rho)$ | Gallager function for input distribution $P_X$ and Gallager parameter $\rho \in [0,1]$ | s04 |
| $E_r(R)$ | Random-coding error exponent, $E_r(R) = \max_{\rho \in [0,1]} \bigl[ E_0(\rho) - \rho R \bigr]$ | s04 |
| $E_r^{\mathrm{cm}}(R)$, $E_r^{\mathrm{bicm}}(R)$ | Random-coding error exponents for the CM decoder and the BICM (mismatched, product-metric) decoder respectively | s04 |
| $R_0$ | Gallager cutoff rate, $R_0 = E_0(1)$; equivalently $R_0 = 1 - \log_2(1 + B)$ for binary inputs | s05 |
| $B$ | Bhattacharyya parameter of a binary-input channel, $B = \sum_y \sqrt{p(y \mid 0)\, p(y \mid 1)}$ | s05 |
| $d_H$ | Hamming distance between two codewords of the binary code (carried over from Ch. 6) | s05 |