Prerequisites & Notation

Before You Begin

This chapter analyzes the fundamental bottleneck of FDD massive MIMO, namely the overhead of downlink pilot transmission and uplink CSI feedback, and surveys the principal solutions: compressed feedback, codebook-based reporting (5G NR), deep learning compression, and JSDM-based dimensionality reduction. A solid understanding of the JSDM framework from Chapter 7 and the TDD/FDD contrast from Chapter 1 is essential.

  • Massive MIMO system model: $\mathbf{H} \in \mathbb{C}^{N_r \times N_t}$, channel hardening, favorable propagation (Review ch01)

    Self-check: Can you state the channel hardening law $\frac{1}{N_t} \mathbf{H}_{k}^{H} \mathbf{H}_{k} \to 1$ and explain its implication for CSI acquisition?
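The hardening effect is easy to verify numerically. A minimal sketch, assuming an i.i.d. Rayleigh-fading channel vector with unit-variance entries (the parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Channel hardening: for an i.i.d. Rayleigh channel h_k with unit-variance
# entries, (1/N_t) h_k^H h_k concentrates around 1 as N_t grows.
for n_t in (8, 64, 512):
    h = (rng.standard_normal(n_t) + 1j * rng.standard_normal(n_t)) / np.sqrt(2)
    print(n_t, np.real(np.vdot(h, h)) / n_t)
```

The fluctuation around 1 shrinks like $1/\sqrt{N_t}$, which is why a hardened effective channel needs far less instantaneous CSI at the user side.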

  • TDD reciprocity and the FDD overhead barrier (Review s05)

    Self-check: Can you explain why TDD avoids the $N_t$-scaling overhead that FDD suffers?
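The scaling contrast can be made concrete with a back-of-the-envelope count (the numbers below are illustrative, not taken from the text):

```python
# FDD: orthogonal DL pilots must sound all N_t BS antennas, so tau_d >= N_t,
# and CSI feedback adds further overhead on the uplink.
# TDD: reciprocity lets the BS estimate from UL pilots, so tau_u >= K (users).
n_t, k_users, t_c = 128, 8, 200  # BS antennas, users, coherence block (symbols)

fdd_pilot_fraction = n_t / t_c      # 0.64 -> most of the block spent training
tdd_pilot_fraction = k_users / t_c  # 0.04 -> overhead independent of N_t
print(fdd_pilot_fraction, tdd_pilot_fraction)
```

Scaling $N_t$ up further only worsens the FDD fraction, while the TDD fraction is untouched: that is the overhead barrier in one line of arithmetic.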

  • Angular-domain channel representation and spatial covariance structure (Review ch02)

    Self-check: Can you write $\mathbf{H}_{k} = \mathbf{U}_k \mathbf{\Lambda}_k^{1/2} \mathbf{G}_k$ and explain the role of each factor?
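As a quick sanity check on the angular-domain idea, the toy sketch below (illustrative, not the chapter's channel model) builds a channel from three DFT-aligned paths and confirms that $\mathbf{H}_a = \mathbf{F}^H \mathbf{H}$ is 3-sparse:

```python
import numpy as np

n_t = 64
# Unitary DFT matrix F (N_t x N_t); its columns are angular "directions".
F = np.fft.fft(np.eye(n_t)) / np.sqrt(n_t)

# Toy channel: three paths exactly aligned with DFT angles.
h = np.zeros(n_t, dtype=complex)
for idx, gain in [(3, 1.0), (17, 0.6), (40, 0.3)]:
    h += gain * F[:, idx]

h_a = F.conj().T @ h  # angular-domain channel H_a = F^H H
print(int(np.sum(np.abs(h_a) > 1e-9)))  # 3 significant angular entries
```

Real channels have paths between DFT grid points, so the angular representation is only approximately sparse; this leakage is what compressed-sensing feedback schemes must tolerate.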

  • JSDM: two-stage precoding, group structure, pre-beamforming matrix $\mathbf{B}_g$ (Review ch07)

    Self-check: Can you explain how JSDM reduces the effective channel dimension from $N_t$ to $r_g$?
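A numerical sketch of the dimensionality reduction (illustrative: $\mathbf{B}_g$ is taken as the $r_g$ dominant eigenvectors of a synthetic rank-$r_g$ group covariance, and the projection is written on a column vector as $\mathbf{B}_g^H \mathbf{h}_k$, the column-vector analogue of $\mathbf{H}_k \mathbf{B}_g$):

```python
import numpy as np

rng = np.random.default_rng(1)
n_t, r_g = 64, 4

# Synthetic group covariance R_g of rank r_g (dominant subspace = range of A).
A = rng.standard_normal((n_t, r_g)) + 1j * rng.standard_normal((n_t, r_g))
R_g = A @ A.conj().T / r_g

# Pre-beamformer B_g: the r_g dominant eigenvectors of R_g.
_, eigvecs = np.linalg.eigh(R_g)  # eigh sorts eigenvalues ascending
B_g = eigvecs[:, -r_g:]

# A user channel drawn from the group subspace keeps all its energy after
# projection, while the dimension drops from N_t to r_g.
h_k = A @ (rng.standard_normal(r_g) + 1j * rng.standard_normal(r_g))
h_eff = B_g.conj().T @ h_k
print(h_k.shape, h_eff.shape)  # (64,) (4,)
print(round(float(np.linalg.norm(h_eff) / np.linalg.norm(h_k)), 6))
```

Because $\mathbf{h}_k$ lies in the dominant subspace, the norm ratio printed last is 1 to numerical precision: the user now only needs to estimate and feed back an $r_g$-dimensional channel.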

  • Linear precoding: MRT, ZF, MMSE (Review ch06)

    Self-check: Can you write the ZF precoder $\mathbf{W} = \mathbf{H}^{H} (\mathbf{H} \mathbf{H}^{H})^{-1}$ and state when it is optimal?
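A quick NumPy check of the zero-forcing property (illustrative; here $\mathbf{H}$ is the stacked $K \times N_t$ multiuser channel, so the ZF precoder is its right pseudoinverse):

```python
import numpy as np

rng = np.random.default_rng(2)
k_users, n_t = 4, 16

# Stacked multiuser DL channel: row k is user k's 1 x N_t channel.
H = (rng.standard_normal((k_users, n_t))
     + 1j * rng.standard_normal((k_users, n_t))) / np.sqrt(2)

# ZF precoder W = H^H (H H^H)^{-1}: the right pseudoinverse of the fat H,
# so H W = I and inter-user interference is nulled exactly.
W = H.conj().T @ np.linalg.inv(H @ H.conj().T)
print(np.allclose(H @ W, np.eye(k_users)))  # True
```

With imperfect CSI at the BS, the product $\mathbf{H}\mathbf{W}$ is no longer diagonal, which is exactly why the quality of the fed-back $\hat{\mathbf{H}}$ governs the achievable rates in this chapter.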

  • Rate-distortion theory: $R(D)$ function, achievability, converse

    Self-check: Can you state the rate-distortion function for a Gaussian source and explain its operational meaning?
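For the memoryless Gaussian source with variance $\sigma^2$ under squared-error distortion, $R(D) = \tfrac{1}{2}\log_2(\sigma^2/D)$ for $0 < D \le \sigma^2$ and $R(D) = 0$ otherwise. A one-function sketch:

```python
import numpy as np

def gaussian_rate_distortion(sigma2: float, d: float) -> float:
    """R(D) = 0.5 * log2(sigma^2 / D) bits/sample for 0 < D <= sigma^2, else 0."""
    return 0.5 * float(np.log2(sigma2 / d)) if d < sigma2 else 0.0

# Each extra bit per sample quarters the achievable distortion (D = sigma^2 4^-R):
print(gaussian_rate_distortion(1.0, 0.25))    # 1.0 bit/sample
print(gaussian_rate_distortion(1.0, 0.0625))  # 2.0 bits/sample
```

Operationally, this is the benchmark against which the chapter's CSI quantizers are measured: no feedback scheme can achieve distortion $D$ with fewer than $R(D)$ bits per channel coefficient.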

  • Massive MIMO fundamentals: scaling laws, channel hardening, favorable propagation, achievable rates

  • JSDM framework: group-based precoding, spatial covariance exploitation

  • Rate-distortion theory: the information-theoretic framework for CSI compression

Notation for This Chapter

Symbols introduced in this chapter. See also the master Global Notation Table in the front matter.

| Symbol | Meaning | Introduced |
| --- | --- | --- |
| $\tau_d$ | Downlink pilot overhead (number of DL training symbols) | s01 |
| $B_{\text{fb}}$ | Number of uplink feedback bits per coherence block | s01 |
| $T_c$ | Coherence interval length (in symbols) | s01 |
| $\hat{\mathbf{H}}$ | CSI estimate at the base station (reconstructed from feedback) | s01 |
| $\mathbf{H}_{a}$ | Angular-domain channel: $\mathbf{H}_{a} = \mathbf{F}^H \mathbf{H}$, where $\mathbf{F}$ is the DFT matrix | s02 |
| $\mathbf{F}$ | Unitary DFT matrix ($N_t \times N_t$) | s02 |
| $\boldsymbol{\Phi}$ | Compression (measurement) matrix for CSI feedback | s02 |
| $\mathcal{C}$ | Codebook: finite set of candidate beamforming vectors | s03 |
| $d_c(\mathbf{H}, \mathcal{C})$ | Chordal distance between channel subspace and nearest codebook entry | s03 |
| $\mathcal{E}(\cdot)$ | CsiNet encoder (UE-side neural network) | s04 |
| $\mathcal{D}(\cdot)$ | CsiNet decoder (BS-side neural network) | s04 |
| $\gamma$ | Compression ratio $\gamma = M / (2 N_t)$ for CsiNet | s04 |
| $\mathbf{B}_g$ | Pre-beamforming matrix for group $g$ (from JSDM) | s05 |
| $r_g$ | Effective rank (number of dominant eigenmodes) of group $g$'s covariance | s05 |
| $\mathbf{H}_{\text{eff},k}$ | Effective reduced-dimension channel: $\mathbf{H}_{\text{eff},k} = \mathbf{H}_{k} \mathbf{B}_g$ | s05 |