Prerequisites & Notation

Before You Begin

This chapter introduces information-theoretic secrecy, which requires a solid background in channel coding and multiuser information theory.

  • Channel capacity and the channel coding theorem (Review ch09)

    Self-check: Can you state the channel coding theorem and sketch the random coding achievability proof?

  • The Gaussian channel and its capacity (Review ch10)

    Self-check: Can you derive $C = \frac{1}{2}\log(1 + \text{SNR})$ and explain why the Gaussian input is optimal?

  • Random binning and Slepian–Wolf coding (Review ch07)

    Self-check: Can you explain how random binning assigns multiple source sequences to the same index?

  • The broadcast channel and superposition coding (Review ch15)

    Self-check: Can you describe superposition coding for the degraded broadcast channel and state its capacity region?

  • Mutual information and the data processing inequality (Review ch01)

    Self-check: Can you prove the data processing inequality from the chain rule of mutual information?

  • MIMO channel model and capacity

    Self-check: Can you write the MIMO channel capacity formula $C = \log\det(\mathbf{I} + \text{SNR}\,\mathbf{H}\mathbf{H}^{H})$?
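The two capacity formulas in the self-checks above can be evaluated numerically. A minimal sketch (function names and SNR values are illustrative, not from the text; the MIMO expression assumes equal power allocation across antennas):

```python
import numpy as np

def gaussian_capacity(snr):
    """Scalar AWGN capacity, C = (1/2) * log2(1 + SNR), in bits per channel use."""
    return 0.5 * np.log2(1.0 + snr)

def mimo_capacity(H, snr):
    """MIMO capacity with equal power allocation: C = log2 det(I + SNR * H H^H)."""
    n_r = H.shape[0]
    G = np.eye(n_r) + snr * (H @ H.conj().T)
    # slogdet avoids overflow/underflow in the determinant for larger arrays
    sign, logdet_nats = np.linalg.slogdet(G)
    return logdet_nats / np.log(2.0)  # convert nats -> bits

# Scalar channel at SNR = 15 (about 11.8 dB): C = 0.5 * log2(16) = 2 bits/use
print(gaussian_capacity(15.0))

# 2x2 identity channel at SNR = 3: C = log2 det(4*I) = 4 bits/use
print(mimo_capacity(np.eye(2), 3.0))
```

Note that the scalar formula uses the factor 1/2 (real channel), while the MIMO log-det form is conventionally stated for the complex channel without it.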

Notation for This Chapter

Symbols introduced or heavily used in this chapter.

| Symbol | Meaning | Introduced |
| --- | --- | --- |
| $C_s$ | Secrecy capacity (bits per channel use) | s01 |
| $Z$ | Eavesdropper's observation (channel output) | s01 |
| $Y$ | Legitimate receiver's channel output | s01 |
| $R_e$ | Equivocation rate at the eavesdropper | s01 |
| $I(X; Z)$ | Information leakage to the eavesdropper | s01 |
| $C_K$ | Secret-key capacity | s02 |
| $\mathbf{H}_E$ | Eavesdropper's channel matrix (MIMO wiretap) | s03 |
| $\mathbf{Q}$ | Input covariance matrix | s03 |
| $\mathbf{v}_{\text{AN}}$ | Artificial-noise beamforming direction | s03 |
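As a concrete instance of the notation above, the classical secrecy capacity of the degraded Gaussian wiretap channel is $C_s = [C_B - C_E]^+$, the nonnegative difference between the legitimate channel's capacity and the eavesdropper's. A minimal sketch (function name and SNR values are illustrative):

```python
import numpy as np

def gaussian_secrecy_capacity(snr_b, snr_e):
    """Secrecy capacity of the degraded Gaussian wiretap channel:
    C_s = [ (1/2)log2(1 + SNR_B) - (1/2)log2(1 + SNR_E) ]^+  (bits/use)."""
    c_b = 0.5 * np.log2(1.0 + snr_b)  # legitimate receiver's capacity
    c_e = 0.5 * np.log2(1.0 + snr_e)  # eavesdropper's capacity
    return max(c_b - c_e, 0.0)        # [.]^+ : no secrecy if eavesdropper is stronger

# Legitimate SNR = 15, eavesdropper SNR = 3: C_s = 2 - 1 = 1 bit/use
print(gaussian_secrecy_capacity(15.0, 3.0))
```

The $[\,\cdot\,]^+$ clipping reflects that when the eavesdropper's channel is at least as good as the legitimate one, no positive secrecy rate is achievable.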