Prerequisites & Notation

Before You Begin

Chapter 2 takes the binary-hypothesis-testing machinery of Chapter 1 and specialises it to the Gaussian observation model that governs nearly every physical receiver. A reader comfortable with the LRT, the Neyman--Pearson lemma, the $Q$-function, and with linear algebra over $\mathbb{R}^n$ and $\mathbb{C}^n$ will follow this chapter without friction.

  • Likelihood ratio test (LRT) and Neyman--Pearson lemma (Review ch01)

    Self-check: State the Neyman--Pearson lemma and sketch its proof via the variational argument.

  • Gaussian distributions, the $Q$-function, and error probabilities (Review ch01)

    Self-check: Compute $Q(2)$ and explain why $Q(x) \leq \tfrac12 e^{-x^2/2}$.

  • Inner products, norms, and projections in $\mathbb{R}^n$ (Review ch01)

    Self-check: Compute the orthogonal projection of $\mathbf{y}$ onto $\text{span}(\mathbf{s})$.

  • Positive-definite matrices and Cholesky factorisation (Review ch01)

    Self-check: Explain why a covariance matrix admits a Cholesky factor $\mathbf{L}$ with $\mathbf{L}\mathbf{L}^{\mathsf{T}} = \mathbf{C}$.

  • Linear time-invariant systems and convolution

    Self-check: Relate the output of an LTI filter to the convolution of its input with the impulse response.
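Several of these self-checks can be confirmed numerically. The sketch below (signal dimensions, tolerances, and the random seed are our illustrative choices, not from the text) evaluates $Q(2)$, spot-checks the bound $Q(x) \leq \tfrac12 e^{-x^2/2}$ on a grid, projects $\mathbf{y}$ onto $\text{span}(\mathbf{s})$, and factors a covariance matrix:

```python
import numpy as np
from math import erfc, sqrt, exp

# Q-function via the complementary error function: Q(x) = (1/2) erfc(x / sqrt(2))
def Q(x):
    return 0.5 * erfc(x / sqrt(2))

print(f"Q(2) = {Q(2):.6f}")  # about 0.0228

# Spot-check the Chernoff-type bound Q(x) <= (1/2) exp(-x^2/2) for x >= 0
xs = np.linspace(0.0, 6.0, 61)
assert all(Q(x) <= 0.5 * exp(-x**2 / 2) + 1e-15 for x in xs)

# Orthogonal projection of y onto span(s): proj = (<y, s> / ||s||^2) s
rng = np.random.default_rng(0)
s = rng.standard_normal(5)
y = rng.standard_normal(5)
proj = (y @ s) / (s @ s) * s
assert abs((y - proj) @ s) < 1e-12  # residual is orthogonal to s

# Cholesky factor of a symmetric positive-definite covariance: C = L L^T
A = rng.standard_normal((5, 5))
C = A @ A.T + 5 * np.eye(5)  # shifted Gram matrix, guaranteed positive definite
L = np.linalg.cholesky(C)
assert np.allclose(L @ L.T, C)
```

If any assertion fails, the corresponding self-check deserves another look before continuing.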

Chapter 2 Notation

Symbols introduced or used repeatedly in this chapter. Boldface lowercase denotes column vectors; boldface uppercase denotes matrices; $\mathbf{w}$ denotes additive noise throughout (per the book-wide convention).

| Symbol | Meaning | Introduced |
| --- | --- | --- |
| $\mathbf{y}$ | Observation vector in $\mathbb{R}^n$ | s01 |
| $\mathbf{s}$ | Known (deterministic) signal vector | s01 |
| $\mathbf{w}$ | Additive white Gaussian noise vector, $\mathbf{w}\sim\mathcal{N}(\mathbf{0},\sigma^2\mathbf{I})$ | s01 |
| $E_s$ | Signal energy, $E_s = \|\mathbf{s}\|^2 = \sum_i s_i^2$ | s01 |
| $N_0$ | One-sided noise PSD; per-sample variance is $\sigma^2 = N_0/2$ | s01 |
| $T(\mathbf{y})$ | Test statistic (often the sufficient statistic) | s01 |
| $d^2$ | Deflection coefficient, $d^2 = 2E_s/N_0$ in AWGN | s01 |
| $P_F, P_D$ | False-alarm and detection probabilities | s01 |
| $h(t)$ | Matched-filter impulse response, $h(t) = s(T-t)$ | s04 |
| $\mathbf{C}_w$ | Noise covariance matrix (colored case) | s03 |
| $\mathbf{L}$ | Cholesky factor of $\mathbf{C}_w$, so $\mathbf{C}_w = \mathbf{L}\mathbf{L}^{\mathsf{T}}$ | s03 |
| $\|\mathbf{v}\|_{\mathbf{C}}^2$ | Mahalanobis squared norm, $\mathbf{v}^{\mathsf{T}}\mathbf{C}^{-1}\mathbf{v}$ | s03 |
| $\langle y, s\rangle$ | Continuous-time inner product, $\int_0^T y(t)s(t)\,dt$ | s04 |
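To see how the AWGN entries fit together, the sketch below (the sinusoidal signal, noise level, and sample count are illustrative assumptions, not from the text) forms the correlator statistic $T(\mathbf{y}) = \langle \mathbf{y}, \mathbf{s}\rangle$ and checks that the deflection $(\mathbb{E}[T \mid H_1] - \mathbb{E}[T \mid H_0])^2 / \mathrm{Var}(T)$ equals the tabulated $d^2 = 2E_s/N_0$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 64
s = np.sin(2 * np.pi * 3 * np.arange(n) / n)  # known deterministic signal (illustrative)
Es = s @ s                                     # signal energy ||s||^2
N0 = 0.5
sigma2 = N0 / 2                                # per-sample variance sigma^2 = N0/2

# Under H1, T = <s + w, s> has mean Es; under H0, T = <w, s> has mean 0.
# In both cases Var(T) = sigma^2 * Es, so the deflection is Es / sigma^2 = 2*Es/N0.
d2_formula = 2 * Es / N0
d2_direct = (Es - 0.0) ** 2 / (sigma2 * Es)
assert np.isclose(d2_formula, d2_direct)

# Monte Carlo check of Var(T) under H0
w = rng.standard_normal((20000, n)) * np.sqrt(sigma2)
T0 = w @ s
print(np.var(T0), sigma2 * Es)  # the two should agree to within a few percent
```

The same correlator is what the matched filter $h(t) = s(T-t)$ computes at its sampling instant, so this check previews the continuous-time development in s04.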