Prerequisites & Notation

Prerequisites for Chapter 19

This chapter extends the AMP/VAMP framework from Chapters 17–18 in two directions: (1) unknown hyperparameters, where EM-GAMP estimates the noise variance and sparsity level from the data automatically; and (2) non-Gaussian likelihoods, where GAMP handles 1-bit, Poisson, and power-only measurements via a general output channel $g_{\text{out}}$. A third extension, multi-layer inference, connects message passing to deep generative priors.
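As a concrete instance of the output-channel abstraction, here is a minimal sketch of $g_{\text{out}}$ for the simplest case, the additive-Gaussian channel $y = z + w$ with $w \sim \mathcal{N}(0, \sigma^2)$. The function name and exact return convention (scaled residual plus its negative derivative, as in Rangan's GAMP formulation) are illustrative assumptions, not fixed by this chapter's notation.

```python
import numpy as np

def g_out_gaussian(y, p_hat, tau_p, sigma2):
    """GAMP output function for the Gaussian channel y = z + w, w ~ N(0, sigma2).

    Given the Gaussian cavity z ~ N(p_hat, tau_p), the posterior mean of z is
    p_hat + tau_p * (y - p_hat) / (tau_p + sigma2), so the scaled residual
    g_out = (E[z | y] - p_hat) / tau_p has the closed form below.
    """
    s = (y - p_hat) / (tau_p + sigma2)    # scaled residual g_out
    tau_s = 1.0 / (tau_p + sigma2)        # negative derivative -dg_out/dp_hat
    return s, tau_s
```

In the noiseless limit `sigma2 = 0` this reduces to `(y - p_hat) / tau_p`, i.e. full correction toward the observed value; non-Gaussian channels replace this closed form with channel-specific posterior moments.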

  • AMP and OAMP/VAMP (Chapter 17)

    Self-check: Can you write the AMP iteration, state evolution recursion, and name one advantage of VAMP over AMP for structured matrices?

  • Expectation-Maximization (EM) algorithm

    Self-check: Can you state the E-step and M-step for a latent variable model and explain why EM never decreases the log-likelihood?

  • Bernoulli-Gaussian prior and MMSE denoiser (Chapter 17)

    Self-check: Can you compute the posterior inclusion probability $\pi_i$ and posterior mean for a Bernoulli-Gaussian prior given a noisy observation?

  • Generalized linear models (GLM)

    Self-check: Can you write the GLM likelihood $p(y_m \mid z_m)$ for Gaussian, Poisson, and logistic output channels?
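The Bernoulli-Gaussian self-check above has a short closed-form answer, sketched below under the standard assumptions: prior $x \sim \rho\,\mathcal{N}(0, \sigma_x^2) + (1-\rho)\,\delta_0$ and pseudo-observation $r = x + \mathcal{N}(0, \tau_r)$. The function name is illustrative.

```python
import numpy as np

def bg_denoiser(r, tau_r, rho, sigma_x2):
    """MMSE denoiser for a Bernoulli-Gaussian prior.

    Prior: x ~ rho * N(0, sigma_x2) + (1 - rho) * delta_0, observed as
    r = x + N(0, tau_r). Returns (posterior mean, inclusion probability pi).
    """
    def norm_pdf(v, var):
        return np.exp(-v**2 / (2 * var)) / np.sqrt(2 * np.pi * var)

    num = rho * norm_pdf(r, sigma_x2 + tau_r)    # evidence that x is active
    den = num + (1 - rho) * norm_pdf(r, tau_r)   # total evidence
    pi = num / den                               # posterior inclusion prob.
    # Given x is active, (x | r) is Gaussian with mean (sigma_x2/(sigma_x2+tau_r)) r.
    x_hat = pi * (sigma_x2 / (sigma_x2 + tau_r)) * r
    return x_hat, pi
```

The posterior mean is the active-component Wiener estimate shrunk by $\pi_i$, so small observations are pulled strongly toward zero while large ones pass through nearly unshrunk.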

Notation for Chapter 19

Symbols used throughout this chapter. The sensing model $\mathbf{y} = \mathbf{A}\mathbf{c} + \mathbf{w}$ carries through all three sections with different likelihood structures.

| Symbol | Meaning | Introduced |
| --- | --- | --- |
| $z_m = \mathbf{a}_m^T \mathbf{c}$ | Linear mixing output (pre-likelihood) at measurement $m$ | s01 |
| $p(y_m \mid z_m)$ | Output channel (generalized likelihood) | s01 |
| $g_{\text{out}}(y, \hat{p}, \tau_p)$ | Output function: MMSE of $z_m$ given $y_m$ and Gaussian cavity $\mathcal{N}(\hat{p}, \tau_p)$ | s01 |
| $g_{\text{in}}(r, \tau_r)$ | Input function (prior denoiser): MMSE of $x_i$ given $\mathcal{N}(r, \tau_r)$ | s01 |
| $\boldsymbol{\theta} = (\sigma^2, \rho, \sigma_x^2)$ | Unknown hyperparameters: noise variance, sparsity, signal variance | s01 |
| $\hat{\boldsymbol{\theta}}^{(k)}$ | EM estimate of hyperparameters at outer iteration $k$ | s01 |
| $\pi_i$ | Posterior inclusion probability: $p(x_i \neq 0 \mid \hat{r}_i, \tau_r)$ | s01 |
| $\tau_p^t, \tau_r^t$ | GAMP extrinsic variances at output and input sides, iteration $t$ | s01 |
| $\Phi(\cdot)$ | Standard Gaussian CDF (used in probit / 1-bit likelihood) | s02 |
| $\mathbf{z}^{(\ell)}$ | Hidden activations at layer $\ell$ of a multi-layer generative model | s03 |
| $\mathbf{A}^{(\ell)}$ | Linear mixing matrix at layer $\ell$ | s03 |
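To fix the multi-layer notation $\mathbf{z}^{(\ell)}$, $\mathbf{A}^{(\ell)}$ and the 1-bit channel in one place, here is a small simulation sketch. The layer sizes, the choice of `tanh` nonlinearity, and the sign-measurement noise level are illustrative assumptions, not specified by the notation table.

```python
import numpy as np

def multilayer_forward(c, A_list, nonlinearity=np.tanh):
    """Forward pass of a multi-layer generative model (illustrative sketch).

    z^(0) = c, then z^(l) = nonlinearity(A^(l) @ z^(l-1)) for each layer l.
    The tanh nonlinearity is an assumption for the example.
    """
    z = c
    for A in A_list:
        z = nonlinearity(A @ z)
    return z

rng = np.random.default_rng(0)
c = rng.standard_normal(4)                        # latent code
A_list = [rng.standard_normal((6, 4)),            # A^(1)
          rng.standard_normal((8, 6))]            # A^(2)
z_final = multilayer_forward(c, A_list)           # deepest activations z^(L)

# 1-bit (sign) measurement of the generated signal: y = sign(A z + w)
A = rng.standard_normal((10, 8))
y = np.sign(A @ z_final + 0.01 * rng.standard_normal(10))
```

Inference then runs the chapter's message-passing machinery backward through this same chain, with each layer contributing its own input/output channel pair.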