Prerequisites & Notation

Prerequisites for This Chapter

This chapter connects the optical computer-vision pipeline β€” multi-view geometry, differentiable rendering, and physics-informed networks β€” to the RF imaging framework developed in earlier chapters. We show how the same analysis-through-synthesis philosophy that powers NeRF and 3DGS can be adapted to electromagnetic wave propagation, and how multi-modal fusion combines RF with camera and LiDAR sensing.

  • Electromagnetic scattering and the Born approximation (Review ch06)

    Self-check: Can you write the volume integral equation for a scattered field under the Born approximation?

  • MIMO radar and virtual aperture (Review ch11)

    Self-check: Can you explain how $N_t + N_r$ physical antennas create $N_t N_r$ virtual array elements?

  • 3D scene representations (NeRF, SDF, 3DGS) (Review ch24)

    Self-check: Can you describe how NeRF represents a scene as a neural radiance field and renders it via volume rendering?

  • Differentiable rendering and inverse rendering (basics) (Review ch25)

    Self-check: Can you state the rendering equation and explain what makes a renderer differentiable?
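The first self-check has a one-line answer. Under the Born approximation the scattered field reduces to a single volume integral over the contrast; a sketch using the $\chi$ and $G$ defined in the notation table below (the background wavenumber $k_0$ is standard notation, not a symbol introduced in this chapter):

```latex
u_s(\mathbf{r}) \;\approx\; k_0^2 \int_V G(\mathbf{r}, \mathbf{r}')\,
    \chi(\mathbf{r}')\, u_i(\mathbf{r}')\, \mathrm{d}^3 r'
```

Here the total field is $u = u_i + u_s$, and the Born step replaces the unknown total field inside the integral by the known incident field $u_i$.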
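The virtual-aperture count from the second self-check can be verified numerically. The snippet below is a sketch under the phase-center approximation, where each transmit/receive pair contributes a virtual element at the sum of the two positions; the spacings are illustrative assumptions, not values from the chapter.

```python
import numpy as np

# Nt transmit and Nr receive antennas yield Nt*Nr virtual elements
# at the pairwise phase-center sums tx_i + rx_j (sketch; spacings assumed).
Nt, Nr = 3, 4
wavelength = 1.0
rx = np.arange(Nr) * wavelength / 2        # half-wavelength receive ULA
tx = np.arange(Nt) * Nr * wavelength / 2   # sparse transmit array

virtual = (tx[:, None] + rx[None, :]).ravel()
print(virtual.size)  # β†’ 12 = Nt * Nr
# With these spacings the virtual positions form a uniform Nt*Nr-element ULA:
print(np.allclose(np.sort(virtual), np.arange(Nt * Nr) * wavelength / 2))  # β†’ True
```

The transmit spacing of $N_r \lambda / 2$ is what makes the $N_t N_r$ virtual elements tile a contiguous half-wavelength array rather than overlap.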
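The volume-rendering step from the third self-check reduces to a short quadrature along each ray. A minimal sketch of the compositing rule follows; the densities and colors are made-up stand-ins for what a NeRF MLP would output.

```python
import numpy as np

# NeRF compositing rule:
#   C = sum_i T_i * alpha_i * c_i,  alpha_i = 1 - exp(-sigma_i * delta_i),
#   T_i = prod_{j<i} (1 - alpha_j).
def composite(sigma, color, delta):
    alpha = 1.0 - np.exp(-sigma * delta)                       # per-sample opacity
    T = np.cumprod(np.concatenate(([1.0], 1.0 - alpha)))[:-1]  # transmittance
    w = T * alpha                                              # compositing weights
    return w @ color, w

sigma = np.array([0.0, 5.0, 50.0])                   # density peaks at last sample
color = np.array([[1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
delta = np.full(3, 0.1)                              # sample spacing along the ray
C, w = composite(sigma, color, delta)
# The weights sum to 1 - exp(-integral of sigma): the ray's total opacity.
```

The zero-density first sample gets zero weight, and the high-density last sample dominates the rendered color, which is the behavior the self-check asks you to explain.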
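For the last self-check, the rendering equation can be stated with the radiometric symbols from the notation table below ($\mathbf{n}$ and $\Omega$, the surface normal and the upper hemisphere, are standard additions not listed there):

```latex
L_o(\mathbf{x}, \boldsymbol{\omega}_o) = L_e(\mathbf{x}, \boldsymbol{\omega}_o)
  + \int_{\Omega} f_r(\mathbf{x}, \boldsymbol{\omega}_i, \boldsymbol{\omega}_o)\,
    L_i(\mathbf{x}, \boldsymbol{\omega}_i)\,
    (\boldsymbol{\omega}_i \cdot \mathbf{n})\, \mathrm{d}\boldsymbol{\omega}_i
```

A renderer is differentiable when its estimate of this integral is a (sub)differentiable function of the scene parameters, which in practice requires smoothing or reparameterizing visibility discontinuities.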

Notation for This Chapter

Symbols introduced or heavily used in this chapter. See also the global notation table in the front matter.

| Symbol | Meaning | Introduced |
| --- | --- | --- |
| $\mathbf{F}$ | Fundamental matrix (epipolar geometry) | s01 |
| $\mathbf{E}$ | Essential matrix ($\mathbf{E} = [\mathbf{t}]_\times \mathbf{R}$) | s01 |
| $\mathbf{K}$ | Camera intrinsic matrix | s01 |
| $L_o, L_i, L_e$ | Outgoing, incoming, and emitted radiance | s02 |
| $f_r$ | Bidirectional reflectance distribution function (BRDF) | s02 |
| $\chi(\mathbf{r})$ | Contrast function ($\epsilon_r(\mathbf{r}) - 1$) | s03 |
| $G(\mathbf{r}, \mathbf{r}')$ | Free-space Green's function | s03 |
| $\mathcal{F}[u]$ | PDE residual operator (PINN loss) | s05 |
| $\mathcal{K}_\theta$ | Fourier integral kernel operator (FNO) | s05 |