Two Views of the Same Forward Model

The Central Chapter

This is the golden-thread chapter of the entire book. Everything before it builds toward the forward model $\mathbf{y} = \mathbf{A}\mathbf{c} + \mathbf{w}$; everything after it develops methods to invert it. The key insight, due to Caire, is that two seemingly different communities, diffraction tomography (physics/medical imaging) and radar/wireless sensing, have been working with the same mathematical object, viewed from different angles. Understanding both views and their equivalence is what separates a practitioner who can tune algorithms from one who can design imaging systems.

Definition:

Born-Approximation Scattered Field

Consider a transmitter at position $\mathbf{s}$ radiating a monochromatic signal at frequency $f_0 + f$, with wavenumber $\kappa = 2\pi(f_0 + f)/\mathrm{c}$. Under the first Born approximation (single scattering, weak contrast), the signal received at position $\mathbf{r}$ is

$$x(\mathbf{s}, \mathbf{r}; f) \propto \int_{\Omega} \frac{c(\mathbf{p})}{d(\mathbf{s}, \mathbf{p})\, d(\mathbf{p}, \mathbf{r})} \, e^{-j\kappa(d(\mathbf{s}, \mathbf{p}) + d(\mathbf{p}, \mathbf{r}))} \, d\mathbf{p},$$

where $c(\mathbf{p})$ is the complex reflectivity at position $\mathbf{p}$, $d(\mathbf{a}, \mathbf{b}) = \|\mathbf{a} - \mathbf{b}\|$ is the Euclidean distance, and $\Omega$ is the scattering domain.

The $1/\bigl(d(\mathbf{s}, \mathbf{p})\, d(\mathbf{p}, \mathbf{r})\bigr)$ factor is the two-way geometric spreading (path loss). The exponential $e^{-j\kappa(d_1 + d_2)}$ is the round-trip propagation phase.

This integral is the foundation of all coherent RF imaging. The entire chapter develops two complementary ways of analyzing it: one in the spatial-frequency (wavenumber) domain, and one in the space-time domain.
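Discretizing the scattering domain into pixels turns this integral into the linear model the book revolves around. A minimal numerical sketch, with an assumed toy geometry (positions, frequencies, and grid are illustrative, not from the text):

```python
import numpy as np

C_LIGHT = 3e8  # speed of light [m/s]

def born_matrix(tx, rx, freqs, pixels):
    """Discretized Born forward operator: one row per (tx, rx, frequency)
    triple, one column per pixel. Each entry is the two-way spreading loss
    times the round-trip propagation phase."""
    rows = []
    for s in tx:
        for r in rx:
            for f in freqs:
                kappa = 2 * np.pi * f / C_LIGHT
                d_sp = np.linalg.norm(pixels - s, axis=1)   # tx -> pixel
                d_pr = np.linalg.norm(pixels - r, axis=1)   # pixel -> rx
                rows.append(np.exp(-1j * kappa * (d_sp + d_pr)) / (d_sp * d_pr))
    return np.array(rows)

# toy 2D scene: 3 pixels, 1 tx, 1 rx, 2 frequencies (all values assumed)
pixels = np.array([[0.0, 10.0], [0.5, 10.0], [1.0, 10.0]])
tx = np.array([[-2.0, 0.0]])
rx = np.array([[2.0, 0.0]])
freqs = [3.0e9, 3.1e9]

A = born_matrix(tx, rx, freqs, pixels)   # sensing matrix, shape (2, 3)
c = np.array([1.0 + 0j, 0.0, 0.5])       # complex reflectivity vector
y = A @ c                                # noiseless measurements y = A c
```

Stacking one row per (tx, rx, frequency) triple is exactly how the sensing matrix $\mathbf{A}$ of the later chapters is assembled.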


View A: Diffraction Tomography

In the diffraction tomography (DT) view, we work in the spatial-frequency (wavenumber) domain. Each measurement triple (transmitter $\mathbf{s}_i$, receiver $\mathbf{r}_j$, frequency $f_k$) contributes a sample at a specific point $\boldsymbol{\kappa}_{i,j,k}$ in wavenumber space (k-space). The collection of all such samples determines a region of k-space that we have "illuminated."

Imaging in this view = inverting a Fourier transform on irregularly sampled k-space data.

The resolution is determined by the extent of k-space coverage (larger coverage = finer resolution), and artifacts arise from gaps or non-uniform density in the sampling pattern. This view connects RF imaging to X-ray CT (Fourier Slice Theorem), MRI (k-space sampling), and ultrasound diffraction tomography.
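To make View A concrete: under the far-field (first-order) expansion developed later in the chapter, each (tx, rx, frequency) triple maps to one wavenumber-space sample. A sketch of that mapping, with an assumed sign convention and toy geometry:

```python
import numpy as np

C_LIGHT = 3e8  # speed of light [m/s]

def kspace_point(s, r, f, p0):
    """k-space sample contributed by one (tx, rx, frequency) triple under
    the far-field (first-order) expansion around the scene center p0. The
    sign convention here is one common choice, not the only one."""
    kappa = 2 * np.pi * f / C_LIGHT
    u_s = (s - p0) / np.linalg.norm(s - p0)   # unit vector, scene -> tx
    u_r = (r - p0) / np.linalg.norm(r - p0)   # unit vector, scene -> rx
    return kappa * (u_s + u_r)

# monostatic sanity check (tx == rx): |k_vec| = 2*kappa, the familiar
# two-way factor that doubles the sampled spatial frequency
p0 = np.zeros(2)
s = np.array([0.0, 100.0])
k_vec = kspace_point(s, s, 3e9, p0)
```

Sweeping frequencies and Tx/Rx positions traces out arcs of such points; the union of all of them is the illuminated k-space region.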


View B: Radar / Wireless Sensing

In the radar/wireless view, we work in the space-time domain. Each transmitter-receiver pair performs matched filtering against a known transmitted waveform, extracting the range (delay) profile of the scene. The image is formed by coherent combination of these range profiles across all Tx-Rx pairs: delay-and-sum beamforming, back-projection, or more sophisticated combining rules.

Imaging in this view = matched filtering + coherent multi-view combination.

This is the language of radar signal processing, SAR/ISAR, and ISAC systems. Resolution is described through the ambiguity function, the range resolution $\Delta r = \mathrm{c}/(2W)$, and the angular resolution $\Delta\theta \propto \lambda/D$, where $W$ is the bandwidth and $D$ is the aperture.
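Both the range-resolution formula and the matched-filter step can be checked numerically. A sketch with assumed waveform parameters (linear-FM chirp; sample rate and target delay are illustrative):

```python
import numpy as np

C_LIGHT = 3e8  # speed of light [m/s]

# range resolution: delta_r = c / (2W); 1 GHz of bandwidth gives 0.15 m
W = 1e9
delta_r = C_LIGHT / (2 * W)

# matched filtering of a linear-FM chirp: the compressed peak sits at the
# target's delay (waveform parameters below are assumed, not from the text)
fs = 4e9                                   # sample rate [Hz]
T = 1e-6                                   # pulse duration [s]
t = np.arange(0, T, 1 / fs)
chirp = np.exp(1j * np.pi * (W / T) * t**2)        # swept bandwidth ~ W
delay = 1000                                        # target delay [samples]
rx_sig = np.concatenate([np.zeros(delay), chirp])   # delayed echo
mf_out = np.abs(np.correlate(rx_sig, chirp, mode="valid"))
peak = int(np.argmax(mf_out))              # recovered delay: 1000 samples
```

`np.correlate` conjugates its second argument, so this is exactly the matched filter; repeating it per Tx-Rx pair and summing coherently over geometry is the delay-and-sum imaging described above.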


Key Takeaway

Caire's central insight: Both views derive from the same Born-approximation forward model. The difference is purely one of analysis domain: whether you examine the scattered-field integral in the spatial-frequency domain (View A) or in the space-time domain (View B). The physics is identical; the mathematics is related by a Fourier transform. This unification means that insights from either community (DT or radar) can be translated to the other, and system design can leverage tools from both.

CommIT Contribution (2026)

Caire's Unified Illumination and Sensing Model

G. Caire, TU Berlin CommIT Group, Internal Research Note

Caire's research note provides the unifying framework that this chapter follows. Starting from the Born-approximation diffraction integral, Caire shows that:

  1. A first-order Taylor expansion of the propagation distances around the target center $\mathbf{p}_0$ converts the phase term into a linear function of the displacement $\tilde{\mathbf{p}} = \mathbf{p} - \mathbf{p}_0$, yielding a Fourier-transform relationship between the reflectivity and the scattered data (View A).

  2. When transmitter and receiver are antenna arrays, the same integral factors into steering vectors, delay terms, and beamforming gains, recovering the standard radar/MIMO channel model (View B).

  3. Both views lead to the same linear observation model $\mathbf{y} = \mathbf{A}\mathbf{c} + \mathbf{w}$, with the sensing matrix $\mathbf{A}$ encoding the geometry, frequencies, and antenna configurations.

The note also develops the link-budget normalization, the Kronecker structure of $\mathbf{A}$ (Ch 07), and the connection between regularized inverse problems and image reconstruction.
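The first-order expansion in step 1 is easy to sanity-check numerically; a minimal sketch with an assumed far-field geometry (all positions are illustrative):

```python
import numpy as np

# Step 1 numerically: the first-order expansion of the path length around
# the scene center p0,  d(s, p) ~= d(s, p0) - u_s . (p - p0),  with
# u_s = (s - p0) / ||s - p0||. Geometry below is assumed for illustration.
p0 = np.zeros(3)
s = np.array([0.0, 0.0, 100.0])        # far transmitter
p = np.array([0.1, -0.05, 0.02])       # point near the scene center

d_exact = np.linalg.norm(s - p)
u_s = (s - p0) / np.linalg.norm(s - p0)
d_linear = np.linalg.norm(s - p0) - u_s @ (p - p0)
err = abs(d_exact - d_linear)          # quadratic remainder, ~ |p|^2 / (2 d)
```

Because the error is quadratic in the displacement, the linear-in-$\tilde{\mathbf{p}}$ phase (and hence the Fourier relationship of View A) holds whenever the scene is small compared with the standoff distance.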


Historical Note: Two Communities, One Problem

1969-2025

The diffraction-tomography view of imaging dates to Wolf (1969), who showed that coherent scattered data samples the spatial Fourier transform of the scattering potential along semicircular arcs (the Ewald sphere). Devaney (1982) formalized this as the Fourier Diffraction Theorem, connecting it to the Fourier Slice Theorem of CT. Meanwhile, radar engineers from the 1950s onward developed matched-filter imaging, synthetic aperture radar, and range-Doppler processing in the space-time domain.

For decades, these communities worked largely in parallel. The medical/optical imaging community used k-space language; the radar community used ambiguity functions and point-spread functions. The realization that both are faces of the same Born model became increasingly clear with the rise of MIMO radar in the 2000s and ISAC in the 2020s, where the transmitter is a communications array, forcing a merger of the two viewpoints.


Example: Two Views of a Single-Frequency 1D Measurement

Consider a 1D scene with a single transmitter at position $s$ and a single receiver at position $r$, both in the far field of a target region $\Omega$ centered at the origin. A single frequency $f_0$ is used (wavenumber $\kappa = 2\pi f_0/\mathrm{c}$). Show that the measurement is proportional to the spatial Fourier transform of $c(p)$ evaluated at a specific wavenumber, and also show how matched filtering recovers the range profile.
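One way to work the first half of this exercise numerically: place a few point scatterers near the origin, evaluate the Born sum exactly, and compare with the far-field prediction that the measurement equals a fixed phase/gain times the spatial Fourier transform of the reflectivity at $2\kappa$. Positions and reflectivities below are assumed for illustration:

```python
import numpy as np

C_LIGHT = 3e8
f0 = 3e9
kappa = 2 * np.pi * f0 / C_LIGHT

# 1D reflection geometry: tx and rx far to the left of a few point
# scatterers near the origin (all positions/reflectivities assumed)
p_m = np.array([-0.3, 0.0, 0.25])            # scatterer positions [m]
c_m = np.array([1.0, 0.8, 0.5 + 0.2j])       # complex reflectivities
s, r = -1000.0, -1200.0                      # tx / rx positions [m]

# exact Born sum: spreading loss times round-trip phase per scatterer
d1, d2 = p_m - s, p_m - r
x = np.sum(c_m / (d1 * d2) * np.exp(-1j * kappa * (d1 + d2)))

# far-field view: d1 + d2 = 2*p + (-s - r) exactly in 1D, so the
# measurement is a constant phase/gain times the spatial Fourier
# transform of the reflectivity evaluated at 2*kappa
C_2k = np.sum(c_m * np.exp(-1j * 2 * kappa * p_m))
pred = np.exp(-1j * kappa * (-s - r)) / (s * r) * C_2k

rel_err = abs(x - pred) / abs(pred)   # small: spreading-loss variation only
```

In 1D the round-trip phase is exactly linear in $p$, so the only discrepancy comes from the slowly varying spreading loss; this is View A in miniature, with the single k-space sample sitting at $2\kappa$.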

Diffraction Tomography vs. Radar/Wireless: Two Views of the Same Model

| Aspect | View A: Diffraction Tomography | View B: Radar / Wireless |
| --- | --- | --- |
| Analysis domain | Spatial frequency (k-space) | Space-time (range-angle) |
| Each measurement provides | A sample at a k-space point | A range-angle observation |
| Image formation | Inverse Fourier transform (NUFFT) | Matched filter + coherent combination |
| Resolution determined by | Extent of k-space coverage | Bandwidth (range) and aperture (cross-range) |
| Artifacts from | Gaps / non-uniformity in k-space | Sidelobes of the ambiguity function |
| Heritage | Medical imaging, optics, ultrasound | Radar, SAR, ISAC |
| Key references | Wolf 1969; Devaney 1982; Kak & Slaney 2001 | Cheney & Borden 2009; Richards 2014 |

Quick Check

Consider a single Tx-Rx pair operating at a single frequency $f_0$ in the far field of a target. In the diffraction tomography view, this measurement provides:

(a) A full image of the scene

(b) A single point in k-space (spatial frequency domain)

(c) A full range profile of the scene

(d) The reflectivity at one spatial location

Diffraction Tomography

An imaging approach that reconstructs a scene by viewing scattered-field data as samples of the scene's spatial Fourier transform. The image is formed by inverting this Fourier relationship, typically on a non-uniform grid. Originated in optics (Wolf 1969) and ultrasound, now applied to RF imaging.

Related: The Ewald Sphere, Fourier Diffraction Theorem

Backpropagation (Imaging)

The standard baseline image-formation method. For each pixel $\tilde{\mathbf{p}}$ in the image grid, backpropagation coherently sums all measurements after compensating for the known propagation phase and attenuation. Mathematically: $\hat{\mathbf{c}}^{\text{BP}} = \mathbf{A}^{H} \mathbf{D}^{-1} \mathbf{y}$, where $\mathbf{D}$ accounts for distance-dependent gains. Also called matched-filter imaging or delay-and-sum beamforming.

Related: The Sensing Matrix, Backpropagation (Matched-Filter) Imaging
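A toy sketch of the backpropagation formula above, using a random stand-in for the sensing matrix whose row gains play the role of the $1/(d_1 d_2)$ factors (all values assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# toy stand-in for the sensing matrix: random phases with per-row gains
# playing the role of the 1/(d1*d2) spreading factors (values assumed)
M, N = 64, 16
gains = rng.uniform(0.5, 2.0, size=M)
A = gains[:, None] * np.exp(1j * rng.uniform(0, 2 * np.pi, size=(M, N)))

c = np.zeros(N, dtype=complex)
c[5] = 1.0                                 # single point scatterer
y = A @ c                                  # noiseless measurements

D_inv = np.diag(1.0 / gains**2)            # compensates the two-way gains
c_bp = A.conj().T @ D_inv @ y              # backpropagation image
peak = int(np.argmax(np.abs(c_bp)))        # peaks at the true scatterer: 5
```

The on-target pixel adds $M$ aligned unit phasors after gain compensation, while off-target pixels sum random phases, so the point scatterer dominates the backpropagation image.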

k-Space

The spatial-frequency domain, also called wavenumber space. Each point $\boldsymbol{\kappa} = (\kappa_x, \kappa_y, \kappa_z)$ in k-space corresponds to a spatial-frequency component of the scene. The reflectivity's spatial Fourier transform $\tilde{c}(\boldsymbol{\kappa})$ lives in k-space.

Related: The Ewald Sphere, Fourier Diffraction Theorem

Common Mistake: Born Approximation Is Not Free

Mistake:

Treating the Born approximation as universally valid for RF imaging, regardless of the scattering strength or object size.

Correction:

The Born approximation requires weak scattering: the accumulated phase perturbation through the object must be small, roughly $|\kappa \chi a| \ll 1$, where $\chi$ is the contrast and $a \sim V^{1/3}$ is the object size ($V$ the object volume). For strong scatterers (e.g., metal objects, walls at oblique incidence), the Born approximation breaks down and iterative methods (DBIM, contrast source inversion) are needed. In Caire's framework, the Born model linearizes the inverse problem; the validity of this linearization must always be checked for the specific scenario.
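As a quick numeric check of the weak-scattering condition, using the accumulated-phase form $\kappa \chi a$ with $a \sim V^{1/3}$ the object size (all numbers are illustrative):

```python
import numpy as np

def born_phase_perturbation(f, chi, size):
    """Accumulated phase perturbation ~ kappa * chi * a across an object of
    size a [m] with contrast chi. Born needs this << 1 (rough heuristic)."""
    kappa = 2 * np.pi * f / 3e8
    return abs(kappa * chi * size)

weak = born_phase_perturbation(3e9, 1e-3, 0.1)    # ~0.006: Born plausible
strong = born_phase_perturbation(3e9, 1.0, 0.1)   # ~6.3: Born breaks down
```

A check like this belongs at the start of any system-design exercise in this book: if the metric is not small, the linear model $\mathbf{y} = \mathbf{A}\mathbf{c} + \mathbf{w}$ is only a first iterate, not the physics.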
