Two Views of the Same Forward Model
The Central Chapter
This is the golden-thread chapter of the entire book. Everything before it builds toward the forward model; everything after it develops methods to invert it. The key insight, due to Caire, is that two seemingly different communities (diffraction tomography in physics and medical imaging, and radar/wireless sensing) have been working with the same mathematical object, viewed from different angles. Understanding both views and their equivalence is what separates a practitioner who can tune algorithms from one who can design imaging systems.
Definition: Born-Approximation Scattered Field
Consider a transmitter at position $\mathbf{p}_t$ radiating a monochromatic signal at frequency $f$, with wavenumber $k = 2\pi f/c$. Under the first Born approximation (single scattering, weak contrast), the signal received at position $\mathbf{p}_r$ is

$$y(\mathbf{p}_t, \mathbf{p}_r, f) = \int_{\mathcal{D}} \frac{\rho(\mathbf{x})}{d_t(\mathbf{x})\, d_r(\mathbf{x})}\, e^{-jk\left(d_t(\mathbf{x}) + d_r(\mathbf{x})\right)}\, d\mathbf{x},$$

where $\rho(\mathbf{x})$ is the complex reflectivity at position $\mathbf{x}$, $d_t(\mathbf{x}) = \|\mathbf{p}_t - \mathbf{x}\|$ and $d_r(\mathbf{x}) = \|\mathbf{p}_r - \mathbf{x}\|$ are the Euclidean distances from transmitter and receiver to $\mathbf{x}$, and $\mathcal{D}$ is the scattering domain.

The factor $1/(d_t(\mathbf{x})\, d_r(\mathbf{x}))$ is the two-way geometric spreading (path loss). The exponential $e^{-jk(d_t(\mathbf{x}) + d_r(\mathbf{x}))}$ is the round-trip propagation phase.
This integral is the foundation of all coherent RF imaging. The entire chapter develops two complementary ways of analyzing it: one in the spatial-frequency (wavenumber) domain, and one in the space-time domain.
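As a concrete illustration, the integral can be discretized into a sum over point scatterers. The NumPy sketch below evaluates the Born-approximation field for a handful of scatterers; all positions, frequencies, and reflectivities are illustrative choices, not values from the text.

```python
import numpy as np

c = 3e8                        # speed of light (m/s)
f = 10e9                       # carrier frequency (Hz), illustrative
k = 2 * np.pi * f / c          # wavenumber (rad/m)

p_t = np.array([-1.0, 0.0])    # transmitter position (m), illustrative
p_r = np.array([1.0, 0.0])     # receiver position (m), illustrative

# Discretized scene: point scatterers with complex reflectivities
scatterers = np.array([[0.0, 5.0], [0.3, 5.2]])
rho = np.array([1.0 + 0.0j, 0.5 - 0.2j])

# Born sum: each scatterer contributes rho * exp(-jk(d_t + d_r)) / (d_t * d_r)
d_t = np.linalg.norm(scatterers - p_t, axis=1)   # Tx-to-scatterer distances
d_r = np.linalg.norm(scatterers - p_r, axis=1)   # scatterer-to-Rx distances
y = np.sum(rho * np.exp(-1j * k * (d_t + d_r)) / (d_t * d_r))
```

Replacing the integral with a sum is exactly the discretization used later for the linear observation model.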
View A: Diffraction Tomography
In the diffraction tomography (DT) view, we work in the spatial-frequency (wavenumber) domain. Each measurement triple (transmitter $\mathbf{p}_t$, receiver $\mathbf{p}_r$, frequency $f$) contributes a sample at a specific point in wavenumber space (k-space). The collection of all such samples determines a region of k-space that we have "illuminated."
Imaging in this view = inverting a Fourier transform on irregularly sampled k-space data.
The resolution is determined by the extent of k-space coverage (larger coverage = finer resolution), and artifacts arise from gaps or non-uniform density in the sampling pattern. This view connects RF imaging to X-ray CT (Fourier Slice Theorem), MRI (k-space sampling), and ultrasound diffraction tomography.
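Under the far-field approximation, the k-space point contributed by one (Tx, Rx, frequency) triple is $k(\hat{u}_t + \hat{u}_r)$ (sign conventions vary between communities). A minimal sketch with illustrative geometry:

```python
import numpy as np

c = 3e8
f = 10e9
k = 2 * np.pi * f / c

p_t = np.array([-2.0, 0.0])       # transmitter (m), illustrative
p_r = np.array([2.0, 0.0])        # receiver (m), illustrative
center = np.array([0.0, 5.0])     # target-region center, illustrative

# Unit directions from the target center toward Tx and Rx
u_t = (p_t - center) / np.linalg.norm(p_t - center)
u_r = (p_r - center) / np.linalg.norm(p_r - center)

# The k-space sample contributed by this (Tx, Rx, f) triple;
# its magnitude is at most 2k (the monostatic backscatter limit)
k_sample = k * (u_t + u_r)
```

Sweeping Tx/Rx positions and frequencies traces out the illuminated region of k-space described above.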
View B: Radar / Wireless Sensing
In the radar/wireless view, we work in the space-time domain. Each transmitter-receiver pair performs matched filtering against a known transmitted waveform, extracting the range (delay) profile of the scene. The image is formed by coherent combination of these range profiles across all Tx-Rx pairs, via delay-and-sum beamforming, back-projection, or more sophisticated combining rules.
Imaging in this view = matched filtering + coherent multi-view combination.
This is the language of radar signal processing, SAR/ISAR, and ISAC systems. Resolution is described through the ambiguity function, range resolution $\Delta r = c/(2B)$ for bandwidth $B$, and angular resolution $\Delta\theta \approx \lambda/L$, where $L$ is the aperture length.
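Plugging in illustrative numbers (a 1 GHz bandwidth, 10 GHz carrier, and 0.5 m aperture, none of which come from the text) makes the resolution formulas concrete:

```python
c = 3e8          # speed of light (m/s)
B = 1e9          # bandwidth (Hz), illustrative
f = 10e9         # carrier frequency (Hz), illustrative
lam = c / f      # wavelength: 0.03 m
L = 0.5          # aperture length (m), illustrative

delta_r = c / (2 * B)      # range resolution: 0.15 m
delta_theta = lam / L      # angular resolution: 0.06 rad (~3.4 degrees)
```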
Key Takeaway
Caire's central insight: both views derive from the same Born-approximation forward model. The difference is purely one of analysis domain: whether you examine the scattered-field integral in the spatial-frequency domain (View A) or in the space-time domain (View B). The physics is identical; the mathematics is related by a Fourier transform. This unification means that insights from either community (DT or radar) can be translated to the other, and system design can leverage tools from both.
Caire's Unified Illumination and Sensing Model
Caire's research note provides the unifying framework that this chapter follows. Starting from the Born-approximation diffraction integral, Caire shows that:
- A first-order Taylor expansion of the propagation distances around the target center converts the phase term into a linear function of the displacement $\mathbf{x} - \mathbf{x}_0$ from the center $\mathbf{x}_0$, yielding a Fourier-transform relationship between the reflectivity and the scattered data (View A).
- When transmitter and receiver are antenna arrays, the same integral factors into steering vectors, delay terms, and beamforming gains: the standard radar/MIMO channel model (View B).
- Both views lead to the same linear observation model $\mathbf{y} = \mathbf{A}\boldsymbol{\rho} + \mathbf{n}$, with the sensing matrix $\mathbf{A}$ encoding the geometry, frequencies, and antenna configurations.
The note also develops the link-budget normalization, the Kronecker structure of $\mathbf{A}$ (Ch 07), and the connection between regularized inverse problems and image reconstruction.
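The linear observation model can be made concrete by stacking one row of the sensing matrix per measurement. The sketch below uses an illustrative single Tx-Rx pair, three frequencies, and a 1D row of pixels (all values chosen for illustration, not from the note):

```python
import numpy as np

c = 3e8
freqs = np.array([9.5e9, 10.0e9, 10.5e9])   # measurement frequencies (Hz)
p_t = np.array([-1.0, 0.0])                 # transmitter (m)
p_r = np.array([1.0, 0.0])                  # receiver (m)

# Pixel grid: 11 pixels along a line at 5 m range
pixels = np.stack([np.linspace(-0.5, 0.5, 11), np.full(11, 5.0)], axis=1)

d_t = np.linalg.norm(pixels - p_t, axis=1)
d_r = np.linalg.norm(pixels - p_r, axis=1)

# Sensing matrix A: rows = (Tx, Rx, frequency) measurements, columns = pixels
k = 2 * np.pi * freqs[:, None] / c
A = np.exp(-1j * k * (d_t + d_r)[None, :]) / (d_t * d_r)[None, :]

# Linear observation model y = A rho (noise term omitted)
rho = np.zeros(11, dtype=complex)
rho[5] = 1.0                                # single scatterer at the center pixel
y = A @ rho
```

Each row of A is simply the discretized Born integrand for one measurement, so the same matrix serves both views.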
Historical Note: Two Communities, One Problem
1969–2025. The diffraction-tomography view of imaging dates to Wolf (1969), who showed that coherent scattered data samples the spatial Fourier transform of the scattering potential along semicircular arcs (the Ewald sphere). Devaney (1982) formalized this as the Fourier Diffraction Theorem, connecting it to the Fourier Slice Theorem of CT. Meanwhile, radar engineers from the 1950s onward developed matched-filter imaging, synthetic aperture radar, and range-Doppler processing in the space-time domain.
For decades, these communities worked largely in parallel. The medical/optical imaging community used k-space language; the radar community used ambiguity functions and point-spread functions. The realization that both are faces of the same Born model became increasingly clear with the rise of MIMO radar in the 2000s and ISAC in the 2020s, where the transmitter is a communications array, forcing a merger of the two viewpoints.
Example: A Simple 1D Example: Two Views of a Single-Frequency Measurement
Consider a 1D scene with a single transmitter at position $\mathbf{p}_t$ and a single receiver at position $\mathbf{p}_r$, both in the far field of a target region centered at the origin. A single frequency $f$ is used (wavenumber $k = 2\pi f/c$). Show that the measurement is proportional to the spatial Fourier transform of the reflectivity $\rho(x)$ evaluated at a specific wavenumber, and also show how matched filtering recovers the range profile.
View A: Fourier domain
Under the far-field Born approximation (Ch 05), the 1D scattered field is (up to a constant amplitude and phase)

$$y(f) \propto \int_{\mathcal{D}} \rho(x)\, e^{-jk(\hat{u}_t + \hat{u}_r)\,x}\, dx,$$

where $\hat{u}_t$ and $\hat{u}_r$ are the components along the scene axis of the unit direction vectors from the origin to $\mathbf{p}_t$ and $\mathbf{p}_r$. This is a 1D Fourier transform of $\rho(x)$ evaluated at the spatial frequency $k_x = k(\hat{u}_t + \hat{u}_r)$. A single measurement gives a single point in k-space.
View B: Matched filter
In the time/space domain, the measurement corresponds to the round-trip delay $\tau = (d_t + d_r)/c$. For a known transmitted pulse, matched filtering (correlating the received signal with a delayed replica) produces a peak at the delay corresponding to each scatterer. The range profile is the set of delays weighted by reflectivity; this is the space-domain description of the same information.
Connection
The two descriptions are Fourier duals: the matched-filter range profile is the inverse Fourier transform of the k-space samples. With a single frequency, we get one k-space point and hence no range resolution (we need bandwidth for that). With multiple frequencies spanning a bandwidth $B$, we sample an interval in k-space, and the inverse Fourier transform yields a range profile with resolution $\Delta r = c/(2B)$.
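The duality can be verified numerically: sample one point scatterer at many frequencies (a row of k-space samples), inverse-FFT across frequency, and read off the delay peak. All numbers below are illustrative.

```python
import numpy as np

c = 3e8
B = 1e9                                    # bandwidth (Hz), illustrative
N = 128                                    # number of frequency samples
freqs = 9.5e9 + np.arange(N) * (B / N)     # stepped-frequency sweep

d_t, d_r = 6.0, 4.0                        # Tx->target, target->Rx (m)
tau = (d_t + d_r) / c                      # round-trip delay (s)

# One frequency-domain (k-space) sample per frequency, unit scatterer
y = np.exp(-2j * np.pi * freqs * tau)

# Inverse DFT across frequency gives the range (delay) profile
profile = np.abs(np.fft.ifft(y))
delays = np.arange(N) / B                  # delay bins, spacing 1/B
est_path = delays[np.argmax(profile)] * c  # estimated round-trip path length
```

The peak lands within one delay bin (1/B, i.e. a path-length cell of c/B) of the true 10 m round trip, matching the bandwidth-limited resolution argument above.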
Diffraction Tomography vs. Radar/Wireless: Two Views of the Same Model
| Aspect | View A: Diffraction Tomography | View B: Radar / Wireless |
|---|---|---|
| Analysis domain | Spatial frequency (k-space) | Space-time (range-angle) |
| Each measurement provides | A sample at a k-space point | A range-angle observation |
| Image formation | Inverse Fourier transform (NUFFT) | Matched filter + coherent combination |
| Resolution determined by | Extent of k-space coverage | Bandwidth (range) and aperture (cross-range) |
| Artifacts from | Gaps / non-uniformity in k-space | Sidelobes of the ambiguity function |
| Heritage | Medical imaging, optics, ultrasound | Radar, SAR, ISAC |
| Key references | Wolf 1969, Devaney 1982, Kak & Slaney 2001 | Cheney & Borden 2009, Richards 2014 |
Quick Check
Consider a single Tx-Rx pair operating at a single frequency in the far field of a target. In the diffraction tomography view, this measurement provides:
- A full image of the scene
- A single point in k-space (spatial frequency domain)
- A full range profile of the scene
- The reflectivity at one spatial location

Answer: a single point in k-space. Each (Tx, Rx, frequency) triple maps to exactly one point in k-space. We need many such measurements (varying Tx, Rx, and/or frequency) to fill k-space and form an image.
Diffraction Tomography
An imaging approach that reconstructs a scene by viewing scattered-field data as samples of the scene's spatial Fourier transform. The image is formed by inverting this Fourier relationship, typically on a non-uniform grid. Originated in optics (Wolf 1969) and ultrasound, now applied to RF imaging.
Related: The Ewald Sphere, Fourier Diffraction Theorem
Backpropagation (Imaging)
The standard baseline image formation method. For each pixel in the image grid, backpropagation coherently sums all measurements after compensating for the known propagation phase and attenuation. Mathematically: $\hat{\rho}(\mathbf{x}) = \sum_{m} w_m(\mathbf{x})\, y_m\, e^{+jk_m\left(d_{t,m}(\mathbf{x}) + d_{r,m}(\mathbf{x})\right)}$, where $m$ indexes the measurements and the weight $w_m(\mathbf{x})$ accounts for distance-dependent gains. Also called matched-filter imaging or delay-and-sum beamforming.
Related: The Sensing Matrix, Backpropagation (Matched-Filter) Imaging
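A minimal backpropagation sketch along the lines of the definition above: a single Tx-Rx pair with a frequency sweep localizes one simulated scatterer in range. All geometry and numbers are illustrative.

```python
import numpy as np

c = 3e8
freqs = np.linspace(9e9, 11e9, 64)          # frequency sweep (Hz), illustrative
k_all = 2 * np.pi * freqs / c
p_t = np.array([-1.0, 0.0])                 # transmitter (m)
p_r = np.array([1.0, 0.0])                  # receiver (m)
target = np.array([0.0, 5.1])               # true scatterer position (m)

def dist(p, x):
    return np.linalg.norm(x - p)

# Simulated Born measurements for a unit scatterer
dt0, dr0 = dist(p_t, target), dist(p_r, target)
y = np.exp(-1j * k_all * (dt0 + dr0)) / (dt0 * dr0)

# Backpropagation: for each candidate pixel, undo phase and spreading, sum
ys = np.linspace(4.5, 5.5, 41)              # pixels along range (x = 0)
image = np.zeros(len(ys))
for i, py in enumerate(ys):
    x = np.array([0.0, py])
    dt, dr = dist(p_t, x), dist(p_r, x)
    w = dt * dr                             # compensate 1/(dt*dr) spreading
    image[i] = np.abs(np.sum(w * y * np.exp(1j * k_all * (dt + dr))))

est = ys[np.argmax(image)]                  # peak sits at the true range, 5.1 m
```

With only one Tx-Rx pair the cross-range direction is unresolved; the sweep scans range, where the 2 GHz bandwidth gives a well-localized peak.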
k-Space
The spatial-frequency domain, also called wavenumber space. Each point in k-space corresponds to a spatial frequency component of the scene. The reflectivity's spatial Fourier transform lives in k-space.
Related: The Ewald Sphere, Fourier Diffraction Theorem
Common Mistake: Born Approximation Is Not Free
Mistake:
Treating the Born approximation as universally valid for RF imaging, regardless of the scattering strength or object size.
Correction:
The Born approximation requires weak scattering: the accumulated phase perturbation through the object must be small, roughly $k\,|\chi|\,D \ll 1$, where $\chi$ is the contrast and $D$ is the object size (diameter). For strong scatterers (e.g., metal objects, walls at oblique incidence), the Born approximation breaks down and iterative methods (DBIM, contrast source inversion) are needed. In Caire's framework, the Born model linearizes the inverse problem; the validity of this linearization must always be checked for the specific scenario.
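A back-of-envelope check of this rough criterion; the contrast and object size below are illustrative numbers, not values from the text.

```python
import numpy as np

c = 3e8
f = 10e9
k = 2 * np.pi * f / c            # ~209 rad/m at 10 GHz

chi = 0.01                       # contrast (weak scatterer), illustrative
D = 0.05                         # object size (m), illustrative

# Accumulated phase perturbation through the object; Born needs this << 1
phase_perturbation = k * abs(chi) * D
born_plausible = phase_perturbation < 0.5
```

With a metal object ($|\chi|$ of order 1) even a few-centimeter size violates the criterion, which is why such scenes need the iterative methods mentioned above.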