Preconditioning the Sensing Operator
Why Preconditioning Accelerates Reconstruction
Iterative reconstruction algorithms (gradient descent, ISTA, ADMM, OAMP) converge at a rate determined by the condition number $\kappa(A^H A)$ of the sensing operator $A$. When $\kappa \gg 1$ --- typical for RF imaging with incomplete angular coverage --- convergence can be painfully slow: the number of iterations scales linearly with $\kappa$ for first-order methods and as $\sqrt{\kappa}$ for accelerated methods. Preconditioning transforms the problem so that the effective condition number is closer to 1, dramatically reducing the iteration count. The Kronecker structure of $A = A_1 \otimes A_2 \otimes A_3$ makes certain preconditioners computationally cheap.
Definition: Preconditioner
Preconditioner
A preconditioner for the normal equations $A^H A x = A^H y$ is an invertible matrix $M$ such that the preconditioned system
$$M^{-1} A^H A \, x = M^{-1} A^H y$$
has a smaller effective condition number $\kappa_{\text{eff}} = \kappa(M^{-1} A^H A) \ll \kappa(A^H A)$.
An ideal preconditioner approximates $(A^H A)^{-1}$ while being cheap to apply: $M \approx A^H A$, but applying $M^{-1}$ costs only $O(n)$ or $O(n \log n)$.
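A minimal numerical sketch of this definition: symmetric preconditioning with a diagonal $M$ shrinks the condition number of a badly scaled normal-equations matrix. The operator and the column scaling below are arbitrary illustrations, not a model of any particular imaging system.

```python
# Sketch: a preconditioner shrinks the effective condition number.
# A and its column scaling are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 40)) * np.logspace(0, 2, 40)  # badly scaled columns
G = A.T @ A                                                 # normal-equations matrix

# Symmetric preconditioning with M = diag(G): form M^{-1/2} G M^{-1/2}
M_inv_sqrt = np.diag(1.0 / np.sqrt(np.diag(G)))
G_prec = M_inv_sqrt @ G @ M_inv_sqrt

kappa = np.linalg.cond(G)
kappa_eff = np.linalg.cond(G_prec)
assert kappa_eff < kappa   # the preconditioned system is better conditioned
print(f"kappa = {kappa:.1e}, kappa_eff = {kappa_eff:.1e}")
```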
Definition: Diagonal (Jacobi) Preconditioner
Diagonal (Jacobi) Preconditioner
The simplest preconditioner is the diagonal (Jacobi) preconditioner:
$$M_{\text{Jac}} = \operatorname{diag}(A^H A) = \operatorname{diag}\!\left(\|a_1\|^2, \ldots, \|a_n\|^2\right),$$
where $a_j$ is the $j$-th column of $A$. This equalizes the column norms, correcting for the non-uniform illumination pattern.
For a Kronecker-structured matrix $A = A_1 \otimes A_2 \otimes A_3$, the diagonal of $A^H A$ is itself a Kronecker product:
$$\operatorname{diag}(A^H A) = \operatorname{diag}(A_1^H A_1) \otimes \operatorname{diag}(A_2^H A_2) \otimes \operatorname{diag}(A_3^H A_3),$$
requiring storage for only $n_1 + n_2 + n_3$ values instead of $n_1 n_2 n_3$.
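A quick NumPy check of this identity, shown with two factors of arbitrary size for brevity: the diagonal of the full Gram matrix equals the Kronecker product of the per-factor diagonals.

```python
# Sketch: diag((A1 ⊗ A2)^H (A1 ⊗ A2)) = diag(A1^H A1) ⊗ diag(A2^H A2),
# so only n1 + n2 values need to be stored instead of n1 * n2.
import numpy as np

rng = np.random.default_rng(0)
A1 = rng.standard_normal((8, 5)) + 1j * rng.standard_normal((8, 5))
A2 = rng.standard_normal((6, 4)) + 1j * rng.standard_normal((6, 4))

# Direct computation on the full matrix (O(n1 * n2) storage).
A = np.kron(A1, A2)
d_full = np.diag(A.conj().T @ A).real

# Factorized computation: store only the two factor diagonals.
d1 = np.sum(np.abs(A1) ** 2, axis=0)   # diag(A1^H A1), length n1
d2 = np.sum(np.abs(A2) ** 2, axis=0)   # diag(A2^H A2), length n2
d_kron = np.kron(d1, d2)

assert np.allclose(d_full, d_kron)
```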
In Caire's framework (Eq. 24--25), the backpropagation operator uses a diagonal matrix that accounts for path loss and antenna gain variations. This is precisely the diagonal preconditioner applied in the measurement domain.
Theorem: Kronecker Preconditioning via Factor Inversion
For $A = A_1 \otimes A_2 \otimes A_3$, define the Kronecker preconditioner:
$$M = (A_1^H A_1 + \epsilon_1 I) \otimes (A_2^H A_2 + \epsilon_2 I) \otimes (A_3^H A_3 + \epsilon_3 I),$$
where $\epsilon_i > 0$ are regularization parameters. Then:
- $M^{-1}$ can be applied in $O(n_1^3 + n_2^3 + n_3^3)$ time by inverting each factor separately.
- The effective condition number is
$$\kappa_{\text{eff}} = \kappa(M^{-1} A^H A) = \prod_{i=1}^{3} \frac{\sigma_{\max,i}^2 \,(\sigma_{\min,i}^2 + \epsilon_i)}{\sigma_{\min,i}^2 \,(\sigma_{\max,i}^2 + \epsilon_i)},$$
where $\sigma_{\max,i}$, $\sigma_{\min,i}$ are the extreme singular values of $A_i$.
- With $\epsilon_i = 0$, $\kappa_{\text{eff}} = 1$ (perfect preconditioning), but this requires each factor $A_i^H A_i$ to be invertible.
Since $A^H A$ factors as a Kronecker product, we can invert it factor by factor instead of inverting the full $n \times n$ matrix. The regularization parameters $\epsilon_i$ handle the case when individual factors are rank-deficient, at the cost of imperfect preconditioning.
Kronecker inverse
By the Kronecker inverse property $(B \otimes C)^{-1} = B^{-1} \otimes C^{-1}$:
$$M^{-1} = (A_1^H A_1 + \epsilon_1 I)^{-1} \otimes (A_2^H A_2 + \epsilon_2 I)^{-1} \otimes (A_3^H A_3 + \epsilon_3 I)^{-1}.$$
Each factor inverse has size $n_i \times n_i$, so the total computation is $O(n_1^3 + n_2^3 + n_3^3)$ compared to $O((n_1 n_2 n_3)^3)$ for direct inversion.
Example: Preconditioned CG for Tikhonov Reconstruction
A 2D imaging system with $A = A_1 \otimes A_2$ has $\kappa(A^H A) = \kappa(A_1^H A_1)\,\kappa(A_2^H A_2)$. The Tikhonov solution
$$\hat{x} = (A^H A + \lambda I)^{-1} A^H y$$
is computed by conjugate gradient (CG) on the normal equations.
(a) How many CG iterations are needed without preconditioning? (b) How many with the Kronecker preconditioner, given $\epsilon_1 = \epsilon_2 = \lambda$ and the resulting effective condition number $\kappa_{\text{eff}}$?
Unpreconditioned CG convergence
CG reduces the error by a factor of $\frac{\sqrt{\kappa} - 1}{\sqrt{\kappa} + 1}$ per iteration, where $\kappa = \kappa(A^H A + \lambda I)$. For small $\lambda$, $\kappa \approx \kappa(A^H A)$. To reduce the error by a factor $\epsilon_{\text{tol}}$:
$$k \approx \frac{\sqrt{\kappa}}{2} \ln\frac{2}{\epsilon_{\text{tol}}}.$$
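This bound is easy to evaluate. The sketch below (the helper name and the sample condition numbers are illustrative) computes the iteration count implied by the classical CG convergence bound:

```python
# Sketch: iterations implied by the classical CG bound
# 2 * ((sqrt(kappa)-1)/(sqrt(kappa)+1))**k <= tol.
import math

def cg_iterations(kappa: float, tol: float = 1e-6) -> int:
    """Iterations for the CG worst-case error bound to fall below tol."""
    r = (math.sqrt(kappa) - 1) / (math.sqrt(kappa) + 1)
    return math.ceil(math.log(tol / 2) / math.log(r))

# Hypothetical condition numbers: ill-conditioned vs. well-preconditioned.
print(cg_iterations(1e4))   # hundreds of iterations
print(cg_iterations(10))    # tens of iterations
```

Because the bound scales as $\sqrt{\kappa}$, shrinking $\kappa$ from $10^4$ to $10$ cuts the iteration count by roughly a factor of 30.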
Kronecker-preconditioned CG
The Kronecker preconditioner reduces the effective condition number to $\kappa_{\text{eff}}$, a value close to 1 (the residual mismatch from regularization). CG now converges in
$$k_{\text{prec}} \approx \frac{\sqrt{\kappa_{\text{eff}}}}{2} \ln\frac{2}{\epsilon_{\text{tol}}}$$
iterations.
Speedup: roughly $\sqrt{\kappa/\kappa_{\text{eff}}}$ fewer iterations, with each iteration costing only marginally more due to the preconditioner application.
Preconditioned CG with Kronecker Structure
Complexity: each iteration costs $O\!\big(n(n_1 + n_2 + n_3)\big)$ for the Kronecker matvec, with $n = n_1 n_2 n_3$, plus the same order for the preconditioner. Total iterations: $O\!\big(\sqrt{\kappa_{\text{eff}}}\,\log(1/\epsilon_{\text{tol}})\big)$. The preconditioner application is itself a Kronecker matvec: reshape $x$ into an $n_1 \times n_2 \times n_3$ tensor and multiply each mode $i$ by $(A_i^H A_i + \epsilon_i I)^{-1}$. No large matrix is ever formed.
Preconditioning Effect on Convergence
Compare the convergence of unpreconditioned and Kronecker-preconditioned conjugate gradient for a Tikhonov reconstruction problem. Observe how the number of iterations drops dramatically with preconditioning, especially for ill-conditioned systems.
k-Space Density Compensation (Reweighting)
When the k-space sampling is non-uniform (typical for multi-static systems), the backprojection image overemphasizes densely sampled regions. Density compensation reweights the measurements before backprojection:
$$\hat{x}_{\text{DC}} = A^H W y, \qquad W = \operatorname{diag}(w_1, \ldots, w_M),$$
where $w_i$ is inversely proportional to the local k-space sampling density around the $i$-th measurement.
This is equivalent to a left preconditioner $W$ that approximately whitens the normal equations. For Kronecker-structured $A$, the weights factor as $W = W_1 \otimes W_2 \otimes W_3$ when the sampling density is separable.
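When the density is separable, only per-axis weight vectors need to be stored. A small NumPy sketch with made-up Gaussian sampling densities (the grid sizes and density shapes are illustrative):

```python
# Sketch: separable density compensation. If rho(kx, ky) = rho_x(kx)*rho_y(ky),
# then W = diag(w_x) ⊗ diag(w_y) with w ∝ 1/rho; only per-axis vectors stored.
import numpy as np

kx = np.linspace(-1, 1, 64)
ky = np.linspace(-1, 1, 48)
rho_x = 1.0 + 4.0 * np.exp(-kx**2 / 0.1)   # denser near the k-space origin
rho_y = 1.0 + 4.0 * np.exp(-ky**2 / 0.1)

w_x, w_y = 1.0 / rho_x, 1.0 / rho_y        # inverse-density weights
W = np.outer(w_x, w_y)                     # full weight grid = kron of 1-D weights
assert np.allclose(W.reshape(-1), np.kron(w_x, w_y))
```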
Quick Check
Which statement about Kronecker preconditioning is correct?
The preconditioner application costs $O(n^3)$, where $n$ is the number of voxels.
The Kronecker preconditioner inverts each factor separately, costing $O(n_1^3 + n_2^3 + n_3^3)$.
Preconditioning increases the condition number to improve convergence.
Diagonal preconditioning is always sufficient for RF imaging.
This is the key advantage: inverting three small matrices instead of one enormous one, enabled by the Kronecker product inverse property.
Key Takeaway
Preconditioning reduces the effective condition number and accelerates iterative reconstruction. The Kronecker structure enables a powerful preconditioner that inverts each factor separately at cost $O(n_1^3 + n_2^3 + n_3^3)$ instead of $O(n^3)$. For a typical system, this reduces CG iterations from thousands to tens. k-Space density compensation provides an alternative view as measurement-domain reweighting.