Coherence and RIP for Structured Sensing Matrices

From Random to Structured Compressed Sensing

In FSI Ch 13, compressed sensing theory provides powerful recovery guarantees: if the sensing matrix satisfies the Restricted Isometry Property (RIP), then sparse signals can be recovered from far fewer measurements than the ambient dimension. The catch is that these guarantees are proven for random matrices (Gaussian, sub-Gaussian, random partial Fourier). Our physical sensing matrix $\mathbf{A}$ is highly structured, and the standard RIP results do not apply directly. This section examines what coherence and RIP-like properties $\mathbf{A}$ actually possesses, and what this means for sparse reconstruction in RF imaging.

Definition:

Mutual Coherence

The mutual coherence of a matrix $\mathbf{A} \in \mathbb{C}^{M \times N}$ with columns $\mathbf{A}_{1}, \ldots, \mathbf{A}_{N}$ is

$$\mu(\mathbf{A}) = \max_{p \neq q} \frac{|\langle \mathbf{A}_{p}, \mathbf{A}_{q} \rangle|}{\|\mathbf{A}_{p}\| \|\mathbf{A}_{q}\|} = \max_{p \neq q} \frac{|[\mathbf{A}^{H}\mathbf{A}]_{pq}|}{\sqrt{[\mathbf{A}^{H}\mathbf{A}]_{pp} [\mathbf{A}^{H}\mathbf{A}]_{qq}}}.$$

The coherence satisfies $\mu \in [0, 1]$, with $\mu = 0$ meaning all columns are orthogonal and $\mu = 1$ meaning two columns are collinear (indistinguishable).

The Welch bound provides a lower limit:

$$\mu(\mathbf{A}) \geq \sqrt{\frac{N - M}{M(N - 1)}}.$$

In the overcomplete case $N > M$ (typical in imaging), the Welch bound is approximately $1/\sqrt{M}$ when $N \gg M$. Equiangular tight frames achieve this bound, but physical sensing matrices generally do not.
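Both quantities are easy to compute numerically. A minimal sketch (the function names are our own) that evaluates the coherence of a random complex matrix and compares it against the Welch bound:

```python
import numpy as np

def mutual_coherence(A):
    """Maximum absolute normalized inner product between distinct columns."""
    G = A.conj().T @ A                      # Gram matrix A^H A
    norms = np.sqrt(np.real(np.diag(G)))
    C = np.abs(G) / np.outer(norms, norms)  # normalized cross-correlations
    np.fill_diagonal(C, 0.0)                # exclude the p == q diagonal
    return float(C.max())

def welch_bound(M, N):
    """Universal lower bound on coherence for any M x N matrix, N >= M."""
    return float(np.sqrt((N - M) / (M * (N - 1))))

rng = np.random.default_rng(0)
M, N = 64, 256
A = rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))
print(mutual_coherence(A), welch_bound(M, N))  # random draws sit above the bound
```

The Gram-matrix form mirrors the second expression in the definition, so column normalization is handled by the diagonal entries rather than requiring pre-normalized columns.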

Theorem: Coherence-Based Recovery Guarantee

If $\mathbf{c}$ is $s$-sparse and the mutual coherence satisfies

$$s < \frac{1}{2}\left(1 + \frac{1}{\mu(\mathbf{A})}\right),$$

then both Basis Pursuit (the $\ell_1$ relaxation) and OMP recover $\mathbf{c}$ exactly from $\mathbf{y} = \mathbf{A}\mathbf{c}$ in the noiseless case.

Low coherence means that no two columns of $\mathbf{A}$ point in nearly the same direction, so the contribution of any sparse component can be distinguished from every other. For RF imaging, this translates to the requirement that no two voxels produce nearly identical measurement signatures, which fails when two voxels are too close together (within the resolution cell) or when grating lobes create ambiguities.
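The theorem converts a measured coherence directly into a certified sparsity level; a one-function sketch (the ceiling-minus-one handles the strict inequality):

```python
import numpy as np

def max_guaranteed_sparsity(mu):
    """Largest integer s satisfying s < (1 + 1/mu) / 2."""
    bound = 0.5 * (1.0 + 1.0 / mu)
    return int(np.ceil(bound)) - 1

print(max_guaranteed_sparsity(0.5))   # bound is 1.5, so only s = 1 is certified
print(max_guaranteed_sparsity(0.05))  # bound is 10.5, so s up to 10
```

Note how quickly the guarantee collapses: a coherence of 0.5 certifies only single-scatterer scenes, which is why coherence-based bounds are considered pessimistic for structured matrices.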

Definition:

Restricted Isometry Property (RIP)

A matrix $\mathbf{A}$ satisfies the Restricted Isometry Property of order $s$ with constant $\delta_s \in [0, 1)$ if for all $s$-sparse vectors $\mathbf{c}$:

$$(1 - \delta_s)\|\mathbf{c}\|_2^2 \leq \|\mathbf{A}\mathbf{c}\|_2^2 \leq (1 + \delta_s)\|\mathbf{c}\|_2^2.$$

Equivalently, every $M \times s$ sub-matrix formed by selecting $s$ columns of $\mathbf{A}$ has all singular values in $[\sqrt{1 - \delta_s}, \sqrt{1 + \delta_s}]$.

The RIP guarantees that $\mathbf{A}$ approximately preserves the $\ell_2$ norm of every sparse vector, a far stronger condition than low coherence, which only controls pairwise column correlations. RIP with $\delta_{2s} < \sqrt{2} - 1$ is sufficient for exact recovery of $s$-sparse signals via Basis Pursuit.
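For tiny instances, the sub-matrix characterization can be checked by brute force. This is exponential in $N$ and serves only as an illustration (function name is ours):

```python
import numpy as np
from itertools import combinations

def rip_constant(A, s):
    """Brute-force RIP constant delta_s: worst squared-singular-value
    deviation from 1 over all s-column sub-matrices.
    Assumes unit-norm columns; cost is C(N, s) SVDs."""
    delta = 0.0
    for idx in combinations(range(A.shape[1]), s):
        sv = np.linalg.svd(A[:, idx], compute_uv=False)
        delta = max(delta, sv[0]**2 - 1.0, 1.0 - sv[-1]**2)
    return delta

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 12))
A /= np.linalg.norm(A, axis=0)   # normalize columns
print(rip_constant(A, 2))
```

A useful sanity check linking the two definitions: for unit-norm columns, $\delta_2$ equals the mutual coherence, since a two-column Gram matrix has eigenvalues $1 \pm |\langle \mathbf{A}_p, \mathbf{A}_q \rangle|$.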

Example: Coherence of a ULA Sensing Matrix

A ULA with $N_{\text{virt}} = 32$ virtual elements at half-wavelength spacing images a scene of $N = 64$ angular bins uniformly spaced over $[-90^\circ, 90^\circ]$. Compute the mutual coherence $\mu(\mathbf{A})$.
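A possible solution sketch: build the steering-vector matrix for the stated geometry and apply the coherence definition directly (the grid and normalization conventions below are our assumptions for the stated setup):

```python
import numpy as np

N_virt, N_bins = 32, 64
theta = np.deg2rad(np.linspace(-90.0, 90.0, N_bins))  # uniform angular grid
k = np.arange(N_virt)[:, None]
# Half-wavelength ULA steering vectors: exp(j*pi*k*sin(theta)), unit-norm columns
A = np.exp(1j * np.pi * k * np.sin(theta)[None, :]) / np.sqrt(N_virt)

G = np.abs(A.conj().T @ A)
np.fill_diagonal(G, 0.0)
mu = G.max()
print(f"mu = {mu:.4f}")  # essentially 1: the endfire bins are (nearly) collinear
```

The result is dominated by the endfire bins: $\theta = \pm 90^\circ$ map to $u = \sin\theta = \pm 1$, which alias to the same steering vector under half-wavelength sampling, and even away from the endpoints adjacent bins near endfire are nearly collinear because $\sin\theta$ varies slowly there. A grid uniform in $\sin\theta$ (DFT-like) avoids this pathology.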

Theorem: RIP for Kronecker-Structured Matrices

Let $\mathbf{A} = \mathbf{A}_{2} \otimes \mathbf{A}_{1}$ where $\mathbf{A}_{k} \in \mathbb{C}^{m_k \times n_k}$. If each factor satisfies the RIP of order $s_k$ with constant $\delta_{s_k}^{(k)}$, then the Kronecker product satisfies the RIP of order $s_1 s_2$ with constant

$$\delta_{s_1 s_2} \leq \delta_{s_1}^{(1)} + \delta_{s_2}^{(2)} + \delta_{s_1}^{(1)}\delta_{s_2}^{(2)}.$$

In particular, if both factors have $\delta_{s_k} \leq \delta$, then $\delta_{s_1 s_2} \leq 2\delta + \delta^2$.

This result (Jokar and Mehrmann, 2009) shows that Kronecker products inherit RIP from their factors, with the sparsity level being multiplicative: if each factor handles $s_k$-sparse signals, the product handles $s_1 s_2$-sparse signals. The constant degrades additively, which is acceptable for moderate individual RIP constants.
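A companion fact, easy to verify numerically: for unit-norm columns, the mutual coherence of a Kronecker product is $\mu(\mathbf{A}_2 \otimes \mathbf{A}_1) = \max(\mu(\mathbf{A}_1), \mu(\mathbf{A}_2))$, because inner products of Kronecker columns factor into products of factor inner products, and the maximum is attained when one factor index pair coincides. A quick check:

```python
import numpy as np

def coherence(A):
    """Mutual coherence, assuming columns are already unit-norm."""
    G = np.abs(A.conj().T @ A)
    np.fill_diagonal(G, 0.0)
    return float(G.max())

rng = np.random.default_rng(2)
A1 = rng.standard_normal((4, 6)); A1 /= np.linalg.norm(A1, axis=0)
A2 = rng.standard_normal((5, 7)); A2 /= np.linalg.norm(A2, axis=0)

mu_kron = coherence(np.kron(A2, A1))
print(mu_kron, max(coherence(A1), coherence(A2)))  # equal
```

So, unlike the RIP constant, coherence does not degrade at all under Kronecker products: the product is exactly as coherent as its worst factor.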

Common Mistake: RIP Is Practically Unverifiable for Physical Matrices

Mistake:

Attempting to verify the RIP of the physical sensing matrix $\mathbf{A}$ by checking the condition number of all $\binom{N}{s}$ sub-matrices.

Correction:

RIP verification is NP-hard in general. For random matrices, RIP is proven probabilistically (all sub-matrices are well-conditioned with high probability). For the structured physical $\mathbf{A}$, no such probabilistic argument is available. Instead, use the restricted eigenvalue condition (REC) or the compatibility condition, which are weaker but sufficient for LASSO recovery, and can be estimated numerically for specific geometries. Alternatively, the Kronecker RIP bound (Theorem 7.3.3) provides a constructive guarantee when the individual factors have verifiable RIP.
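What one can do cheaply is a Monte Carlo lower bound on $\delta_s$: sample random supports and track the worst singular-value deviation. A sketch under our own naming; note that random sampling can only certify that $\delta_s$ is at least the value found, never that the RIP holds:

```python
import numpy as np

def delta_lower_bound(A, s, trials=2000, seed=0):
    """Monte Carlo LOWER bound on the RIP constant delta_s
    (assumes unit-norm columns; exhaustive verification is NP-hard)."""
    rng = np.random.default_rng(seed)
    N = A.shape[1]
    worst = 0.0
    for _ in range(trials):
        idx = rng.choice(N, size=s, replace=False)   # random s-column support
        sv = np.linalg.svd(A[:, idx], compute_uv=False)
        worst = max(worst, sv[0]**2 - 1.0, 1.0 - sv[-1]**2)
    return worst

rng = np.random.default_rng(3)
A = rng.standard_normal((32, 128))
A /= np.linalg.norm(A, axis=0)
print(delta_lower_bound(A, s=4))
```

If the lower bound already exceeds the recovery threshold (e.g. $\sqrt{2} - 1$ for $\delta_{2s}$), the matrix is certifiably bad for that sparsity level; the converse direction gives no guarantee.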

Random vs. Physical Sensing Matrix Properties

| Property | Random Gaussian | Physical (Kronecker DFT) |
| --- | --- | --- |
| Coherence $\mu$ | $O(1/\sqrt{M})$ (near Welch bound) | $O(1/\sqrt[4]{M})$ or worse (high for nearby voxels) |
| RIP order $s$ | $O(M/\log(N/s))$ | Difficult to bound; depends on geometry |
| Condition number $\kappa$ | $(\sqrt{N/M}+1)/(\sqrt{N/M}-1)$ | Product of factor $\kappa$ values; can be very large |
| AMP convergence | Guaranteed (state evolution holds) | Can diverge; use OAMP instead (Ch 17) |
| Storage | $O(MN)$ | $O(\sum m_k n_k)$ via Kronecker |
| Matvec cost | $O(MN)$ | $O(N^{4/3})$ or $O(N\log N)$ via FFT |
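The storage and matvec rows follow from the identity $(\mathbf{A}_2 \otimes \mathbf{A}_1)\,\mathrm{vec}(\mathbf{X}) = \mathrm{vec}(\mathbf{A}_1 \mathbf{X} \mathbf{A}_2^{\mathsf{T}})$ (column-major vec), which applies the operator through two small matrix products without ever forming the dense $MN$-entry matrix. A quick numerical check:

```python
import numpy as np

rng = np.random.default_rng(4)
A1 = rng.standard_normal((3, 4))   # m1 x n1
A2 = rng.standard_normal((5, 6))   # m2 x n2
X = rng.standard_normal((4, 6))    # n1 x n2 coefficient "image"

# Dense route: O(m1*m2*n1*n2) flops and storage for the Kronecker matrix
y_dense = np.kron(A2, A1) @ X.flatten(order="F")
# Structured route: two small matmuls, never forming the Kronecker product
y_fast = (A1 @ X @ A2.T).flatten(order="F")

print(np.allclose(y_dense, y_fast))  # True
```

`order="F"` implements column-stacking, matching the vec convention in the identity; with C-order flattening the factors would swap roles.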

Quick Check

A sensing matrix has mutual coherence $\mu = 0.5$. The coherence-based guarantee ensures exact sparse recovery for sparsity up to what level?

- $s < 1.5$ (i.e., only $s = 1$)
- $s < 3$
- $s < 10$
- No guarantee exists

Restricted Isometry Property (RIP)

A property of a matrix requiring that it approximately preserves the $\ell_2$ norm of every $s$-sparse vector, with distortion bounded by a constant $\delta_s < 1$. The RIP is a sufficient condition for stable sparse recovery via $\ell_1$ minimization and other compressed sensing algorithms.

Related: {{Ref:Gloss Mutual Coherence}}, {{Ref:Fsi:Ch13:Gloss Rip}}

Mutual coherence

The maximum absolute normalized inner product between any two distinct columns of a matrix. Low coherence ($\mu \ll 1$) indicates that the measurement signatures of different voxels are distinct, which is favorable for sparse recovery.

Related: {{Ref:Gloss Rip}}

Historical Note: Compressed Sensing Meets Imaging

2007--present

The application of compressed sensing to imaging was pioneered independently by Lustig, Donoho, and Pauly (2007) for MRI and by Baraniuk and Steeghs (2007) for radar. The key insight was that natural images are sparse in some transform domain (wavelets, total variation), and structured sensing matrices arising from physics (partial Fourier, steering vectors) can satisfy weakened forms of the RIP. However, the gap between the theoretical guarantees (derived for random matrices) and practical performance (on structured matrices) remains significant. Bridging this gap for RF imaging is an active area of CommIT research.

Key Takeaway

The physical sensing matrix $\mathbf{A}$ has high mutual coherence (DFT-like columns) compared to random matrices. Standard compressed sensing guarantees based on coherence or RIP are pessimistic for structured operators. For Kronecker products, the RIP constant degrades additively while the sparsity level multiplies across factors. Algorithms designed for random matrices (AMP) can fail on $\mathbf{A}$; use OAMP or other structure-aware methods instead.