Spectral Properties and Condition Number

Why Spectral Analysis Matters for Reconstruction

The singular values of $\mathbf{A}$ determine everything about the reconstruction problem: the condition number controls noise amplification, the spectral gap determines whether regularization is needed, and the singular vector structure reveals which scene features are well-resolved and which are invisible to the measurements. Because $\mathbf{A}$ has Kronecker structure, its spectral properties are completely determined by the spectra of the three factor matrices --- and these, in turn, are determined by the physical imaging geometry.

Definition: Condition Number of the Sensing Matrix

The condition number of $\mathbf{A} \in \mathbb{C}^{M \times N}$ is

$$\kappa(\mathbf{A}) = \frac{\sigma_{\max}(\mathbf{A})}{\sigma_{\min}(\mathbf{A})}$$

where $\sigma_{\max}$ and $\sigma_{\min}$ are the largest and smallest non-zero singular values. For the least-squares solution of the normal equations,

$$\hat{\mathbf{c}} = (\mathbf{A}^{H}\mathbf{A})^{-1}\mathbf{A}^{H}\mathbf{y},$$

the relative reconstruction error is bounded by

$$\frac{\|\hat{\mathbf{c}} - \mathbf{c}\|}{\|\mathbf{c}\|} \leq \kappa(\mathbf{A}) \cdot \frac{\|\mathbf{w}\|}{\|\mathbf{y}\|}.$$

Imaging inverse problems are often underdetermined ($M < N$), in which case $\mathbf{A}^{H}\mathbf{A}$ is singular and $\kappa(\mathbf{A})$ is effectively infinite. Regularization effectively replaces $\sigma_{\min}$ with the regularization parameter $\lambda$, giving an effective condition number $\kappa_{\text{eff}} = \sigma_{\max}/\lambda$.
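
The effect is easy to check numerically. Here is a minimal sketch, assuming a synthetic $\mathbf{A}$ with a prescribed singular-value decay and Tikhonov regularization; the sizes, decay, and noise level are illustrative choices, not values from the imaging setup:

```python
import numpy as np

rng = np.random.default_rng(0)
M, N = 100, 50

# Synthetic A with singular values decaying from 1 to 1e-6, so kappa(A) = 1e6.
U, _ = np.linalg.qr(rng.standard_normal((M, N)))
V, _ = np.linalg.qr(rng.standard_normal((N, N)))
s = np.logspace(0, -6, N)
A = U @ np.diag(s) @ V.T

c = rng.standard_normal(N)                      # ground-truth coefficients
w = 1e-4 * rng.standard_normal(M)               # measurement noise
y = A @ c + w

# Unregularized least squares: error amplified by kappa(A).
c_ls = np.linalg.lstsq(A, y, rcond=None)[0]

# Tikhonov regularization: heuristically floors sigma_min at ~lam,
# giving kappa_eff ~ sigma_max / lam = 1e2 here.
lam = 1e-2
c_tik = np.linalg.solve(A.T @ A + lam**2 * np.eye(N), A.T @ y)

print("kappa(A)              :", s[0] / s[-1])
print("relative error, LS    :", np.linalg.norm(c_ls - c) / np.linalg.norm(c))
print("relative error, Tikh. :", np.linalg.norm(c_tik - c) / np.linalg.norm(c))
```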

Theorem: SVD of Kronecker-Structured Sensing Matrices

If $\mathbf{A} = \mathbf{A}_{3} \otimes \mathbf{A}_{2} \otimes \mathbf{A}_{1}$ and each factor has SVD $\mathbf{A}_{k} = \mathbf{U}_k \boldsymbol{\Sigma}_k \mathbf{V}_k^H$, then:

Singular values: The singular values of $\mathbf{A}$ are all triple products

$$\sigma_{(i,j,l)}(\mathbf{A}) = \sigma_i(\mathbf{A}_{3}) \cdot \sigma_j(\mathbf{A}_{2}) \cdot \sigma_l(\mathbf{A}_{1})$$

for all valid multi-indices $(i, j, l)$.

Condition number: $\kappa(\mathbf{A}) = \kappa(\mathbf{A}_{3}) \cdot \kappa(\mathbf{A}_{2}) \cdot \kappa(\mathbf{A}_{1})$.

Singular vectors: The left and right singular vectors are Kronecker products of the factor singular vectors: $\mathbf{u}_{(i,j,l)} = \mathbf{u}_i^{(3)} \otimes \mathbf{u}_j^{(2)} \otimes \mathbf{u}_l^{(1)}$, and similarly for the right singular vectors.

The Kronecker structure means the conditioning of the full imaging system is the product of the per-dimension condition numbers. If either the angular coverage or the frequency coverage is poor (large $\kappa$ for that factor), the overall problem is even more ill-conditioned than either dimension alone. This motivates joint optimization of both array geometry and frequency allocation.
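
A quick numerical check of the theorem, using small random complex factors (the sizes are arbitrary, chosen so the full SVD stays cheap):

```python
import numpy as np

rng = np.random.default_rng(1)
A1 = rng.standard_normal((4, 3)) + 1j * rng.standard_normal((4, 3))
A2 = rng.standard_normal((5, 4)) + 1j * rng.standard_normal((5, 4))
A3 = rng.standard_normal((3, 2)) + 1j * rng.standard_normal((3, 2))
A = np.kron(A3, np.kron(A2, A1))

# Singular values of A = all triple products of the factor singular values.
s1, s2, s3 = (np.linalg.svd(X, compute_uv=False) for X in (A1, A2, A3))
triple = np.sort(np.outer(s3, np.outer(s2, s1)).ravel())[::-1]
s = np.linalg.svd(A, compute_uv=False)
print("max deviation from triple products:", np.abs(s - triple).max())

# Condition number multiplies across the factors.
kappa = lambda sv: sv[0] / sv[-1]
print("kappa(A)                   :", kappa(s))
print("kappa(A3)kappa(A2)kappa(A1):", kappa(s3) * kappa(s2) * kappa(s1))
```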

Definition: The Gram Matrix as Point-Spread Function

The Gram matrix $\mathbf{G} = \mathbf{A}^{H}\mathbf{A} \in \mathbb{C}^{N \times N}$ acts as the imaging system's point-spread function (PSF) in discrete form. Applying the matched filter to the measurements from a point scatterer at voxel $q$ (i.e., $\mathbf{c} = \mathbf{e}_q$) gives

$$\hat{\mathbf{c}}^{\text{MF}} = \mathbf{A}^{H}(\mathbf{A}\mathbf{e}_q + \mathbf{w}) = \mathbf{G}\mathbf{e}_q + \mathbf{A}^{H}\mathbf{w} = \mathbf{g}_q + \mathbf{A}^{H}\mathbf{w},$$

where $\mathbf{g}_q$ is the $q$-th column of $\mathbf{G}$. The off-diagonal elements $G_{pq}$ for $p \neq q$ measure the sidelobe leakage from voxel $q$ into voxel $p$.

The mutual coherence is the worst-case normalized sidelobe:

$$\mu(\mathbf{A}) = \max_{p \neq q} \frac{|G_{pq}|}{\sqrt{G_{pp} G_{qq}}}.$$

The PSF of a Kronecker-structured sensing matrix is itself a Kronecker product of per-dimension PSFs. The main lobe width is inversely proportional to the k-space coverage extent in each dimension, as established in Ch 06.6.
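
To make this concrete, here is a sketch that computes the Gram matrix, one PSF column, and the mutual coherence for a one-dimensional half-wavelength ULA steering matrix; the element count, grid density, and field of view are assumptions for the example:

```python
import numpy as np

M, N = 16, 64                                   # antenna elements, angular grid
theta = np.linspace(-np.pi / 3, np.pi / 3, N)   # +/- 60 deg field of view
m = np.arange(M)[:, None]
A = np.exp(1j * np.pi * m * np.sin(theta)[None, :]) / np.sqrt(M)

G = A.conj().T @ A                              # Gram matrix = discrete PSF
q = N // 2
psf = np.abs(G[:, q])                           # PSF column: G @ e_q

# Mutual coherence: worst-case normalized off-diagonal element of G.
d = np.sqrt(np.real(np.diag(G)))
C = np.abs(G) / np.outer(d, d)
np.fill_diagonal(C, 0.0)
print("mutual coherence mu(A) :", C.max())
print("peak-to-max-sidelobe   :", psf[q] / np.delete(psf, q).max())
```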

Example: Conditioning of ULA vs. Distributed MIMO

Compare the condition numbers of the sensing matrix for:

(a) A co-located ULA with $N_t = 4$, $N_r = 8$, half-wavelength spacing, $N_f = 16$ subcarriers, imaging a $30^\circ \times 30^\circ$ scene on a $16 \times 16$ grid.

(b) A distributed MIMO array with the same total number of elements placed randomly in a $10\lambda \times 10\lambda$ aperture.
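
A full answer needs the two-dimensional scene and the frequency factor, but the conditioning contrast already appears in the one-dimensional spatial factor. The sketch below compares a half-wavelength ULA against random element positions in a $10\lambda$ aperture; the element count and grid are simplified assumptions rather than the exact setup above:

```python
import numpy as np

rng = np.random.default_rng(2)
M, N = 12, 16                                  # elements, angular grid points
theta = np.deg2rad(np.linspace(-15, 15, N))    # 30-degree scene

def steering(x):
    # x: element positions in wavelengths -> M x N steering matrix
    return np.exp(2j * np.pi * x[:, None] * np.sin(theta)[None, :])

A_ula = steering(0.5 * np.arange(M))           # half-wavelength ULA
A_dist = steering(10.0 * rng.random(M))        # random positions, 10-lambda aperture

for name, A in [("ULA", A_ula), ("distributed", A_dist)]:
    s = np.linalg.svd(A, compute_uv=False)
    print(f"{name:12s} kappa = {s[0] / s[-1]:.2e}")
```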

Condition Number vs. Array Geometry

Explore how the condition number of $\mathbf{A}$ depends on array geometry and frequency allocation. Compare random matrices (well-conditioned), physical ULA matrices (structured), and Kronecker products (condition number is the product of the factors). Observe how the singular value distribution changes as you vary the number of antennas and the angular coverage.


Point-Spread Function from the Sensing Operator

Visualizes the PSF by computing $\mathbf{A}^{H}\mathbf{A}\mathbf{e}_q$ for a point scatterer at the grid center. The PSF main lobe width is inversely proportional to the k-space coverage extent. Adjust the array parameters to see how the PSF changes shape.
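
A non-interactive sketch of the same computation, which also verifies the separability claim from the definition above: the Gram matrix of a Kronecker-structured operator factors dimension by dimension, so each PSF column is a Kronecker product of per-dimension PSF columns. Small random complex factors stand in for the physical ones:

```python
import numpy as np

rng = np.random.default_rng(3)
A1 = rng.standard_normal((6, 4)) + 1j * rng.standard_normal((6, 4))  # dim 1
A2 = rng.standard_normal((5, 3)) + 1j * rng.standard_normal((5, 3))  # dim 2
A = np.kron(A2, A1)

# (A2 kron A1)^H (A2 kron A1) = G2 kron G1
G = A.conj().T @ A
G1 = A1.conj().T @ A1
G2 = A2.conj().T @ A2
print("||G - kron(G2, G1)||:", np.linalg.norm(G - np.kron(G2, G1)))

# PSF column for voxel (q1, q2) separates into per-dimension PSF columns.
q1, q2 = 2, 1
q = q2 * A1.shape[1] + q1              # flat index under the kron ordering
print("||G e_q - kron(psf2, psf1)||:",
      np.linalg.norm(G[:, q] - np.kron(G2[:, q2], G1[:, q1])))
```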


Common Mistake: Random Matrix Intuition Fails for Physical Sensing

Mistake:

Applying results from random matrix theory (e.g., the Marchenko-Pastur distribution, $\kappa \approx (\sqrt{M/N} + 1)/(\sqrt{M/N} - 1)$) to the physical sensing matrix $\mathbf{A}$.

Correction:

The physical $\mathbf{A}$ is highly structured: its columns are parameterized by continuous angles and frequencies, creating strong correlations absent in random matrices. The singular value distribution of a Kronecker-structured DFT-like matrix is fundamentally different from Marchenko-Pastur. Algorithms that assume random $\mathbf{A}$ (such as AMP --- see Ch 17.1) can diverge catastrophically when applied to the physical operator. Always verify spectral properties numerically for the specific geometry.
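
A sketch of that numerical check; the sizes and the steering-matrix geometry are arbitrary assumptions. The random-matrix estimate tracks an i.i.d. Gaussian matrix closely but can mispredict a same-sized steering matrix by orders of magnitude:

```python
import numpy as np

rng = np.random.default_rng(4)
M, N = 128, 64
r = np.sqrt(M / N)
kappa_mp = (r + 1) / (r - 1)                   # random-matrix prediction, ~5.8

def kappa(X):
    s = np.linalg.svd(X, compute_uv=False)
    return s[0] / s[-1]

G_rand = rng.standard_normal((M, N))           # i.i.d. Gaussian matrix
theta = np.deg2rad(np.linspace(-20, 20, N))    # dense grid on a 40-deg sector
A_phys = np.exp(1j * np.pi * np.arange(M)[:, None] * np.sin(theta)[None, :])

print("Marchenko-Pastur estimate :", kappa_mp)
print("Gaussian, empirical       :", kappa(G_rand))
print("physical ULA steering     :", kappa(A_phys))
```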

Why This Matters: Connection to Massive MIMO Channel Estimation

The conditioning problem we face in RF imaging is structurally identical to the channel estimation problem in massive MIMO (Telecom Ch 16): the pilot observation matrix is a partial DFT (steering vectors at discrete angles), and ill-conditioning arises when the angular coverage is incomplete. In massive MIMO, the condition number of the pilot matrix determines the MSE of the LMMSE channel estimator. The same preconditioning techniques we develop in Section 7.4 apply to both problems.

Quick Check

If the Rx factor has $\kappa(\mathbf{A}_{\text{Rx}}) = 5$, the Tx factor has $\kappa(\mathbf{A}_{\text{Tx}}) = 3$, and the frequency factor has $\kappa(\mathbf{A}_{f}) = 2$, what is $\kappa(\mathbf{A})$?

10

30

5 (the maximum)

$\sqrt{38}$

Key Takeaway

The singular values of $\mathbf{A}$ are all triple products of the factor singular values. The condition number is the product of the factor condition numbers, so even mild ill-conditioning in one dimension is amplified multiplicatively. The Gram matrix $\mathbf{A}^{H}\mathbf{A}$ is the discrete PSF, and its off-diagonal elements determine the mutual coherence. Physical sensing matrices have fundamentally different spectral properties from random matrices --- algorithms must be designed accordingly.