Exercises
ch16-ex01
Easy: A WSS process $X(t)$ has autocorrelation $R_X(\tau) = e^{-\alpha|\tau|}$ for some $\alpha > 0$. Is this process m.s.-continuous?
Check whether $R_X(\tau)$ is continuous at $\tau = 0$.
Evaluate at the origin
$\mathbb{E}[(X(t+h) - X(t))^2] = 2(R_X(0) - R_X(h)) = 2(1 - e^{-\alpha|h|})$. Since $e^{-\alpha|\tau|}$ is continuous, this tends to $0$ as $h \to 0$.
Conclude
The autocorrelation is continuous at $\tau = 0$, so by the Characterization of Mean-Square Continuity, the process is m.s.-continuous.
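The continuity argument can be checked numerically. A minimal sketch, assuming for concreteness the OU-type autocorrelation $R_X(\tau) = e^{-\alpha|\tau|}$ with an illustrative $\alpha = 2$:

```python
import numpy as np

# m.s. continuity check: E[(X(t+h) - X(t))^2] = 2*(R_X(0) - R_X(h)),
# assuming R_X(tau) = exp(-alpha*|tau|) with an illustrative alpha = 2.
alpha = 2.0
R = lambda tau: np.exp(-alpha * np.abs(tau))

hs = np.array([1.0, 0.1, 0.01, 0.001])
ms_increment = 2.0 * (R(0.0) - R(hs))   # mean-square increment for each h

# The increment shrinks toward 0 as h -> 0, confirming m.s. continuity.
print(ms_increment)
```

The printed increments decrease monotonically toward zero, matching the continuity of $R_X$ at the origin.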
ch16-ex02
Easy: A random process has PSD $S_X(\omega) = \frac{2\alpha\sigma^2}{\omega^2 + \alpha^2}$ for constants $\alpha, \sigma^2 > 0$. Does the m.s. derivative exist?
Compute $\int_{-\infty}^{\infty} \omega^2 S_X(\omega)\,d\omega$ and check convergence.
Check the integral
$\int_{-\infty}^{\infty} \frac{2\alpha\sigma^2\,\omega^2}{\omega^2 + \alpha^2}\,d\omega$. For large $\omega$, the integrand behaves as $2\alpha\sigma^2$, a constant. The integral diverges.
Conclude
The m.s. derivative does not exist. This is the Lorentzian PSD (corresponding to the OU process), which decays too slowly ($\sim \omega^{-2}$) for the derivative condition.
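The divergence is easy to see numerically. A sketch assuming the Lorentzian PSD $S(\omega) = 2\alpha/(\omega^2 + \alpha^2)$ with $\sigma^2 = 1$ and $\alpha = 1$ (illustrative values): the truncated second moment grows linearly with the cutoff.

```python
import numpy as np

# Lorentzian PSD (sigma^2 = 1, alpha = 1 assumed): S(w) = 2*alpha/(w^2 + alpha^2).
# The m.s. derivative exists iff the integral of w^2*S(w) converges; here
# w^2*S(w) -> 2*alpha, so the truncated integral grows linearly in the cutoff.
alpha = 1.0
S = lambda w: 2 * alpha / (w**2 + alpha**2)

def truncated_integral(W, n=200_000):
    dw = 2 * W / n
    w = -W + (np.arange(n) + 0.5) * dw     # midpoint rule on [-W, W]
    return np.sum(w**2 * S(w)) * dw

truncated = np.array([truncated_integral(W) for W in (1e2, 1e3, 1e4)])
ratios = truncated[1:] / truncated[:-1]
print(truncated, ratios)   # each 10x larger cutoff gives ~10x the integral
```

The ratio near 10 for each decade of cutoff is the signature of linear growth, i.e., divergence.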
ch16-ex03
Medium: A WSS process has PSD $S_X(\omega) = \frac{1}{(1+\omega^2)^2}$. Find the PSD of its m.s. derivative and determine whether the second derivative also exists.
Apply $S_{X'}(\omega) = \omega^2 S_X(\omega)$ for the first derivative PSD.
Check $\int \omega^4 S_X(\omega)\,d\omega < \infty$ for the second derivative.
First derivative PSD
$S_{X'}(\omega) = \frac{\omega^2}{(1+\omega^2)^2}$. This decays as $\omega^{-2}$ for large $\omega$, which is integrable.
Second derivative existence
$\int \omega^4 S_X(\omega)\,d\omega = \int \frac{\omega^4}{(1+\omega^2)^2}\,d\omega$. For large $\omega$, the integrand $\to 1$, so the integral diverges.
Conclusion
The first derivative exists but the second does not. The PSD decays as $\omega^{-4}$, so only the first derivative is possible (the $n$-th m.s. derivative requires decay faster than $\omega^{-(2n+1)}$).
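Both moment conditions can be tested numerically. A sketch assuming, for concreteness, the $\omega^{-4}$-decaying PSD $S(\omega) = 1/(1+\omega^2)^2$: the second moment converges while the fourth moment keeps growing with the cutoff.

```python
import numpy as np

# PSD assumed for this check: S(w) = 1/(1 + w^2)^2, which decays like w^-4.
S = lambda w: 1.0 / (1.0 + w**2) ** 2

def moment(order, W, n=400_000):
    dw = 2 * W / n
    w = -W + (np.arange(n) + 0.5) * dw     # midpoint rule on [-W, W]
    return np.sum(w**order * S(w)) * dw

m2 = np.array([moment(2, W) for W in (1e2, 1e3, 1e4)])   # first-derivative condition
m4 = np.array([moment(4, W) for W in (1e2, 1e3, 1e4)])   # second-derivative condition
print(m2, m4)   # m2 settles near pi/2; m4 keeps growing ~2*W
```

The second moment converges (to $\pi/2$ for this PSD), while the fourth moment scales with the cutoff, confirming the conclusion.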
ch16-ex04
Medium: Let $X(t)$ be a zero-mean WSS process with PSD $S_X(\omega) = S_0$ for $|\omega| \le W$ and zero otherwise. Compute $\mathbb{E}[X'(t)^2]$.
The variance of the derivative is $\mathbb{E}[X'(t)^2] = \frac{1}{2\pi}\int_{-\infty}^{\infty} \omega^2 S_X(\omega)\,d\omega$.
Compute the integral
$\mathbb{E}[X'(t)^2] = \frac{1}{2\pi}\int_{-W}^{W} \omega^2 S_0\,d\omega = \frac{S_0}{2\pi}\cdot\frac{2W^3}{3} = \frac{S_0 W^3}{3\pi}.$
Interpret
The derivative power grows as $W^3$: the wider the bandwidth, the more high-frequency energy is amplified by differentiation. This quantifies how "rough" a bandlimited process becomes as bandwidth increases.
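A quick numeric cross-check of the closed form $S_0 W^3/(3\pi)$, with illustrative values of $S_0$ and $W$ assumed:

```python
import numpy as np

# Flat PSD S0 on |w| <= W: E[X'(t)^2] = (1/2pi) * int_{-W}^{W} w^2 S0 dw = S0*W^3/(3*pi).
S0, W = 2.0, 5.0                          # illustrative values
closed_form = S0 * W**3 / (3 * np.pi)

n = 100_000
dw = 2 * W / n
w = -W + (np.arange(n) + 0.5) * dw        # midpoint rule on [-W, W]
numeric = np.sum(w**2 * S0) * dw / (2 * np.pi)
print(closed_form, numeric)
```

The two values agree to quadrature precision.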
ch16-ex05
Easy: Show that $\mathbb{E}[X(t)X'(t)] = 0$ for any zero-mean WSS process that has an m.s. derivative.
Use the cross-correlation $R_{XX'}(\tau)$ evaluated at $\tau = 0$.
Compute the cross-correlation
$R_{XX'}(\tau) = \mathbb{E}[X(t)X'(t+\tau)] = R_X'(\tau)$. Setting $\tau = 0$: $\mathbb{E}[X(t)X'(t)] = R_X'(0)$.
Use the symmetry of the autocorrelation
For a real-valued WSS process, $R_X(-\tau) = R_X(\tau)$, so $R_X'(\tau)$ is odd. At $\tau = 0$: $R_X'(0) = -R_X'(0)$, hence $R_X'(0) = 0$.
Conclude
$\mathbb{E}[X(t)X'(t)] = R_X'(0) = 0$. The process and its derivative are uncorrelated at the same time instant. For Gaussian processes, they are also independent.
ch16-ex06
Medium: Let $X(t) = A\cos(\omega_0 t + \Theta)$, where $A$ and $\omega_0$ are positive constants and $\Theta \sim \mathrm{Unif}[0, 2\pi)$. Find the m.s. derivative and its PSD.
First compute $X'(t)$ as a random process and verify it satisfies the m.s. limit definition.
Compute the derivative directly
$X'(t) = -A\omega_0\sin(\omega_0 t + \Theta)$. This is well-defined pointwise (and hence in m.s.) since $X(t)$ is differentiable path-by-path.
Verify via the autocorrelation
$R_X(\tau) = \frac{A^2}{2}\cos(\omega_0\tau)$. $R_{X'}(\tau) = -R_X''(\tau) = \frac{A^2\omega_0^2}{2}\cos(\omega_0\tau)$. $\frac{1}{2\pi}\int \omega^2 S_X(\omega)\,d\omega = \frac{A^2\omega_0^2}{2} < \infty$, confirming existence.
PSD of the derivative
$S_{X'}(\omega) = \omega^2 S_X(\omega) = \frac{\pi A^2\omega_0^2}{2}\left[\delta(\omega - \omega_0) + \delta(\omega + \omega_0)\right]$. So the derivative's power remains concentrated at $\pm\omega_0$, scaled by $\omega_0^2$.
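A Monte Carlo sanity check of the derivative's autocorrelation, with illustrative values of $A$ and $\omega_0$ assumed:

```python
import numpy as np

# Monte Carlo check of X'(t) = -A*w0*sin(w0*t + Theta), Theta ~ Unif[0, 2pi):
# its autocorrelation should be (A^2 * w0^2 / 2) * cos(w0 * tau), independent of t.
rng = np.random.default_rng(0)
A, w0 = 2.0, 3.0                          # illustrative constants
theta = rng.uniform(0.0, 2 * np.pi, size=1_000_000)

t, tau = 0.7, 0.4
xp_t = -A * w0 * np.sin(w0 * t + theta)
xp_ttau = -A * w0 * np.sin(w0 * (t + tau) + theta)
mc = np.mean(xp_t * xp_ttau)
exact = (A**2 * w0**2 / 2) * np.cos(w0 * tau)
print(mc, exact)
```

The empirical average matches $\frac{A^2\omega_0^2}{2}\cos(\omega_0\tau)$ up to Monte Carlo noise.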
ch16-ex07
Easy: A bandlimited WSS process has PSD $S_X(f) = S_0$ for $|f| \le 5$ kHz and zero otherwise. What is the minimum sampling rate for perfect reconstruction?
Apply the Nyquist criterion: $f_s \ge 2W$.
Apply Nyquist
$W = 5$ kHz, so $f_s = 2W = 10$ kHz (10,000 samples/second). The sampling interval is $T = 1/f_s = 0.1$ ms.
ch16-ex08
Medium: A bandlimited process with bandwidth $W$ has triangular PSD $S_X(f) = S_0\left(1 - |f|/W\right)$ for $|f| \le W$ and zero otherwise. Compute the autocorrelation of the Nyquist-rate samples.
Compute $R_X(\tau)$ via the inverse Fourier transform, then sample at $T = 1/(2W)$ s.
Compute the autocorrelation
The triangular PSD is the convolution of a rectangle with itself in the frequency domain. By the Fourier transform pair, $R_X(\tau) = S_0 W\,\mathrm{sinc}^2(W\tau)$ with $\mathrm{sinc}(x) = \sin(\pi x)/(\pi x)$.
Sample at Nyquist rate
$T = 1/(2W)$ s. $R_X(kT) = S_0 W\,\mathrm{sinc}^2(k/2)$. For $k = 0$: $R_X(0) = S_0 W$. For odd $k$: $R_X(kT) = \frac{4 S_0 W}{\pi^2 k^2} \neq 0$ in general. For even $k \neq 0$: $R_X(kT) = 0$.
Interpret
Unlike the flat-PSD case, the triangular PSD yields correlated Nyquist-rate samples. The correlation decays as $1/k^2$ but is nonzero at odd lags: the non-flat PSD shape introduces correlation.
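The odd/even pattern of the sampled autocorrelation can be verified directly (illustrative $S_0$ and $W$ assumed; NumPy's `np.sinc` is the normalized sinc used here):

```python
import numpy as np

# Nyquist-rate samples of R(tau) = S0*W*sinc^2(W*tau) (triangular PSD).
# At T = 1/(2W): R(kT) = S0*W*sinc(k/2)^2.
S0, W = 1.0, 1000.0                       # illustrative values
T = 1.0 / (2 * W)
k = np.arange(0, 6)
R_samples = S0 * W * np.sinc(W * k * T) ** 2

# k = 0: S0*W;  even k != 0: exactly 0;  odd k: 4*S0*W/(pi^2 * k^2) != 0.
odd_pred = 4 * S0 * W / (np.pi**2 * k[1::2] ** 2)
print(R_samples, odd_pred)
```

The even-lag samples vanish while the odd-lag samples follow the $4S_0W/(\pi^2 k^2)$ law.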
ch16-ex09
Medium: A WSS Gaussian process with flat PSD $S_X(f) = S_0$ for $|f| \le W$ is sampled at rate $4W$ (2x oversampling). Find the covariance matrix of two consecutive samples $X(0)$ and $X(T)$, $T = 1/(4W)$.
Use $R_X(\tau) = 2WS_0\,\mathrm{sinc}(2W\tau)$ with $\tau = T = 1/(4W)$.
Compute the autocorrelation
$R_X(\tau) = 2WS_0\,\mathrm{sinc}(2W\tau)$. $2WT = 1/2$, so $R_X(0) = 2WS_0$ and $R_X(T) = 2WS_0\,\mathrm{sinc}(1/2) = \frac{4WS_0}{\pi}$.
Form the covariance matrix
Since the process is zero-mean, the covariance matrix equals the correlation matrix: $C = 2WS_0\begin{pmatrix} 1 & 2/\pi \\ 2/\pi & 1 \end{pmatrix}$. The samples are correlated due to oversampling.
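The covariance entries follow in two lines (illustrative $S_0 = W = 1$ assumed):

```python
import numpy as np

# Flat PSD S0 on |f| <= W: R(tau) = 2*W*S0*sinc(2*W*tau). Sampling at rate 4W
# gives T = 1/(4W), and R(T) = 2*W*S0*sinc(1/2) = 4*W*S0/pi.
S0, W = 1.0, 1.0                          # illustrative values
T = 1.0 / (4 * W)
R = lambda tau: 2 * W * S0 * np.sinc(2 * W * tau)

C = np.array([[R(0.0), R(T)],
              [R(T),   R(0.0)]])
print(C)
```

The off-diagonal entry equals $4WS_0/\pi$, i.e., a normalized correlation of $2/\pi \approx 0.64$ between consecutive samples.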
ch16-ex10
Hard: Prove that the reconstruction error of the truncated sampling expansion satisfies $\mathbb{E}[|X(t) - X_N(t)|^2] \to 0$ as $N \to \infty$ for a bandlimited process.
Use Parseval's theorem to work in the frequency domain.
Show the partial sums of the cardinal series converge in $L^2(S_X\,d\omega)$.
Express the error spectrally
Define $X_N(t) = \sum_{|n| \le N} X(nT)\,\mathrm{sinc}\!\left(\frac{t-nT}{T}\right)$. By the isometry between time and frequency domains, $\mathbb{E}[|X(t) - X_N(t)|^2] = \frac{1}{2\pi}\int_{-\pi/T}^{\pi/T} S_X(\omega)\,\bigl|e^{j\omega t} - g_N(\omega, t)\bigr|^2\,d\omega$ where $g_N(\omega, t) = \sum_{|n| \le N} e^{j\omega nT}\,\mathrm{sinc}\!\left(\frac{t-nT}{T}\right)$.
Show pointwise convergence of the sum
The cardinal series converges to $e^{j\omega t}$ for $|\omega| < \pi/T$ (Poisson summation formula). Therefore $\bigl|e^{j\omega t} - g_N(\omega, t)\bigr|^2 \to 0$ pointwise for each $\omega$ in the band.
Apply dominated convergence
$\bigl|e^{j\omega t} - g_N(\omega, t)\bigr| \le 1 + |g_N(\omega, t)| \le C$ for all $N$ (triangle inequality, plus the uniform boundedness of the cardinal partial sums). Since $S_X$ is integrable on $[-\pi/T, \pi/T]$, dominated convergence gives $\mathbb{E}[|X(t) - X_N(t)|^2] \to 0$.
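The spectral error integral can be evaluated numerically for increasing $N$. A sketch assuming a flat PSD $S_X(\omega) = 1$ on the band, with the band edge slightly clipped since the cardinal series converges only pointwise in the open band:

```python
import numpy as np

# Frequency-domain truncation error of the cardinal-series partial sums,
# assuming S(w) = 1 on |w| <= pi/T (band edge clipped at 99.9% to stay
# inside the open band where pointwise convergence holds).
T, t = 1.0, 0.3                           # illustrative sampling period and time
n_w = 4000
dw = 2 * (0.999 * np.pi / T) / n_w
w = -0.999 * np.pi / T + (np.arange(n_w) + 0.5) * dw   # midpoint grid

def err(N):
    n = np.arange(-N, N + 1)
    # g_N(w, t) = sum over |n| <= N of e^{jwnT} * sinc((t - nT)/T)
    g = (np.exp(1j * np.outer(w, n) * T) * np.sinc((t - n * T) / T)).sum(axis=1)
    return np.sum(np.abs(np.exp(1j * w * t) - g) ** 2) * dw / (2 * np.pi)

errors = np.array([err(N) for N in (2, 8, 32, 128)])
print(errors)   # decreases toward 0 as N grows
```

The error shrinks roughly like $1/N$ here, consistent with the $O(1/n)$ decay of the sinc tail weights.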
ch16-ex11
Easy: How many independent samples (degrees of freedom) does a bandlimited Gaussian process with $W = 1$ MHz have on an interval of $T = 1$ ms?
The number of degrees of freedom is approximately $2WT$.
Compute
$2WT = 2 \times 10^6 \times 10^{-3} = 2000$ degrees of freedom. For a flat-PSD Gaussian process, these correspond to 2000 i.i.d. Nyquist-rate samples.
ch16-ex12
Medium: Find the KL expansion of $X(t) = A\cos(\omega_0 t) + B\sin(\omega_0 t)$ on $[0, T]$, $T = 2\pi/\omega_0$, where $A, B$ are independent $\mathcal{N}(0, \sigma^2)$.
Compute $R_X(t, s)$ and identify the eigenfunctions by inspection.
Compute the autocorrelation
$R_X(t, s) = \sigma^2\left[\cos\omega_0 t\cos\omega_0 s + \sin\omega_0 t\sin\omega_0 s\right] = \sigma^2\cos(\omega_0(t - s))$.
Identify eigenfunctions
On $[0, T]$ with $T = 2\pi/\omega_0$, the functions $\phi_1(t) = \sqrt{2/T}\cos\omega_0 t$ and $\phi_2(t) = \sqrt{2/T}\sin\omega_0 t$ are orthonormal. Computing $\int_0^T R_X(t, s)\phi_i(s)\,ds = \lambda_i\phi_i(t)$ yields $\lambda_i = \sigma^2 T/2$ for $i = 1, 2$.
Write the expansion
$X(t) = Z_1\phi_1(t) + Z_2\phi_2(t)$ with $Z_1 = \sqrt{T/2}\,A$, $Z_2 = \sqrt{T/2}\,B$, $\mathbb{E}[Z_i^2] = \lambda_i = \sigma^2 T/2$. This is a two-term KL expansion (the process lives in a 2D subspace).
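The two-term structure can be confirmed by discretizing the kernel $\sigma^2\cos(\omega_0(t-s))$ and computing its eigenvalues (a Nystrom check, with $\sigma^2 = 1$ and $\omega_0 = 2\pi$ assumed so that $T = 1$):

```python
import numpy as np

# Nystrom eigenvalues of R(t,s) = sigma^2 * cos(w0*(t-s)) on [0, T], T = 2*pi/w0:
# exactly two nonzero eigenvalues, each equal to sigma^2 * T / 2.
sigma2, w0 = 1.0, 2 * np.pi               # illustrative values, so T = 1
T = 2 * np.pi / w0
n = 800
t = (np.arange(n) + 0.5) / n * T          # midpoint grid over one full period
K = sigma2 * np.cos(w0 * (t[:, None] - t[None, :])) * (T / n)

eigs = np.sort(np.linalg.eigvalsh(K))[::-1]
print(eigs[:3])                            # first two ~ sigma2*T/2, third ~ 0
```

The kernel matrix has rank 2, with both nonzero eigenvalues equal to $\sigma^2 T/2$, exactly as the two-term KL expansion predicts.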
ch16-ex13
Hard: Find the KL eigenfunctions and eigenvalues for a WSS process with autocorrelation $R_X(\tau) = e^{-\alpha|\tau|}$ on $[0, T]$.
Convert the Fredholm equation to a differential equation as in the Wiener process example.
The boundary conditions differ from the Wiener case.
Set up the integral equation
$\lambda\phi(t) = \int_0^T e^{-\alpha|t-s|}\phi(s)\,ds$. Split at $s = t$: $\lambda\phi(t) = \int_0^t e^{-\alpha(t-s)}\phi(s)\,ds + \int_t^T e^{-\alpha(s-t)}\phi(s)\,ds$.
Differentiate twice
Two differentiations yield $\lambda\phi''(t) = \alpha^2\lambda\phi(t) - 2\alpha\phi(t)$. Simplifying: $\phi'' + b^2\phi = 0$ where $b^2 = \frac{2\alpha}{\lambda} - \alpha^2$.
Boundary conditions
The boundary conditions are $\phi'(0) = \alpha\phi(0)$ and $\phi'(T) = -\alpha\phi(T)$. The solution $\phi(t) \propto \cos(bt) + (\alpha/b)\sin(bt)$ requires the transcendental equation $\tan(bT) = \frac{2\alpha b}{b^2 - \alpha^2}$ to determine the eigenfrequencies $b_n$.
Eigenvalues
Once $b_n$ is found, $\lambda_n = \frac{2\alpha}{b_n^2 + \alpha^2}$. The eigenvalues decay as $O(1/n^2)$ and the $b_n$ must be computed numerically for each $n$.
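The numerical step can be sketched end to end: solve the transcendental equation by bisection, then cross-check the resulting eigenvalues against a direct discretization of the integral operator (here with $\alpha = T = 1$ assumed for concreteness):

```python
import numpy as np

# KL eigenvalues of R(tau) = exp(-alpha*|tau|) on [0, T]: solve
# tan(b*T) = 2*alpha*b/(b^2 - alpha^2) for b, then lambda = 2*alpha/(b^2 + alpha^2).
alpha, T = 1.0, 1.0

def f(b):   # root function: 2*a*b*cos(bT) + (a^2 - b^2)*sin(bT) = 0
    return 2 * alpha * b * np.cos(b * T) + (alpha**2 - b**2) * np.sin(b * T)

def bisect(lo, hi, tol=1e-12):
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if f(lo) * f(mid) > 0 else (lo, mid)
    return 0.5 * (lo + hi)

# Scan for sign changes to bracket the first few roots b_n > 0.
grid = np.linspace(1e-3, 20.0, 20_000)
vals = f(grid)
roots = [bisect(grid[i], grid[i + 1])
         for i in range(len(grid) - 1) if vals[i] * vals[i + 1] < 0]
lams = 2 * alpha / (np.array(roots) ** 2 + alpha**2)

# Cross-check against the discretized integral operator (Nystrom approximation).
n = 1000
t = (np.arange(n) + 0.5) / n * T
K = np.exp(-alpha * np.abs(t[:, None] - t[None, :])) * (T / n)
num = np.sort(np.linalg.eigvalsh(K))[::-1]
print(lams[:3], num[:3])
```

For $\alpha = T = 1$ the largest eigenvalue is $\lambda_1 \approx 0.7388$ (with $b_1 \approx 1.31$), and the transcendental and Nystrom values agree closely.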
ch16-ex14
Medium: Show that the total energy in the KL expansion equals the trace of the autocorrelation operator: $\sum_n \lambda_n = \int_0^T R_X(t, t)\,dt$.
Use Mercer's theorem.
Apply Mercer's expansion
By Mercer's theorem, $R_X(t, s) = \sum_n \lambda_n\phi_n(t)\phi_n(s)$ with absolute and uniform convergence.
Evaluate the diagonal
Setting $s = t$: $R_X(t, t) = \sum_n \lambda_n\phi_n(t)^2$. Integrating over $[0, T]$: $\int_0^T R_X(t, t)\,dt = \sum_n \lambda_n\int_0^T \phi_n(t)^2\,dt = \sum_n \lambda_n$, since the $\phi_n$ are orthonormal.
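A discretized version of the trace identity, assuming for concreteness the kernel $R_X(\tau) = e^{-|\tau|}$ on $[0, 1]$:

```python
import numpy as np

# Discretized trace check for R_X(t,s) = exp(-|t-s|) on [0, 1]:
# the sum of the operator's eigenvalues equals int_0^1 R_X(t,t) dt = 1.
T, n = 1.0, 600
t = (np.arange(n) + 0.5) / n * T
K = np.exp(-np.abs(t[:, None] - t[None, :])) * (T / n)   # Nystrom discretization

eig_sum = np.linalg.eigvalsh(K).sum()
diag_integral = 1.0 * T                   # R_X(t,t) = 1 integrated over [0, T]
print(eig_sum, diag_integral)
```

In the discrete setting this is just the matrix identity $\sum_n \lambda_n = \operatorname{tr}(K)$, which holds exactly.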
ch16-ex15
Hard: For a WSS process on $[0, T]$ with $T$ large, show that the KL eigenvalues satisfy $\lambda_n \approx S_X(\omega_n)$ and the eigenfunctions approach $\phi_n(t) \approx \frac{1}{\sqrt{T}}e^{j\omega_n t}$ with $\omega_n = 2\pi n/T$.
Substitute $\phi_n(t) = \frac{1}{\sqrt{T}}e^{j\omega_n t}$ into the Fredholm equation and use the WSS property.
Substitute the Fourier basis
$\int_0^T R_X(t - s)\frac{1}{\sqrt{T}}e^{j\omega_n s}\,ds = \frac{1}{\sqrt{T}}e^{j\omega_n t}\int_{t-T}^{t} R_X(u)e^{-j\omega_n u}\,du$ where $u = t - s$.
Approximate for large $T$
For large $T$, the integration domain covers most of the support of $R_X$ (assuming it decays). The integral approaches $\int_{-\infty}^{\infty} R_X(u)e^{-j\omega_n u}\,du = S_X(\omega_n)$ by the Wiener-Khinchin theorem.
Identify the eigenvalue
This gives $\int_0^T R_X(t-s)\phi_n(s)\,ds \approx S_X(\omega_n)\phi_n(t)$, so $\lambda_n \approx S_X(\omega_n)$, and the eigenfunctions are approximately complex exponentials. (The factor of $2\pi$ vs. $1$ depends on the normalization convention.)
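The asymptotic $\lambda_n \approx S_X(\omega_n)$ can be tested numerically. A sketch assuming $R_X(\tau) = e^{-|\tau|}$ (whose PSD is $S(\omega) = 2/(1+\omega^2)$ in this convention) on a moderately long interval; for this kernel the $n$-th eigenfrequency sits near $n\pi/T$:

```python
import numpy as np

# Large-T check: KL eigenvalues of R(tau) = exp(-|tau|) on [0, T] approach
# samples of the PSD S(w) = 2/(1 + w^2) (Wiener-Khinchin pair of exp(-|tau|)).
T, n = 40.0, 1200
t = (np.arange(n) + 0.5) / n * T
K = np.exp(-np.abs(t[:, None] - t[None, :])) * (T / n)
eigs = np.sort(np.linalg.eigvalsh(K))[::-1]

S = lambda w: 2.0 / (1.0 + w**2)
pred = S(np.arange(1, 7) * np.pi / T)      # PSD sampled near the eigenfrequencies
print(eigs[:6], pred)
```

The leading numerical eigenvalues track the PSD samples to within a few percent, and the agreement improves as $T$ grows.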
ch16-ex16
Easy: In the KL expansion of a non-Gaussian process, are the coefficients independent?
What does the KL theory guarantee? What extra property does Gaussianity provide?
Answer
The KL theory guarantees that the coefficients $Z_n$ are uncorrelated: $\mathbb{E}[Z_m Z_n] = 0$ for $m \neq n$. However, for non-Gaussian processes, uncorrelatedness does not imply independence. The $Z_n$ may have higher-order statistical dependencies. Only for Gaussian processes does uncorrelatedness imply independence (since the joint distribution is fully characterized by second-order statistics).
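A standard counterexample makes the distinction concrete: take $Z_1 \sim \mathcal{N}(0,1)$ and $Z_2 = (Z_1^2 - 1)/\sqrt{2}$ (a construction chosen for illustration), which are uncorrelated yet deterministically dependent.

```python
import numpy as np

# Uncorrelated but dependent: Z1 ~ N(0,1), Z2 = (Z1^2 - 1)/sqrt(2).
# E[Z1*Z2] = E[Z1^3 - Z1]/sqrt(2) = 0, yet Z2 is a function of Z1.
rng = np.random.default_rng(0)
z1 = rng.standard_normal(1_000_000)
z2 = (z1**2 - 1) / np.sqrt(2)

corr = np.mean(z1 * z2)                   # ~ 0: uncorrelated
dep = np.mean(z1**2 * z2**2) - np.mean(z1**2) * np.mean(z2**2)
print(corr, dep)                          # dep is far from 0: not independent
```

The correlation is zero to Monte Carlo precision, but the fourth-order statistic $\mathbb{E}[Z_1^2 Z_2^2] - \mathbb{E}[Z_1^2]\mathbb{E}[Z_2^2]$ equals 4 for this pair, exposing the dependence that second-order statistics miss.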
ch16-ex17
Medium: A bandlimited process with $W = 10$ Hz is observed on an interval of $T = 1$ s. Approximately how many significant eigenvalues does the KL expansion have?
Use the degrees-of-freedom formula $N \approx 2WT$.
Compute
$N \approx 2WT = 2 \times 10 \times 1 = 20$. The KL expansion has approximately 20 significant eigenvalues, with the remaining eigenvalues being negligibly small. This is consistent with the sampling theorem: 20 Nyquist-rate samples suffice to represent the process on $[0, T]$.
ch16-ex18
Hard: Let $\mathbf{X}$ be a zero-mean Gaussian random vector with covariance $\mathbf{K} = \mathbf{U}\boldsymbol{\Lambda}\mathbf{U}^T$. Show that the KL expansion reduces to the eigendecomposition and the KL coefficients are $\mathbf{Z} = \mathbf{U}^T\mathbf{X}$.
The Fredholm integral equation becomes a matrix eigenvalue problem in the discrete case.
Discrete Fredholm equation
The integral equation $\int_0^T R_X(t, s)\phi(s)\,ds = \lambda\phi(t)$ becomes $\mathbf{K}\mathbf{u} = \lambda\mathbf{u}$ in the discrete case, where $\mathbf{u}$ is an eigenvector and $\lambda$ is the eigenvalue.
Eigenvectors as KL basis
The eigenvectors (columns of $\mathbf{U}$) play the role of the eigenfunctions $\phi_n$. The KL coefficients are $Z_n = \mathbf{u}_n^T\mathbf{X}$, i.e., $\mathbf{Z} = \mathbf{U}^T\mathbf{X}$.
Verify uncorrelatedness
$\mathbb{E}[\mathbf{Z}\mathbf{Z}^T] = \mathbf{U}^T\mathbf{K}\mathbf{U} = \boldsymbol{\Lambda}$, which is diagonal. This is PCA: the eigendecomposition of $\mathbf{K}$ is the discrete KL expansion.
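The uncorrelatedness step is a one-line matrix computation; a minimal sketch with an arbitrary positive-definite covariance assumed:

```python
import numpy as np

# Discrete KL = eigendecomposition: K = U diag(lam) U^T, and Z = U^T X has
# covariance U^T K U = diag(lam), i.e., uncorrelated components.
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
K = A @ A.T + 4 * np.eye(4)               # an arbitrary positive-definite covariance

lam, U = np.linalg.eigh(K)
cov_Z = U.T @ K @ U                        # covariance of Z = U^T X
print(np.round(cov_Z, 10))
```

The off-diagonal entries vanish and the diagonal reproduces the eigenvalues, which is exactly the PCA/KL equivalence stated above.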
ch16-ex19
Challenge: Prove that the KL expansion achieves the minimum mean-square error among all $N$-term orthonormal expansions by using the Courant-Fischer min-max theorem.
The Courant-Fischer theorem states that $\lambda_N = \max_{\dim\mathcal{S} = N}\ \min_{f \in \mathcal{S},\, \|f\| = 1} \langle\mathcal{R}f, f\rangle$.
Apply this to the autocorrelation operator $(\mathcal{R}f)(t) = \int_0^T R_X(t, s)f(s)\,ds$, viewed as a compact self-adjoint operator on $L^2[0, T]$.
Formulate the problem
The $N$-term truncation error is $\varepsilon_N = \operatorname{tr}(\mathcal{R}) - \sum_{k=1}^{N}\langle\mathcal{R}\psi_k, \psi_k\rangle$, where $\mathcal{R}$ is the autocorrelation operator and $\{\psi_k\}$ is any orthonormal set. Minimizing $\varepsilon_N$ is equivalent to maximizing $\sum_{k=1}^{N}\langle\mathcal{R}\psi_k, \psi_k\rangle$.
Apply Courant-Fischer
By the Courant-Fischer theorem for compact self-adjoint operators, $\max_{\{\psi_k\}} \sum_{k=1}^{N}\langle\mathcal{R}\psi_k, \psi_k\rangle = \sum_{k=1}^{N}\lambda_k$, achieved when $\psi_k = \phi_k$, the top $N$ eigenfunctions.
Conclude optimality
Therefore $\varepsilon_N^{\mathrm{KL}} = \sum_{k > N}\lambda_k$, and any other $N$-term expansion has error $\ge \sum_{k > N}\lambda_k$.
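The optimality claim can be probed numerically in the discrete setting: for a random positive-semidefinite "autocorrelation" matrix, compare the eigenbasis truncation error against many random orthonormal $N$-sets.

```python
import numpy as np

# N-term truncation error = trace(K) - captured energy. The top-N eigenvectors
# capture the most energy; any other orthonormal N-set does no better.
rng = np.random.default_rng(2)
d, N = 12, 3
A = rng.standard_normal((d, d))
K = A @ A.T                                # stand-in for the autocorrelation operator

lam = np.sort(np.linalg.eigvalsh(K))[::-1]
kl_error = np.trace(K) - lam[:N].sum()     # optimal error: sum of tail eigenvalues

# Compare against 200 random orthonormal N-sets (Q from a QR factorization).
worse = 0
for _ in range(200):
    Q, _R = np.linalg.qr(rng.standard_normal((d, N)))
    err = np.trace(K) - np.trace(Q.T @ K @ Q)
    worse += err >= kl_error - 1e-9
print(kl_error, worse)                     # worse == 200: no set beats the eigenbasis
```

Every random basis yields an error at least $\sum_{k>N}\lambda_k$, as the Courant-Fischer (equivalently, Ky Fan) bound guarantees.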
ch16-ex20
Challenge: Consider a MIMO channel with spatial covariance $\mathbf{R} = \mathbb{E}[\mathbf{h}\mathbf{h}^H] = \mathbf{U}\boldsymbol{\Lambda}\mathbf{U}^H$ where $\mathbf{h} \in \mathbb{C}^M$. Explain how the eigendecomposition of $\mathbf{R}$ serves as a "spatial KL expansion" and why it leads to the JSDM pre-beamforming architecture.
The eigenvectors of $\mathbf{R}$ define the dominant spatial directions.
Users with similar covariance eigenspaces can be grouped and served with a common pre-beamformer.
Spatial KL expansion
The eigendecomposition expresses the channel vector in the eigenbasis: $\mathbf{h} = \mathbf{U}\boldsymbol{\Lambda}^{1/2}\mathbf{w}$, where $\mathbf{w}$ has uncorrelated components, so the $i$-th component of $\mathbf{U}^H\mathbf{h}$ has variance $\lambda_i$. This is the discrete, spatial analogue of the KL expansion.
Dimension reduction
If the eigenvalues decay rapidly ($\lambda_i \approx 0$ for $i > r$), the channel effectively lives in an $r$-dimensional subspace spanned by the $r$ dominant eigenvectors. The pre-beamformer $\mathbf{B} = [\mathbf{u}_1, \ldots, \mathbf{u}_r]$ projects onto this subspace.
JSDM architecture
In JSDM, users are grouped by covariance similarity. Each group has a pre-beamformer spanning the group's dominant eigenmodes. Since different groups occupy approximately orthogonal eigenspaces, inter-group interference is small. Within each group, a second-stage beamformer handles the remaining instantaneous CSI, but in a reduced dimension $r \ll M$, dramatically reducing feedback requirements.
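A toy numerical sketch of the pre-beamforming idea (illustrative dimensions and exactly disjoint eigenspaces assumed; this is not the JSDM scheduling algorithm itself, only the subspace-separation mechanism):

```python
import numpy as np

# Two user groups with disjoint covariance eigenspaces: the group-1 pre-beamformer
# B1 (its top-r eigenvectors) keeps group-1 channel energy but nulls group 2.
rng = np.random.default_rng(3)
M, r = 16, 4                               # illustrative antenna count and rank
U = np.linalg.qr(rng.standard_normal((M, M))
                 + 1j * rng.standard_normal((M, M)))[0]
U1, U2 = U[:, :r], U[:, r:2 * r]           # disjoint eigenspaces for groups 1, 2

# Each channel lives in its own group's r-dimensional eigenspace.
h1 = U1 @ (rng.standard_normal(r) + 1j * rng.standard_normal(r))
h2 = U2 @ (rng.standard_normal(r) + 1j * rng.standard_normal(r))

B1 = U1                                    # group-1 pre-beamformer
in_group = np.linalg.norm(B1.conj().T @ h1)   # full group-1 energy retained
cross = np.linalg.norm(B1.conj().T @ h2)      # ~ 0: orthogonal eigenspaces
print(in_group, cross)
```

The projected cross-group channel is numerically zero, which is why the second-stage beamformer only needs to operate in the reduced $r$-dimensional space.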