Definition and Finite-Dimensional Distributions
Definition: Gaussian Process
A real-valued stochastic process $\{X(t),\ t \in T\}$ is a Gaussian process if for every $n \ge 1$ and every choice of time instants $t_1, \dots, t_n \in T$, the random vector $\mathbf{X} = (X(t_1), \dots, X(t_n))^\top$ has a joint Gaussian distribution $\mathcal{N}(\boldsymbol{\mu}, \boldsymbol{\Sigma})$, where $\mu_i = E[X(t_i)]$ and $\Sigma_{ij} = \operatorname{Cov}(X(t_i), X(t_j))$.
Equivalently, every finite linear combination $\sum_{i=1}^{n} a_i X(t_i)$ is a Gaussian random variable for all $n$, all $t_1, \dots, t_n \in T$, and all $a_1, \dots, a_n \in \mathbb{R}$.
The equivalence follows because $\sum_{i=1}^{n} a_i X(t_i) = \mathbf{a}^\top \mathbf{X}$, and a random vector is jointly Gaussian if and only if every linear combination of its components is Gaussian.
Complete Characterization by First and Second Moments
A Gaussian distribution is uniquely determined by its mean and covariance. Therefore, a Gaussian process is completely characterized by two functions:
- Mean function: $m(t) = E[X(t)]$
- Covariance function: $C(t_1, t_2) = \operatorname{Cov}(X(t_1), X(t_2)) = E[(X(t_1) - m(t_1))(X(t_2) - m(t_2))]$
This is a remarkable simplification: while a general process requires the full hierarchy of finite-dimensional distributions, for a Gaussian process the mean and covariance functions tell us everything.
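Because the mean and covariance functions pin down every finite-dimensional distribution, any finite set of times can be handled with ordinary multivariate-Gaussian machinery. A minimal numerical sketch, assuming a zero mean and a hypothetical squared-exponential covariance:

```python
import numpy as np

# Hypothetical choices: zero mean and a squared-exponential kernel.
def m(t):
    return np.zeros_like(t)

def C(t1, t2, length=1.0):
    return np.exp(-0.5 * (t1 - t2) ** 2 / length ** 2)

# Any finite set of times yields an ordinary multivariate Gaussian.
t = np.array([0.0, 0.5, 1.3])
mu = m(t)                                  # mean vector: mu_i = m(t_i)
Sigma = C(t[:, None], t[None, :])          # covariance matrix: Sigma_ij = C(t_i, t_j)

rng = np.random.default_rng(0)
samples = rng.multivariate_normal(mu, Sigma, size=5)  # draws from N(mu, Sigma)
print(samples.shape)  # (5, 3)
```

The same two functions would serve for any other set of times; nothing beyond $m$ and $C$ is needed.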
Theorem: Closure of Gaussian Processes Under Linear Operations
Let $X(t)$ be a Gaussian process and let $Y = \int_a^b g(t)\, X(t)\, dt$ for a deterministic function $g(t)$ (where the integral exists in mean square). Then $Y$ is a Gaussian random variable.
More generally, if $X(t)$ is Gaussian and $Y(t) = \int h(t, s)\, X(s)\, ds$, then $Y(t)$ is also a Gaussian process.
Gaussian distributions are closed under linear operations (sums, integrals, limits in mean square). Since an integral is a limit of finite sums, and each finite sum of Gaussian RVs is Gaussian, the limit inherits Gaussianity.
Approximate the integral by a Riemann sum
Partition $[a, b]$ into $n$ subintervals with representative points $t_k$ and widths $\Delta t_k$. The Riemann sum $Y_n = \sum_{k=1}^{n} g(t_k)\, X(t_k)\, \Delta t_k$ is a finite linear combination of jointly Gaussian RVs, hence $Y_n$ is Gaussian.
Take the mean-square limit
As $n \to \infty$, $Y_n \to Y$ in mean square. A mean-square limit of Gaussian RVs is Gaussian: the characteristic function $\Phi_{Y_n}(\omega) = \exp\!\left(j\omega\mu_n - \tfrac{1}{2}\omega^2\sigma_n^2\right)$ converges pointwise to $\exp\!\left(j\omega\mu - \tfrac{1}{2}\omega^2\sigma^2\right)$, which is also a Gaussian characteristic function.
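The Riemann-sum argument can be checked numerically: the sum $Y_n = \sum_k g(t_k) X(t_k)\,\Delta t_k$ is a linear combination $\mathbf{a}^\top \mathbf{X}$ of a jointly Gaussian vector, so its variance must equal $\mathbf{a}^\top \boldsymbol{\Sigma} \mathbf{a}$ exactly. A Monte Carlo sketch, assuming a hypothetical squared-exponential covariance and $g(t) = \sin(2\pi t)$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Riemann-sum sketch of Y = \int_0^1 g(t) X(t) dt for a zero-mean GP
# with a (hypothetical) squared-exponential covariance.
n = 50
t = np.linspace(0.0, 1.0, n)
dt = t[1] - t[0]
g = np.sin(2 * np.pi * t)                          # deterministic weight g(t)
Sigma = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / 0.2 ** 2)
Sigma += 1e-9 * np.eye(n)                          # tiny jitter for numerical stability

# Y_n = sum_k g(t_k) X(t_k) dt is a linear combination a^T X of a jointly
# Gaussian vector, hence exactly Gaussian with variance a^T Sigma a.
a = g * dt
var_theory = a @ Sigma @ a

X = rng.multivariate_normal(np.zeros(n), Sigma, size=20000)
Y = X @ a
print(Y.mean(), Y.var(), var_theory)  # sample stats match N(0, a^T Sigma a)
```

The empirical mean and variance of $Y_n$ agree with the Gaussian prediction to within Monte Carlo error.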
Definition: Properties of the Covariance Function
A function $C(t_1, t_2)$ is a valid covariance function if and only if it satisfies:
- Symmetry: $C(t_1, t_2) = C(t_2, t_1)$
- Positive semi-definiteness: For all $n$, all $t_1, \dots, t_n$, and all $a_1, \dots, a_n \in \mathbb{R}$, $\sum_{i=1}^{n} \sum_{j=1}^{n} a_i a_j\, C(t_i, t_j) \ge 0$
Any function satisfying these two conditions defines a unique Gaussian process (up to the choice of mean function), by the Kolmogorov extension theorem.
Condition 2 is equivalent to requiring that every finite-dimensional covariance matrix be positive semi-definite.
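Since Condition 2 is equivalent to every finite-dimensional covariance matrix being positive semi-definite, a candidate kernel can be screened numerically by checking symmetry and the eigenvalues of a sample Gram matrix. A sketch, using a hypothetical helper `is_valid_kernel` (a necessary-condition check on one grid, not a proof of validity):

```python
import numpy as np

# Numerical screen (a sketch): a candidate kernel can be a valid covariance
# only if every finite covariance matrix it generates is symmetric PSD.
def is_valid_kernel(C, t, tol=1e-10):
    K = C(t[:, None], t[None, :])
    symmetric = np.allclose(K, K.T)
    psd = np.min(np.linalg.eigvalsh(K)) >= -tol   # smallest eigenvalue >= 0 (up to tol)
    return bool(symmetric and psd)

t = np.linspace(0, 5, 40)
rbf = lambda t1, t2: np.exp(-0.5 * (t1 - t2) ** 2)      # valid kernel
bad = lambda t1, t2: np.sin(t1 - t2)                     # antisymmetric: not valid
print(is_valid_kernel(rbf, t), is_valid_kernel(bad, t))  # True False
```

Passing the check on one grid does not certify the kernel, but failing it refutes the kernel immediately.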
Example: Is $X(t) = A\cos(\omega_0 t + \Theta)$ a Gaussian Process?
Let $X(t) = A\cos(\omega_0 t + \Theta)$ with $A \sim \mathcal{N}(0, \sigma_A^2)$ and $\Theta$ uniform on $[0, 2\pi)$, with $A$ and $\Theta$ independent. Determine whether $X(t)$ is a Gaussian process.
Check the definition
For fixed $t$, $X(t) = A\cos(\omega_0 t + \Theta)$. Since $\Theta$ is uniform and independent of $A$, the product $A \cdot \cos(\omega_0 t + \Theta)$ is not Gaussian in general: the uniform phase introduces nonlinearity.
Consider two time instants
Take $t_1 = 0$ and $t_2 = \pi/(2\omega_0)$. Then $X(t_1) = A\cos\Theta$ and $X(t_2) = -A\sin\Theta$. The components are perfectly dependent through $A$ and $\Theta$: in fact $X(t_1)^2 + X(t_2)^2 = A^2$. Their joint distribution is not Gaussian: it lives on a random curve in $\mathbb{R}^2$.
Conclusion
$X(t)$ is not a Gaussian process, even though it is WSS and its marginal distribution may look "bell-shaped" when averaged over $\Theta$. The failure is that the joint distributions are not Gaussian.
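Taking the process to be $X(t) = A\cos(\omega_0 t + \Theta)$ with Gaussian amplitude $A$ and independent uniform phase $\Theta$ (the form assumed in this reconstruction), a short simulation shows that samples of the pair $(X(t_1), X(t_2))$ lie exactly on a random circle rather than forming a Gaussian cloud:

```python
import numpy as np

rng = np.random.default_rng(2)

# Sketch of the two-time argument: with t1 = 0 and t2 = pi/(2*w0),
# X(t1) = A cos(Theta) and X(t2) = -A sin(Theta), so every sample of the
# pair lies on the circle X(t1)^2 + X(t2)^2 = A^2 -- not a Gaussian cloud.
w0 = 1.0
A = rng.normal(0.0, 1.0, size=10000)
Theta = rng.uniform(0.0, 2 * np.pi, size=10000)

x1 = A * np.cos(w0 * 0.0 + Theta)
x2 = A * np.cos(w0 * (np.pi / (2 * w0)) + Theta)   # equals -A sin(Theta)

print(np.allclose(x1**2 + x2**2, A**2))  # True: the pair lives on a random circle
```

A jointly Gaussian pair with any nondegenerate covariance would fill out an elliptical cloud; a distribution concentrated on a random circle cannot be Gaussian.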
Example: Gaussian Process from Random Fourier Coefficients
Let $A_k \sim \mathcal{N}(0, \sigma_k^2)$ for $k = 1, \dots, N$ be independent Gaussian RVs. Show that $X(t) = \sum_{k=1}^{N} A_k\, \phi_k(t)$, where the $\phi_k(t)$ are deterministic functions, is a Gaussian process. Compute its mean and covariance.
Verify Gaussianity
For any times $t_1, \dots, t_n$, the vector $(X(t_1), \dots, X(t_n))^\top$ is a linear transformation of the Gaussian vector $(A_1, \dots, A_N)^\top$. Linear transformations of Gaussian vectors are Gaussian, so $X(t)$ is a GP.
Mean function
$m(t) = E[X(t)] = \sum_{k=1}^{N} \phi_k(t)\, E[A_k] = 0$, since each $A_k$ has zero mean.
Covariance function
$C(t_1, t_2) = E[X(t_1) X(t_2)] = \sum_{k=1}^{N} \sum_{l=1}^{N} \phi_k(t_1)\, \phi_l(t_2)\, E[A_k A_l] = \sum_{k=1}^{N} \sigma_k^2\, \phi_k(t_1)\, \phi_k(t_2)$, using independence: $E[A_k A_l] = \sigma_k^2\, \delta_{kl}$.
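The covariance formula can be verified by Monte Carlo. A sketch with hypothetical basis functions $\phi_k(t) = \cos(kt)$ and assumed variances $\sigma_k^2$:

```python
import numpy as np

rng = np.random.default_rng(3)

# Monte Carlo check of C(t1,t2) = sum_k sigma_k^2 phi_k(t1) phi_k(t2)
# for hypothetical basis functions phi_k(t) = cos(k t).
N, n_mc = 4, 200000
sigma = np.array([1.0, 0.8, 0.5, 0.3])
phi = lambda k, t: np.cos(k * t)

t1, t2 = 0.7, 1.9
A = rng.normal(0.0, sigma, size=(n_mc, N))     # independent A_k ~ N(0, sigma_k^2)
X1 = sum(A[:, k] * phi(k + 1, t1) for k in range(N))
X2 = sum(A[:, k] * phi(k + 1, t2) for k in range(N))

k = np.arange(1, N + 1)
C_emp = np.mean(X1 * X2)                                 # empirical covariance
C_theory = np.sum(sigma**2 * phi(k, t1) * phi(k, t2))    # formula from the derivation
print(C_emp, C_theory)  # the two agree to within Monte Carlo error
```

The agreement holds for any pair of times, since the derivation used only linearity and the independence of the $A_k$.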
Theorem: WSS Gaussian Process Is Strictly Stationary
If $X(t)$ is a Gaussian process that is wide-sense stationary (WSS), i.e., $m(t) = m$ is constant and $C(t_1, t_2) = C(t_1 - t_2)$ depends only on the lag $\tau = t_1 - t_2$, then $X(t)$ is strictly stationary.
A Gaussian distribution is completely determined by its mean and covariance. If these are time-shift invariant (WSS), then all finite-dimensional distributions are time-shift invariant, which is the definition of strict stationarity. For non-Gaussian processes, WSS does not imply strict stationarity because higher-order moments could still depend on absolute time.
Write the finite-dimensional distributions
For any $t_1, \dots, t_n$, the vector $(X(t_1), \dots, X(t_n))^\top$ is $\mathcal{N}(m\mathbf{1}, \boldsymbol{\Sigma})$ where $\Sigma_{ij} = C(t_i - t_j)$.
Apply a time shift
Shift all times by $s$: the vector $(X(t_1 + s), \dots, X(t_n + s))^\top$ is $\mathcal{N}(m\mathbf{1}, \boldsymbol{\Sigma}')$ with $\Sigma'_{ij} = C\big((t_i + s) - (t_j + s)\big) = C(t_i - t_j) = \Sigma_{ij}$.
Conclude
Since $\boldsymbol{\Sigma}' = \boldsymbol{\Sigma}$ and the mean is unchanged, the two distributions are identical. This holds for all $n$, all $t_1, \dots, t_n$, and all $s$, so $X(t)$ is strictly stationary.
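The key step, that a common time shift leaves every finite-dimensional covariance matrix unchanged when $C$ depends only on the lag, is easy to confirm numerically. A sketch with a hypothetical WSS kernel $C(\tau) = e^{-|\tau|}$:

```python
import numpy as np

# For a WSS covariance C(tau), the finite-dimensional covariance matrix
# Sigma_ij = C(t_i - t_j) is unchanged by a common time shift s.
C = lambda tau: np.exp(-np.abs(tau))          # hypothetical WSS kernel

t = np.array([0.1, 0.9, 2.3, 4.0])
s = 7.5
Sigma = C(t[:, None] - t[None, :])                       # original times
Sigma_shifted = C((t + s)[:, None] - (t + s)[None, :])   # all times shifted by s
print(np.allclose(Sigma, Sigma_shifted))  # True: the shift cancels in every lag
```

Combined with the constant mean, this is exactly why the shifted and unshifted Gaussian distributions coincide.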
Common Mistake: WSS Does Not Imply Strict Stationarity in General
Mistake:
Assuming that any WSS process is strictly stationary.
Correction:
This implication holds only for Gaussian processes. For a general process, WSS constrains only the first two moments; higher-order statistics (skewness, kurtosis, etc.) could still vary with time. A standard counterexample: let $X_n$ be a sequence of independent RVs with $X_n \sim \mathcal{N}(0, 1)$ for even $n$ and $X_n = \pm 1$ equiprobable for odd $n$. Every $X_n$ has mean 0 and variance 1, and the samples are uncorrelated, so the process is WSS; but the marginal distribution of $X_n$ depends on the parity of $n$, so the process is not strictly stationary. (Note that simply replacing the distribution of an i.i.d. sequence is not enough: any i.i.d. process is strictly stationary regardless of its distribution.)
Sample Paths of a Gaussian Process
Draw sample paths from a zero-mean GP with different covariance kernels. Observe how the kernel controls smoothness, correlation length, and sample variability.
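The experiment can be sketched in code: draw sample paths via a Cholesky factor of the covariance matrix, here for a hypothetical squared-exponential kernel at two length scales (a small diagonal jitter is added for numerical stability):

```python
import numpy as np

rng = np.random.default_rng(5)

# Sketch of the sample-path experiment: draw paths of a zero-mean GP for
# two hypothetical squared-exponential kernels with different length scales.
t = np.linspace(0.0, 5.0, 200)

def draw_paths(length, n_paths=3, jitter=1e-6):
    K = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / length ** 2)
    L = np.linalg.cholesky(K + jitter * np.eye(len(t)))   # jitter for stability
    return L @ rng.normal(size=(len(t), n_paths))          # paths as columns

smooth = draw_paths(length=1.0)    # long correlation length: slowly varying paths
rough = draw_paths(length=0.1)     # short correlation length: wiggly paths
print(smooth.shape, rough.shape)   # (200, 3) each
```

Plotting the columns of `smooth` and `rough` against `t` makes the effect of the length scale visible: the kernel alone controls how rapidly the paths wiggle.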
Quick Check
A Gaussian process is completely characterized by which of the following?
Its mean function only
Its mean function and covariance function
All of its finite-dimensional PDFs
Its autocorrelation function and PSD
A Gaussian distribution is uniquely determined by its mean and covariance. Since all finite-dimensional distributions of a GP are Gaussian, specifying the mean and covariance functions determines everything.
Gaussian Process
A stochastic process such that every finite-dimensional marginal is jointly Gaussian. Completely determined by its mean and covariance functions.
Related: Wide-Sense Stationary (WSS), Covariance Function (Kernel)
Covariance Function (Kernel)
A function $C(t_1, t_2)$ that must be symmetric and positive semi-definite. For a WSS process, $C(t_1, t_2) = C(\tau)$ depends only on the time lag $\tau = t_1 - t_2$.
Related: Gaussian Process, Positive Semi-Definite (Function)
Positive Semi-Definite (Function)
A function $C(\tau)$ is positive semi-definite if for all $n$, all $t_1, \dots, t_n$, and all $a_1, \dots, a_n \in \mathbb{R}$, we have $\sum_{i=1}^{n} \sum_{j=1}^{n} a_i a_j\, C(t_i - t_j) \ge 0$. Equivalently, its Fourier transform (the PSD) is nonnegative.
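The Fourier-transform criterion can be checked numerically: sample the candidate $C(\tau)$ on a symmetric grid and confirm the discrete transform is nonnegative. A sketch for the squared-exponential covariance (assumed here as the example kernel):

```python
import numpy as np

# Sketch of the Fourier-transform criterion: sample a candidate covariance
# C(tau) on a symmetric grid and check that its transform (the PSD) is >= 0.
N, dt = 1024, 0.05
tau = (np.arange(N) - N // 2) * dt
C = np.exp(-0.5 * tau**2)                            # RBF: a valid covariance

# ifftshift moves tau = 0 to index 0 so the FFT approximates the PSD.
S = np.real(np.fft.fft(np.fft.ifftshift(C))) * dt
print(S.min() >= -1e-6)  # True: nonnegative up to numerical error
```

For the squared-exponential kernel the PSD is itself a Gaussian in frequency, so the computed spectrum is positive everywhere, matching the criterion.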
Related: Covariance Function (Kernel)
Historical Note: Kolmogorov and the Foundation of Stochastic Processes
1930s: Andrey Kolmogorov's 1933 monograph Grundbegriffe der Wahrscheinlichkeitsrechnung (Foundations of Probability Theory) placed probability on a rigorous measure-theoretic footing. His extension theorem guarantees the existence of a stochastic process with any consistent family of finite-dimensional distributions. For Gaussian processes, this means that specifying a mean function and a positive semi-definite covariance function is sufficient to construct the process, a fact we use freely throughout this chapter.
Key Takeaway
A Gaussian process is the simplest infinite-dimensional generalization of the multivariate Gaussian distribution: two functions, the mean and the covariance, completely determine all statistical properties. This is why Gaussian models dominate engineering: they are rich enough to model complex phenomena yet tractable enough for closed-form analysis.