Definition and Classification
From Random Variables to Random Functions
In Parts I--III we studied random variables: a single measurement, or a finite collection of measurements. But the signals we encounter in communications (thermal noise at a receiver, a fading channel gain, a transmitted waveform) are functions of time, each realization producing an entire waveform. A stochastic process is the mathematical object that captures this: a random variable that takes values not in $\mathbb{R}$ or $\mathbb{C}$, but in a space of functions. The challenge, and the beauty, is that we can still characterize such objects through their finite-dimensional projections.
Definition: Stochastic Process (Definition 48)
Stochastic Process (Definition 48)
Let $(\Omega, \mathcal{F}, P)$ be a probability space. A stochastic process is a family of random variables $\{X(t) : t \in \mathcal{T}\}$ indexed by a set $\mathcal{T}$, where each $X(t)$ is an $\mathcal{F}$-measurable mapping $X(t) : \Omega \to S$ with $S = \mathbb{R}$ or $S = \mathbb{C}$.
Equivalently, a stochastic process is a function of two variables:
$$X : \mathcal{T} \times \Omega \to S, \qquad (t, \omega) \mapsto X(t, \omega).$$
For fixed $t$, $X(t, \cdot)$ is a random variable. For fixed $\omega$, $X(\cdot, \omega)$ is a deterministic function of $t$ called a sample path or realization.
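The two views of $X(t, \omega)$ can be made concrete in a short numerical sketch (a hypothetical illustration using NumPy; the array layout and sizes are our own choices, not part of the definition):

```python
import numpy as np

rng = np.random.default_rng(0)

# Store a discrete-time process as a 2D array: rows index outcomes
# omega, columns index time t. Each row is one sample path; each
# column is one random variable evaluated across the ensemble.
n_paths, n_times = 4, 6
X = rng.standard_normal((n_paths, n_times))

x_fixed_t = X[:, 2]      # fixed t: a random variable (varies with omega)
x_fixed_omega = X[1, :]  # fixed omega: one deterministic sample path
```

Slicing along a column gives the "random variable" view; slicing along a row gives the "sample path" view.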
Definition: Classification by Index Set and State Space
Classification by Index Set and State Space
A stochastic process is classified by:
- Index set $\mathcal{T}$:
  - Continuous-time (CT): $\mathcal{T}$ is an interval (e.g., $\mathcal{T} = \mathbb{R}$ or $[0, \infty)$).
  - Discrete-time (DT): $\mathcal{T}$ is countable (e.g., $\mathbb{Z}$ or $\mathbb{N}$). We write $X[n]$ instead of $X(t)$.
- State space $S$:
  - Continuous-valued: $S = \mathbb{R}$ or $S = \mathbb{C}$.
  - Discrete-valued: $S$ is countable (e.g., $S = \{0, 1\}$).
The Four Classes of Stochastic Processes
Combining the two classifications yields four types:
|  | Continuous-valued | Discrete-valued |
|---|---|---|
| CT | Thermal noise | Poisson counting process |
| DT | Sampled fading | Binary sequence |
In communications, we most frequently encounter CT continuous-valued processes (noise, channel fading) and DT continuous-valued processes (sampled signals, symbol sequences after modulation).
Definition: Sample Path (Realization)
Sample Path (Realization)
For a fixed outcome $\omega \in \Omega$, the function $t \mapsto X(t, \omega)$ is called a sample path, realization, or trajectory of the process. The set of all sample paths is the ensemble of the process.
A single oscilloscope trace of receiver noise is one sample path. The statistical properties of the process describe the behavior across the entire ensemble of possible traces.
Sample Path Realizations
Visualize sample paths of different stochastic process types: i.i.d. sequence, moving average, and random sinusoid. Each "Generate" changes the random seed.
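The three process types in the visualization above can be generated in a few lines (a sketch with assumed parameters: window length 5 for the moving average, frequency 0.02 cycles/sample and Rayleigh amplitude for the sinusoid):

```python
import numpy as np

rng = np.random.default_rng(42)  # changing the seed plays the role of "Generate"
n = 200
t = np.arange(n)

# (1) i.i.d. sequence: independent standard normal samples
iid = rng.standard_normal(n)

# (2) Moving average of the i.i.d. sequence: a sliding-window average
# introduces correlation between nearby time indices
window = 5
ma = np.convolve(iid, np.ones(window) / window, mode="same")

# (3) Random sinusoid: amplitude and phase are drawn once per path;
# the whole path is then a deterministic function of t
A = rng.rayleigh()
theta = rng.uniform(0, 2 * np.pi)
sinusoid = A * np.cos(2 * np.pi * 0.02 * t + theta)
```

Plotting the three arrays against `t` reproduces the qualitative behavior of the demo: jagged for the i.i.d. sequence, smoothed for the moving average, and a single deterministic oscillation per draw for the sinusoid.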
Definition: Finite-Dimensional Distributions (fdds)
Finite-Dimensional Distributions (fdds)
The finite-dimensional distributions (fdds) of a process $\{X(t) : t \in \mathcal{T}\}$ are the collection of all joint CDFs
$$F_{X(t_1), \ldots, X(t_n)}(x_1, \ldots, x_n) = P\big(X(t_1) \le x_1, \ldots, X(t_n) \le x_n\big)$$
for every $n \in \mathbb{N}$ and every choice of time indices $t_1, \ldots, t_n \in \mathcal{T}$.
The fdds completely characterize the probabilistic behavior of the process. In principle, knowing all fdds tells us everything about the process; in practice, we rarely have access to distributions of order higher than two.
Theorem: Kolmogorov Consistency Conditions
A collection of joint CDFs $\{F_{X(t_1), \ldots, X(t_n)}\}$ is the family of fdds of some stochastic process if and only if it satisfies:
- Symmetry: For any permutation $\pi$ of $\{1, \ldots, n\}$,
$$F_{X(t_{\pi(1)}), \ldots, X(t_{\pi(n)})}(x_{\pi(1)}, \ldots, x_{\pi(n)}) = F_{X(t_1), \ldots, X(t_n)}(x_1, \ldots, x_n).$$
- Compatibility (marginalization):
$$\lim_{x_n \to \infty} F_{X(t_1), \ldots, X(t_n)}(x_1, \ldots, x_n) = F_{X(t_1), \ldots, X(t_{n-1})}(x_1, \ldots, x_{n-1}).$$
The first condition says that relabeling the time indices just relabels the arguments. The second says that "forgetting" the last time point by letting $x_n \to \infty$ gives back the lower-dimensional distribution. These are the minimal consistency requirements for a coherent probabilistic description.
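The compatibility condition can be checked numerically for a concrete pair of jointly Gaussian samples (a Monte Carlo sketch; the correlation 0.8 and the evaluation point 0.5 are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)

# Jointly Gaussian pair (X(t1), X(t2)) with correlation 0.8
n = 200_000
z = rng.standard_normal((n, 2))
x1 = z[:, 0]
x2 = 0.8 * z[:, 0] + 0.6 * z[:, 1]

a = 0.5
# Empirical joint CDF with the second argument pushed toward infinity
F_joint = np.mean((x1 <= a) & (x2 <= 50.0))
# Empirical one-dimensional marginal CDF
F_marginal = np.mean(x1 <= a)

# Letting the last argument grow recovers the lower-dimensional CDF
assert F_joint == F_marginal
```

With the second threshold far in the tail, every sample satisfies it, so the empirical joint CDF collapses exactly to the empirical marginal, mirroring the limit $x_2 \to \infty$.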
Necessity
If the fdds come from a process on $(\Omega, \mathcal{F}, P)$, both conditions follow directly from the properties of joint CDFs.
Sufficiency (Kolmogorov Extension Theorem)
This is the deep direction. Given consistent fdds, Kolmogorov's extension theorem guarantees the existence of a probability space and a process with those fdds. The proof constructs the process on the product space $\mathbb{R}^{\mathcal{T}}$ using the Daniell-Kolmogorov extension. We omit the full measure-theoretic construction, which requires the theory of projective limits.
Example: The Random Sinusoid
Let $X(t) = A \cos(2\pi \nu_0 t + \Theta)$, where $A$ is a positive random variable and $\Theta \sim \mathrm{Unif}[0, 2\pi)$ is independent of $A$. Find the mean function $\mu_X(t)$ and the autocorrelation function $r_{XX}(t_1, t_2)$ of $X(t)$.
Mean function
By independence of $A$ and $\Theta$, $\mu_X(t) = \mathbb{E}[A]\,\mathbb{E}[\cos(2\pi \nu_0 t + \Theta)]$. Since $\Theta$ is uniform, $\mathbb{E}[\cos(2\pi \nu_0 t + \Theta)] = \frac{1}{2\pi}\int_0^{2\pi} \cos(2\pi \nu_0 t + \theta)\, d\theta = 0$, so $\mu_X(t) = 0$ for all $t$.
Autocorrelation
$r_{XX}(t_1, t_2) = \mathbb{E}[A^2]\,\mathbb{E}[\cos(2\pi \nu_0 t_1 + \Theta)\cos(2\pi \nu_0 t_2 + \Theta)]$. Applying the identity $\cos\alpha\cos\beta = \frac{1}{2}[\cos(\alpha - \beta) + \cos(\alpha + \beta)]$, the sum-frequency term averages to zero over $\Theta$. Writing $\tau = t_1 - t_2$, we obtain $r_{XX}(\tau) = \frac{\mathbb{E}[A^2]}{2}\cos(2\pi \nu_0 \tau)$.
Interpretation
The random sinusoid with uniform phase has zero mean and an autocorrelation that depends only on the time difference $\tau = t_1 - t_2$. As we will see in Β§13.2, this makes it a wide-sense stationary process. The average power is $r_{XX}(0) = \mathbb{E}[A^2]/2$.
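Both results can be verified by Monte Carlo averaging over the ensemble (a sketch; the Rayleigh amplitude, the frequency, and the two time points are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)

nu0 = 5.0                                  # assumed frequency
n = 500_000                                # number of outcomes omega
A = rng.rayleigh(size=n)                   # positive random amplitude
theta = rng.uniform(0, 2 * np.pi, size=n)  # uniform phase, independent of A

def X(t):
    """Evaluate all n sample paths of the random sinusoid at time t."""
    return A * np.cos(2 * np.pi * nu0 * t + theta)

t1, t2 = 0.13, 0.05
tau = t1 - t2

mean_est = X(t1).mean()                    # ensemble mean: should be near 0
r_est = np.mean(X(t1) * X(t2))             # ensemble autocorrelation estimate
r_theory = np.mean(A**2) / 2 * np.cos(2 * np.pi * nu0 * tau)
```

The estimates `mean_est` and `r_est` agree with $\mu_X(t) = 0$ and $\frac{\mathbb{E}[A^2]}{2}\cos(2\pi\nu_0\tau)$ to within Monte Carlo error.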
Example: i.i.d. Processes
Let $\{X[n]\}$ be an i.i.d. sequence with $\mathbb{E}[X[n]] = \mu$ and $\operatorname{Var}(X[n]) = \sigma^2$. Find the autocorrelation $r_X[n, m] = \mathbb{E}[X[n]X[m]]$.
Diagonal vs. off-diagonal
For $n = m$: $r_X[n, n] = \mathbb{E}[X[n]^2] = \operatorname{Var}(X[n]) + (\mathbb{E}[X[n]])^2 = \sigma^2 + \mu^2$.
For $n \neq m$: since $X[n]$ and $X[m]$ are independent, $r_X[n, m] = \mathbb{E}[X[n]]\,\mathbb{E}[X[m]] = \mu^2$.
Compact form
Combining both cases, $r_X[n, m] = \mu^2 + \sigma^2\,\delta[n - m]$, where $\delta[k]$ is the Kronecker delta. The autocorrelation depends only on the difference $n - m$, so the i.i.d. process is WSS (in fact, it is strictly stationary by construction).
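A quick ensemble check of both cases (a sketch with assumed values $\mu = 1.5$, $\sigma = 2$; the time indices 3 and 6 are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)

mu, sigma = 1.5, 2.0                 # assumed mean and standard deviation
n_paths, n_times = 100_000, 8
# Each row is one sample path of an i.i.d. Gaussian sequence
X = mu + sigma * rng.standard_normal((n_paths, n_times))

# Ensemble estimates of r_X[n, m] = E[X[n] X[m]]
r_diag = np.mean(X[:, 3] * X[:, 3])  # n = m: expect sigma^2 + mu^2
r_off = np.mean(X[:, 3] * X[:, 6])   # n != m: expect mu^2
```

With these values, `r_diag` is close to $\sigma^2 + \mu^2 = 6.25$ and `r_off` close to $\mu^2 = 2.25$.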
Definition: Second-Order Process
Second-Order Process
A process $\{X(t)\}$ is called a second-order process if $\mathbb{E}[|X(t)|^2] < \infty$ for all $t \in \mathcal{T}$.
Second-order processes are the natural setting for correlation analysis. The autocorrelation $r_X(t_1, t_2) = \mathbb{E}[X(t_1)X^*(t_2)]$ is well-defined whenever $\mathbb{E}[|X(t_1)|^2] < \infty$ and $\mathbb{E}[|X(t_2)|^2] < \infty$, by the Cauchy-Schwarz inequality.
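The Cauchy-Schwarz bound $|r_X(t_1, t_2)| \le \sqrt{\mathbb{E}[|X(t_1)|^2]\,\mathbb{E}[|X(t_2)|^2]}$ also holds for empirical averages, which a short check illustrates (the correlated pair below is an arbitrary construction for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)

n = 100_000
x1 = rng.standard_normal(n)              # samples of X(t1)
x2 = 0.5 * x1 + rng.standard_normal(n)   # correlated samples of X(t2)

r = np.mean(x1 * x2)                              # autocorrelation estimate
bound = np.sqrt(np.mean(x1**2) * np.mean(x2**2))  # Cauchy-Schwarz bound
```

The inequality `abs(r) <= bound` holds deterministically for any sample, since it is just the Cauchy-Schwarz inequality applied to the two sample vectors.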
Quick Check
A Poisson counting process $N(t)$, where $t \in [0, \infty)$ and $N(t) \in \{0, 1, 2, \ldots\}$, is classified as:
Continuous-time, continuous-valued
Continuous-time, discrete-valued
Discrete-time, continuous-valued
Discrete-time, discrete-valued
The index $t \in [0, \infty)$ is continuous, and $N(t)$ takes values in the countable set $\{0, 1, 2, \ldots\}$.
Quick Check
For the random sinusoid $X(t) = A\cos(2\pi\nu_0 t + \Theta)$ with random $A$ and $\Theta$, a sample path is obtained by:
Fixing $t$ and varying $\omega$
Fixing $\omega$ and varying $t$
Averaging over all $\omega$
A sample path is a deterministic function of $t$ for a fixed realization $\omega$, i.e., for fixed values of $A$ and $\Theta$.
Historical Note: Kolmogorov and the Foundations of Stochastic Processes
1930s: Andrey Kolmogorov's 1933 monograph Grundbegriffe der Wahrscheinlichkeitsrechnung ("Foundations of Probability Theory") placed probability on rigorous measure-theoretic ground and, as a byproduct, provided the extension theorem that guarantees the existence of stochastic processes from consistent finite-dimensional distributions. This resolved a fundamental question: does a "random function" actually exist as a mathematical object? Kolmogorov showed that it does, provided the fdds are consistent. His framework remains the standard foundation for all of modern probability and stochastic process theory.
Stochastic Process
A family of random variables indexed by a parameter set $\mathcal{T}$, typically representing time.
Related: Sample Path, Finite-Dimensional Distributions (fdds)
Sample Path
A single realization of a stochastic process, obtained by fixing the outcome $\omega$ and viewing the process as a deterministic function of time.
Related: Stochastic Process
Finite-Dimensional Distributions (fdds)
The collection of all joint distributions of the process evaluated at finitely many time indices. The fdds completely characterize the process.
Related: Stochastic Process
Common Mistake: A Process Is Not Just a Sequence of Random Variables
Mistake:
Treating a stochastic process as merely a collection of independent random variables, ignoring the dependence structure across time.
Correction:
The joint behavior across time indices, captured by the fdds, is the essence of a stochastic process. A process with independent time samples (i.i.d.) is a very special case. In general, $X(t_1)$ and $X(t_2)$ are correlated, and this correlation is what makes the process useful for modeling physical phenomena like fading channels.
Why This Matters: The Wireless Channel as a Stochastic Process
A wireless channel gain $h(t)$ between a transmitter and receiver is a stochastic process: at each time $t$, the gain $h(t)$ is a complex random variable whose realization depends on the scattering environment and the relative motion of transmitter and receiver. A single "channel trace" measured with a channel sounder is one sample path. The statistical description of $h(t)$, through its fdds and, more practically, its autocorrelation function, determines the coherence time and shapes the design of pilot spacing, coding, and feedback protocols.
See full treatment in Chapter 14
Key Takeaway
A stochastic process is a random function of time, completely characterized by its finite-dimensional distributions. For engineering purposes, we almost always work with the first- and second-order statistics (mean and autocorrelation), which motivates the WSS framework developed in the following sections.