Definition and Classification

From Random Variables to Random Functions

In Parts I--III we studied random variables: a single measurement, or a finite collection of measurements. But the signals we encounter in communications (thermal noise at a receiver, a fading channel gain, a transmitted waveform) are functions of time, each realization producing an entire waveform. A stochastic process is the mathematical object that captures this: a random variable that takes values not in $\mathbb{R}$ or $\mathbb{R}^n$, but in a space of functions. The challenge, and the beauty, is that we can still characterize such objects through their finite-dimensional projections.

Definition:

Stochastic Process (Definition 48)

Let $(\Omega, \mathcal{F}, \mathbb{P})$ be a probability space. A stochastic process is a family of random variables $\{X(t) : t \in \mathcal{T}\}$ indexed by a set $\mathcal{T}$, where each $X(t)$ is an $\mathcal{F}$-measurable mapping $X(t) : \Omega \to \mathbb{K}$ with $\mathbb{K} = \mathbb{R}$ or $\mathbb{C}$.

Equivalently, a stochastic process is a function of two variables:
$$X : \mathcal{T} \times \Omega \to \mathbb{K}, \quad (t, \omega) \mapsto X(t, \omega).$$

For fixed $t$, $X(t, \cdot)$ is a random variable. For fixed $\omega$, $X(\cdot, \omega)$ is a deterministic function called a sample path or realization.
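The two views of $X(t, \omega)$ can be made concrete in a short simulation. The sketch below is illustrative, not part of the definition: it uses a random sinusoid with an assumed Rayleigh amplitude and uniform phase. Fixing $\omega$ and sweeping $t$ gives a sample path; fixing $t$ and redrawing $\omega$ gives samples of a random variable.

```python
import numpy as np

# A process is a function of two variables (t, omega). As a concrete
# (illustrative) example we use a random sinusoid A*cos(2*pi*nu0*t + Theta);
# the Rayleigh amplitude is an assumed distribution, not prescribed here.
rng = np.random.default_rng(0)
nu0 = 1.0  # assumed frequency

def draw_omega(rng):
    """Draw one outcome omega, represented here by one (A, Theta) pair."""
    A = rng.rayleigh(scale=1.0)           # assumed amplitude law
    Theta = rng.uniform(0.0, 2 * np.pi)   # uniform phase
    return A, Theta

def X(t, omega):
    """X(t, omega): the process evaluated at time t for outcome omega."""
    A, Theta = omega
    return A * np.cos(2 * np.pi * nu0 * t + Theta)

# Fix omega, vary t: one deterministic sample path.
omega0 = draw_omega(rng)
t = np.linspace(0.0, 2.0, 200)
path = X(t, omega0)

# Fix t, vary omega: a random variable (one number per outcome).
samples_at_half = np.array([X(0.5, draw_omega(rng)) for _ in range(1000)])
```

The array `path` is one trajectory; the array `samples_at_half` is a histogram-ready set of draws of the random variable $X(0.5)$.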

Definition:

Classification by Index Set and State Space

A stochastic process $\{X(t) : t \in \mathcal{T}\}$ is classified by:

  • Index set $\mathcal{T}$:

    • Continuous-time (CT): $\mathcal{T} \subseteq \mathbb{R}$ (e.g., $\mathcal{T} = [0, \infty)$ or $\mathbb{R}$).
    • Discrete-time (DT): $\mathcal{T} \subseteq \mathbb{Z}$ (e.g., $\mathcal{T} = \mathbb{Z}$ or $\{0, 1, 2, \ldots\}$). We write $X_n$ instead of $X(n)$.
  • State space $\mathbb{K}$:

    • Continuous-valued: $\mathbb{K} = \mathbb{R}$ or $\mathbb{C}$.
    • Discrete-valued: $\mathbb{K}$ is countable (e.g., $\{0, 1\}$).

The Four Classes of Stochastic Processes

Combining the two classifications yields four types:

|        | Continuous-valued    | Discrete-valued                   |
|--------|----------------------|-----------------------------------|
| **CT** | Thermal noise $N(t)$ | Poisson counting process $N(t)$   |
| **DT** | Sampled fading $H_n$ | Binary sequence $X_n \in \{0,1\}$ |

In communications, we most frequently encounter CT continuous-valued processes (noise, channel fading) and DT continuous-valued processes (sampled signals, symbol sequences after modulation).

Definition:

Sample Path (Realization)

For a fixed outcome $\omega \in \Omega$, the function $t \mapsto X(t, \omega)$ is called a sample path, realization, or trajectory of the process. The set of all sample paths is the ensemble of the process.

A single oscilloscope trace of receiver noise is one sample path. The statistical properties of the process describe the behavior across the entire ensemble of possible traces.
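The trace-versus-ensemble distinction is easy to see numerically. The sketch below (an illustrative model: i.i.d. zero-mean, unit-variance Gaussian noise) stacks many "oscilloscope traces" into a matrix, with one row per sample path.

```python
import numpy as np

# An ensemble of 500 "oscilloscope traces" of receiver noise, modeled
# (illustratively) as i.i.d. zero-mean, unit-variance Gaussian samples.
rng = np.random.default_rng(1)
n_paths, n_samples = 500, 256
ensemble = rng.normal(0.0, 1.0, size=(n_paths, n_samples))

one_trace = ensemble[0]            # a single sample path (one outcome omega)
mean_at_t = ensemble.mean(axis=0)  # ensemble average at each time index
var_at_t = ensemble.var(axis=0)    # ensemble variance at each time index
```

The ensemble mean hovers near 0 and the ensemble variance near 1 at every time index; no single trace reveals this, which is exactly the distinction between one sample path and the ensemble.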

Sample Path Realizations

[Interactive figure: sample paths of different stochastic process types — an i.i.d. sequence, a moving average, and a random sinusoid.]

Definition:

Finite-Dimensional Distributions (fdds)

The finite-dimensional distributions (fdds) of a process $\{X(t) : t \in \mathcal{T}\}$ are the collection of all joint CDFs
$$F_{t_1, \ldots, t_N}(x_1, \ldots, x_N) = \mathbb{P}(X(t_1) \leq x_1, \ldots, X(t_N) \leq x_N)$$
for every $N \geq 1$ and every choice of time indices $t_1, \ldots, t_N \in \mathcal{T}$.

The fdds completely characterize the probabilistic behavior of the process. In principle, knowing all fdds tells us everything about the process; in practice, we rarely have access to distributions of order higher than two.

Theorem: Kolmogorov Consistency Conditions

A collection of joint CDFs $\{F_{t_1, \ldots, t_N}\}$ is the family of fdds of some stochastic process if and only if it satisfies:

  1. Symmetry: For any permutation $\sigma$ of $\{1, \ldots, N\}$, $F_{t_{\sigma(1)}, \ldots, t_{\sigma(N)}}(x_{\sigma(1)}, \ldots, x_{\sigma(N)}) = F_{t_1, \ldots, t_N}(x_1, \ldots, x_N)$.

  2. Marginalization: $\lim_{x_N \to \infty} F_{t_1, \ldots, t_N}(x_1, \ldots, x_N) = F_{t_1, \ldots, t_{N-1}}(x_1, \ldots, x_{N-1})$.

The first condition says that relabeling the time indices just relabels the arguments. The second says that "forgetting" the last time point by letting $x_N \to \infty$ gives back the lower-dimensional distribution. These are the minimal consistency requirements for a coherent probabilistic description.
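The second condition can be checked numerically. A minimal Monte Carlo sketch, under illustrative assumptions (a random sinusoid with Rayleigh amplitude and uniform phase, time points $t_1 = 0.1$ and $t_2 = 0.7$): pushing the second argument of the empirical two-point CDF to a huge value recovers the empirical one-point CDF.

```python
import numpy as np

# Monte Carlo illustration of the marginalization condition on a toy
# process X(t) = A*cos(2*pi*t + Theta). The process, its distributions,
# and the time points t1 = 0.1, t2 = 0.7 are all illustrative assumptions.
rng = np.random.default_rng(2)
n = 20000
A = rng.rayleigh(1.0, n)
Theta = rng.uniform(0.0, 2 * np.pi, n)

def X(t):
    return A * np.cos(2 * np.pi * t + Theta)

x_t1, x_t2 = X(0.1), X(0.7)
x1 = 0.5  # where we evaluate the CDF in its first argument

# One-dimensional empirical CDF F_{t1}(x1).
F1 = np.mean(x_t1 <= x1)

# Two-dimensional empirical CDF F_{t1,t2}(x1, x2) with x2 pushed to a huge
# value; this recovers F_{t1}(x1) (exactly here, since 1e9 exceeds every
# simulated sample).
F12_limit = np.mean((x_t1 <= x1) & (x_t2 <= 1e9))
```

Here `F12_limit` equals `F1`, the empirical counterpart of the marginalization condition.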

Example: The Random Sinusoid

Let $X(t) = A \cos(2\pi \nu_0 t + \Theta)$, where $A$ is a positive random variable and $\Theta \sim \text{Uniform}[0, 2\pi)$ is independent of $A$. Find the mean function and the autocorrelation function of $X(t)$.
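A worked solution, using only the independence of $A$ and $\Theta$, the fact that a cosine with a $\text{Uniform}[0, 2\pi)$ phase averages to zero, and the product-to-sum identity $\cos\alpha\cos\beta = \tfrac{1}{2}[\cos(\alpha - \beta) + \cos(\alpha + \beta)]$:

```latex
% Mean function: independence, then the uniform phase kills the cosine.
\mathbb{E}[X(t)] = \mathbb{E}[A]\,\mathbb{E}[\cos(2\pi\nu_0 t + \Theta)]
                = \mathbb{E}[A] \cdot 0 = 0.

% Autocorrelation: independence, then product-to-sum.
r_{XX}(t_1, t_2)
  = \mathbb{E}[A^2]\,\mathbb{E}[\cos(2\pi\nu_0 t_1 + \Theta)\cos(2\pi\nu_0 t_2 + \Theta)]
  = \frac{\mathbb{E}[A^2]}{2}\Big(
      \cos\big(2\pi\nu_0 (t_1 - t_2)\big)
      + \mathbb{E}\big[\cos\big(2\pi\nu_0 (t_1 + t_2) + 2\Theta\big)\big]
    \Big)
  = \frac{\mathbb{E}[A^2]}{2}\cos\big(2\pi\nu_0 (t_1 - t_2)\big).
```

The mean is constant and the autocorrelation depends only on the lag $t_1 - t_2$: a first glimpse of the wide-sense stationarity developed in the following sections.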

Example: i.i.d. Processes

Let $\{X_n : n \in \mathbb{Z}\}$ be an i.i.d. sequence with $\mathbb{E}[X_n] = \mu$ and $\mathbb{E}[X_n^2] = \sigma^2 + \mu^2$. Find the autocorrelation $r_{xx}[n, m]$.
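For $n \neq m$, independence gives $\mathbb{E}[X_n X_m] = \mathbb{E}[X_n]\mathbb{E}[X_m] = \mu^2$; for $n = m$, $\mathbb{E}[X_n^2] = \sigma^2 + \mu^2$. Compactly, $r_{xx}[n, m] = \sigma^2\,\delta[n - m] + \mu^2$. A quick Monte Carlo sanity check (the Gaussian marginal with $\mu = 2$, $\sigma = 1$ is an illustrative assumption):

```python
import numpy as np

# Check r_xx[n, m] = sigma^2 * delta[n - m] + mu^2 for an i.i.d. sequence.
# Illustrative marginal: Gaussian with mu = 2, sigma = 1.
rng = np.random.default_rng(3)
mu, sigma = 2.0, 1.0
n_trials = 200_000
Xn = rng.normal(mu, sigma, n_trials)  # draws of X_n
Xm = rng.normal(mu, sigma, n_trials)  # independent draws of X_m, m != n

r_same = float(np.mean(Xn * Xn))  # n == m: near sigma^2 + mu^2 = 5
r_diff = float(np.mean(Xn * Xm))  # n != m: near mu^2 = 4
```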

Definition:

Second-Order Process

A process $\{X(t) : t \in \mathcal{T}\}$ is called a second-order process if $\mathbb{E}[|X(t)|^2] < \infty$ for all $t \in \mathcal{T}$.

Second-order processes are the natural setting for correlation analysis. The autocorrelation $r_{XX}(t_1, t_2) = \mathbb{E}[X(t_1)X^*(t_2)]$ is well-defined whenever $\mathbb{E}[|X(t)|^2] < \infty$, by the Cauchy-Schwarz inequality.
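Spelling out the Cauchy-Schwarz step:

```latex
|r_{XX}(t_1, t_2)|
  = \big|\mathbb{E}[X(t_1) X^*(t_2)]\big|
  \le \sqrt{\mathbb{E}[|X(t_1)|^2]}\,\sqrt{\mathbb{E}[|X(t_2)|^2]}
  < \infty.
```

So finite second moments at each individual time guarantee that the autocorrelation is finite at every pair of times.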

Quick Check

A Poisson counting process $N(t)$, where $t \geq 0$ and $N(t) \in \{0, 1, 2, \ldots\}$, is classified as:

Continuous-time, continuous-valued

Continuous-time, discrete-valued

Discrete-time, continuous-valued

Discrete-time, discrete-valued

Quick Check

For the random sinusoid $X(t) = A\cos(2\pi \nu_0 t + \Theta)$ with random $A$ and $\Theta$, a sample path is obtained by:

Fixing $t$ and varying $\omega$

Fixing $\omega$ and varying $t$

Averaging over all $\omega$

Historical Note: Kolmogorov and the Foundations of Stochastic Processes

1930s

Andrey Kolmogorov's 1933 monograph Grundbegriffe der Wahrscheinlichkeitsrechnung ("Foundations of Probability Theory") placed probability on rigorous measure-theoretic ground and, as a byproduct, provided the extension theorem that guarantees the existence of stochastic processes from consistent finite-dimensional distributions. This resolved a fundamental question: does a "random function" actually exist as a mathematical object? Kolmogorov showed that it does, provided the fdds are consistent. His framework remains the standard foundation for all of modern probability and stochastic process theory.

Stochastic Process

A family of random variables $\{X(t) : t \in \mathcal{T}\}$ indexed by a parameter set $\mathcal{T}$, typically representing time.

Related: Sample Path, Finite-Dimensional Distributions (fdds)

Sample Path

A single realization of a stochastic process, obtained by fixing the outcome $\omega$ and viewing the process as a deterministic function of time.

Related: Stochastic Process

Finite-Dimensional Distributions (fdds)

The collection of all joint distributions $F_{t_1, \ldots, t_N}$ of the process evaluated at finitely many time indices. The fdds completely characterize the process.

Related: Stochastic Process

Common Mistake: A Process Is Not Just a Sequence of Random Variables

Mistake:

Treating a stochastic process as merely a collection of independent random variables, ignoring the dependence structure across time.

Correction:

The joint behavior across time indices, captured by the fdds, is the essence of a stochastic process. A process with independent time samples (i.i.d.) is a very special case. In general, $X(t_1)$ and $X(t_2)$ are correlated, and this correlation is what makes the process useful for modeling physical phenomena like fading channels.

Why This Matters: The Wireless Channel as a Stochastic Process

A wireless channel gain $H(t)$ between a transmitter and receiver is a stochastic process: at each time $t$, the gain is a complex random variable whose realization depends on the scattering environment and the relative motion of transmitter and receiver. A single "channel trace" measured with a channel sounder is one sample path. The statistical description of $H(t)$, through its fdds and, more practically, its autocorrelation function, determines the coherence time and shapes the design of pilot spacing, coding, and feedback protocols.

See full treatment in Chapter 14

Key Takeaway

A stochastic process $\{X(t) : t \in \mathcal{T}\}$ is a random function of time, completely characterized by its finite-dimensional distributions. For engineering purposes, we almost always work with the first- and second-order statistics (mean and autocorrelation), which motivates the WSS framework developed in the following sections.