The Poisson Process

Why the Poisson Process?

In Chapter 17 we modeled systems that evolve at discrete time steps. Many real systems — packet arrivals at a router, call arrivals at a base station, photon detections in an optical receiver — evolve in continuous time, with events occurring at random instants. The simplest and most fundamental model for such random arrivals is the Poisson process.

What makes the Poisson process so central is that it arises from minimal assumptions: events occur one at a time, the rate of occurrence is constant, and the process has no memory. These three properties alone determine the entire probabilistic structure. The Poisson process is to continuous-time stochastic modeling what the Bernoulli process is to discrete time.

Definition: Counting Process

A counting process $\{N(t) : t \geq 0\}$ is a stochastic process taking values in $\{0, 1, 2, \ldots\}$ with:

  1. $N(0) = 0$.
  2. $N(s) \leq N(t)$ for $s \leq t$ (non-decreasing).
  3. For $s < t$, the increment $N(t) - N(s)$ counts the number of events in the interval $(s, t]$.

The sample path $t \mapsto N(t)$ is a right-continuous step function that jumps by 1 at each event time. The inter-event times $T_k = S_k - S_{k-1}$ (where $S_k$ is the $k$-th event time) fully determine the sample path.
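The definition can be made concrete in code: given sorted event times, $N(t)$ is simply the number of event times at or before $t$. A minimal sketch (the function name and sample times are illustrative, not from the text):

```python
import bisect

def count_up_to(arrival_times, t):
    """N(t): the number of event times S_k with S_k <= t.

    arrival_times must be sorted in increasing order; bisect_right
    makes the count right-continuous in t, matching the definition.
    """
    return bisect.bisect_right(arrival_times, t)

S = [0.4, 1.1, 1.15, 2.7]       # hypothetical event times
print(count_up_to(S, 1.1))      # 2: the jump at S_2 = 1.1 is included
print(count_up_to(S, 2.0))      # 3
print(count_up_to(S, 0.0))      # 0, consistent with N(0) = 0
```

Using `bisect_right` rather than `bisect_left` is what makes the step function right-continuous: an event at exactly time $t$ is counted in $N(t)$.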

Definition: Poisson Process

A counting process $\{N(t) : t \geq 0\}$ is a Poisson process with rate $\lambda > 0$ if:

  1. Independent increments: For any $0 \leq t_1 < t_2 < \cdots < t_n$, the increments $N(t_2)-N(t_1), N(t_3)-N(t_2), \ldots, N(t_n)-N(t_{n-1})$ are mutually independent.
  2. Stationary increments: The distribution of $N(t+s) - N(s)$ depends only on $t$, not on $s$.
  3. Orderliness: $\mathbb{P}(N(h) = 1) = \lambda h + o(h)$ and $\mathbb{P}(N(h) \geq 2) = o(h)$ as $h \to 0$.

Equivalently, $N(t) - N(s) \sim \text{Poisson}(\lambda(t-s))$ for all $0 \leq s < t$.

The three defining properties capture: (1) what happens in non-overlapping intervals is independent, (2) the statistics are time-invariant, and (3) events arrive one at a time at a well-defined rate. The equivalence with the Poisson distribution is the content of the next theorem.


Theorem: Distribution of Poisson Counting Process

If $\{N(t)\}$ is a Poisson process with rate $\lambda > 0$, then for all $t > 0$:

$$N(t) \sim \text{Poisson}(\lambda t), \qquad \mathbb{P}(N(t) = k) = \frac{(\lambda t)^k e^{-\lambda t}}{k!}, \quad k = 0, 1, 2, \ldots$$

In particular, $\mathbb{E}[N(t)] = \lambda t$ and $\text{Var}(N(t)) = \lambda t$.

Proof sketch: Divide $[0, t]$ into $n$ tiny intervals of length $h = t/n$. By orderliness, each interval contains one event with probability approximately $\lambda h$ (and two or more with probability $o(h)$), and by independent increments the intervals are independent. The number of occupied intervals is therefore approximately Binomial$(n, \lambda t/n)$, which converges to Poisson$(\lambda t)$ as $n \to \infty$; this is the classical Poisson limit theorem.
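The binomial-to-Poisson limit in this sketch is easy to check numerically. A short sketch (the rate $\lambda = 2$ and horizon $t = 1.5$ are arbitrary illustrative choices):

```python
from math import comb, exp, factorial

def binom_pmf(n, p, k):
    """P(Binomial(n, p) = k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(mu, k):
    """P(Poisson(mu) = k)."""
    return mu**k * exp(-mu) / factorial(k)

lam, t = 2.0, 1.5        # illustrative rate and horizon
mu = lam * t             # the limiting Poisson mean

# As n grows with p = mu/n, the Binomial pmf approaches the Poisson pmf.
for n in (10, 100, 10000):
    gap = max(abs(binom_pmf(n, mu / n, k) - poisson_pmf(mu, k))
              for k in range(10))
    print(f"n = {n:5d}: max pmf gap = {gap:.2e}")
```

The printed gap shrinks roughly like $1/n$, which is the quantitative content of the Poisson limit theorem.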


Theorem: Inter-Arrival Times Are Exponential

Let $S_1, S_2, \ldots$ be the successive arrival times of a Poisson process with rate $\lambda$, and let $T_k = S_k - S_{k-1}$ (with $S_0 = 0$) be the inter-arrival times. Then $T_1, T_2, \ldots$ are i.i.d. $\text{Exp}(\lambda)$ random variables:

$$\mathbb{P}(T_k > t) = e^{-\lambda t}, \quad t \geq 0, \quad k = 1, 2, \ldots$$

Conversely, if $\{T_k\}$ are i.i.d. $\text{Exp}(\lambda)$, then the counting process $N(t) = \max\{n : S_n \leq t\}$ is a Poisson process with rate $\lambda$.

The memoryless property of the exponential distribution is the continuous-time analog of the memoryless property of the geometric distribution. Just as the geometric inter-arrival times define a Bernoulli process, exponential inter-arrival times define a Poisson process.
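The converse direction gives a simulation recipe: draw i.i.d. $\text{Exp}(\lambda)$ gaps and accumulate them into arrival times. A minimal sketch (rate, horizon, and seed are arbitrary illustrative values) that also checks $\mathbb{E}[N(t)] = \text{Var}(N(t)) = \lambda t$:

```python
import random

def simulate_arrivals(lam, t_max, rng):
    """Arrival times on [0, t_max] built from i.i.d. Exp(lam) gaps."""
    times, s = [], 0.0
    while True:
        s += rng.expovariate(lam)   # memoryless inter-arrival gap
        if s > t_max:
            return times
        times.append(s)

rng = random.Random(0)
lam, t_max, runs = 3.0, 10.0, 2000
counts = [len(simulate_arrivals(lam, t_max, rng)) for _ in range(runs)]
mean = sum(counts) / runs
var = sum((c - mean) ** 2 for c in counts) / runs
print(f"mean = {mean:.2f}, var = {var:.2f}, lam*t = {lam * t_max:.1f}")
```

Both the empirical mean and the empirical variance of $N(t)$ should land near $\lambda t = 30$, the signature equality of the Poisson distribution.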


Example: Packet Arrivals at a Router

Packets arrive at a router port according to a Poisson process with rate $\lambda = 500$ packets/second. Find: (a) The probability that exactly 3 packets arrive in a 10 ms interval. (b) The expected time until the 100th packet arrives. (c) The probability that no packets arrive in a 5 ms interval.
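The solution follows directly from the theorems above: in 10 ms the count is Poisson$(500 \times 0.010) = $ Poisson$(5)$; the 100th arrival time is a sum of 100 i.i.d. Exp$(500)$ gaps, so its mean is $100/500 = 0.2$ s; and in 5 ms the count is Poisson$(2.5)$. A short numerical check:

```python
from math import exp, factorial

lam = 500.0                    # packets per second

# (a) exactly 3 packets in 10 ms: N ~ Poisson(500 * 0.010) = Poisson(5)
mu_a = lam * 0.010
p_a = mu_a**3 * exp(-mu_a) / factorial(3)

# (b) 100th arrival: sum of 100 Exp(500) gaps, so mean 100 / 500 = 0.2 s
e_s100 = 100 / lam

# (c) no packets in 5 ms: P(N = 0) = exp(-500 * 0.005) = exp(-2.5)
p_c = exp(-lam * 0.005)

print(f"(a) {p_a:.4f}   (b) {e_s100:.3f} s   (c) {p_c:.4f}")
# (a) 0.1404   (b) 0.200 s   (c) 0.0821
```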

Example: Superposition and Thinning

(a) Superposition: Two independent Poisson processes with rates $\lambda_1 = 3$ and $\lambda_2 = 7$ are merged. What is the resulting process? (b) Thinning: Each arrival of a Poisson process with rate $\lambda = 10$ is independently classified as type A with probability $p = 0.4$ or type B with probability $1-p = 0.6$. What are the resulting sub-processes?
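Both answers can be checked by simulation: the merged stream should show an empirical rate near $\lambda_1 + \lambda_2 = 10$, and the type-A sub-stream a rate near $p\lambda = 4$. A sketch (horizon and seed are arbitrary illustrative values):

```python
import random

rng = random.Random(1)
t_max = 500.0                  # illustrative simulation horizon

def arrivals(lam):
    """Arrival times on [0, t_max] from i.i.d. Exp(lam) gaps."""
    times, s = [], 0.0
    while True:
        s += rng.expovariate(lam)
        if s > t_max:
            return times
        times.append(s)

# (a) Superposition: merging independent rate-3 and rate-7 streams
#     gives a Poisson stream of rate 3 + 7 = 10.
merged = sorted(arrivals(3.0) + arrivals(7.0))
print(f"merged rate ~ {len(merged) / t_max:.2f}  (theory: 10)")

# (b) Thinning: keep each rate-10 arrival as type A with prob. 0.4,
#     yielding a Poisson stream of rate 10 * 0.4 = 4.
type_a = [s for s in arrivals(10.0) if rng.random() < 0.4]
print(f"type-A rate ~ {len(type_a) / t_max:.2f}  (theory: 4)")
```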

Poisson Process: Arrivals and Counting Process

Simulate a Poisson process: see the arrival times on a timeline, the staircase counting function $N(t)$, and the empirical inter-arrival distribution compared to $\text{Exp}(\lambda)$.


The Poisson Process: Arrivals, Counting, and Inter-Arrival Times

[Figure: Poisson process with rate $\lambda$: arrivals on the timeline (top), counting process $N(t)$ (middle), and exponential inter-arrival density (bottom).]

Quick Check

If $\{N(t)\}$ is a Poisson process with rate $\lambda = 4$ events/second, what is $\mathbb{P}(N(2) - N(1) = 0)$?

$e^{-8}$

$e^{-4}$

$1 - e^{-4}$

$4 e^{-4}$

Common Mistake: Poisson vs. Binomial: When the Approximation Breaks

Mistake:

Treating a Poisson model as exact when the number of potential sources is small and each source contributes events with non-negligible probability. For example, modeling 5 users each transmitting with probability 0.3 per slot as Poisson(1.5).

Correction:

The Poisson approximation to the Binomial is accurate when $n$ is large and $p$ is small, with $np = \lambda$ moderate. For $n = 5$, $p = 0.3$, the Binomial$(5, 0.3)$ distribution differs noticeably from Poisson$(1.5)$: $\mathbb{P}(\text{Bin} = 0) = 0.168$ vs. $\mathbb{P}(\text{Poi} = 0) = 0.223$. Use the exact Binomial when $n$ is small.
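The numbers quoted above are easy to reproduce. A quick sketch comparing the two pmfs at small $n$:

```python
from math import comb, exp, factorial

n, p = 5, 0.3
mu = n * p                     # 1.5, the matched Poisson mean

# At n = 5 the Poisson approximation visibly misses the Binomial:
# e.g. at k = 0, Binomial gives 0.168 while Poisson gives 0.223.
for k in range(4):
    b = comb(n, k) * p**k * (1 - p)**(n - k)
    q = mu**k * exp(-mu) / factorial(k)
    print(f"k = {k}: Binomial = {b:.3f}   Poisson = {q:.3f}")
```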

Historical Note: Siméon-Denis Poisson and the Law of Small Numbers

19th century

The Poisson distribution first appeared in Siméon-Denis Poisson's 1837 treatise Recherches sur la probabilité des jugements en matière criminelle et en matière civile. Poisson derived it as the limit of the binomial distribution when the number of trials is large and the success probability is small. The name "law of small numbers" (later popularized by von Bortkiewicz, who applied it to Prussian cavalry deaths by horse kicks) captures the key insight: rare events in many trials follow a universal pattern. Today, the Poisson process is the default model for random arrivals in telecommunications, queueing theory, and particle physics.

Counting process

A non-decreasing integer-valued stochastic process $\{N(t) : t \geq 0\}$ with $N(0) = 0$ that counts the number of events occurring up to time $t$.

Poisson process

A counting process with independent increments, stationary increments, and orderliness (at most one event per infinitesimal interval). Equivalently, $N(t) - N(s) \sim \text{Poisson}(\lambda(t-s))$ for all $s < t$.

Related: Counting process

Inter-arrival time

The time $T_k = S_k - S_{k-1}$ between the $(k-1)$-th and $k$-th events in a point process. For a Poisson process with rate $\lambda$, the inter-arrival times are i.i.d. $\text{Exp}(\lambda)$.

Related: Poisson process

Key Takeaway

The Poisson process is completely characterized by a single parameter $\lambda$: arrivals are independent, the count in any interval of length $t$ is Poisson$(\lambda t)$, and inter-arrival times are i.i.d. Exp$(\lambda)$. Superposition adds rates; thinning multiplies by the retention probability. This makes the Poisson process the universal building block for continuous-time traffic models.