Functions of Two Random Variables

Why Functions of Multiple RVs?

In Chapter 6 we found the distribution of $Y = g(X)$ for a single RV. Now we need the distribution of functions of two or more RVs: the sum of a signal and noise ($Z = X + W$), the ratio of signal power to interference ($\text{SINR} = S / I$), or more general transformations. Two techniques dominate: the Jacobian method for invertible transformations and the convolution formula for sums of independent RVs.

Theorem: Jacobian Transformation Formula

Let $(X, Y)$ be jointly continuous with PDF $f_{X,Y}$ and let $g : \mathbb{R}^2 \to \mathbb{R}^2$ be a one-to-one (invertible) transformation with inverse $g^{-1}(u, v) = (h_1(u,v), h_2(u,v))$. Define $(U, V) = g(X, Y)$. Then the joint PDF of $(U, V)$ is

$$f_{U,V}(u, v) = f_{X,Y}\bigl(h_1(u,v),\, h_2(u,v)\bigr) \cdot |J_{g^{-1}}(u,v)|,$$

where the Jacobian is

$$J_{g^{-1}}(u,v) = \det\begin{pmatrix} \dfrac{\partial h_1}{\partial u} & \dfrac{\partial h_1}{\partial v} \\[6pt] \dfrac{\partial h_2}{\partial u} & \dfrac{\partial h_2}{\partial v} \end{pmatrix}.$$

The Jacobian accounts for the stretching and compression of area elements under the transformation. The absolute value ensures the density remains non-negative regardless of the orientation of the mapping.

Jacobian Method β€” Step by Step

Input: Joint PDF f_{X,Y}, transformation (U,V) = g(X,Y)
1. If g maps R^2 to R (one output), introduce an auxiliary
variable V = Y (or V = X) to make the map 2-to-2.
2. Find the inverse: (X,Y) = g^{-1}(U,V) = (h_1(U,V), h_2(U,V)).
3. Compute the Jacobian matrix of g^{-1} and its determinant J.
4. Write f_{U,V}(u,v) = f_{X,Y}(h_1(u,v), h_2(u,v)) * |J|.
5. If an auxiliary variable was introduced in step 1,
marginalize: f_U(u) = integral of f_{U,V}(u,v) dv.
Output: f_U(u) (or the joint f_{U,V})
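The recipe above can be run symbolically. The following sketch (assuming SymPy is available; variable names are illustrative) applies steps 1–5 to $U = X + Y$ for two independent $\text{Exp}(\lambda)$ inputs, using the auxiliary variable $V = Y$:

```python
import sympy as sp

u, v, lam = sp.symbols('u v lam', positive=True)

# Step 1: U = X + Y maps R^2 to R, so introduce the auxiliary V = Y.
# Step 2: invert the map -- X = U - V, Y = V.
h1, h2 = u - v, v

# Step 3: Jacobian of the inverse map (here it equals 1).
J = sp.Matrix([[sp.diff(h1, u), sp.diff(h1, v)],
               [sp.diff(h2, u), sp.diff(h2, v)]]).det()

# Step 4: joint density of (U, V) for independent Exp(lam) inputs,
# valid on the region x = u - v >= 0, y = v >= 0.
fX = lam * sp.exp(-lam * h1)
fY = lam * sp.exp(-lam * h2)
fUV = fX * fY * sp.Abs(J)

# Step 5: marginalize V; the support restricts v to [0, u].
fU = sp.simplify(sp.integrate(fUV, (v, 0, u)))
print(fU)   # equals lam^2 * u * exp(-lam*u), the Erlang(2, lam) density
```

The result agrees with the convolution computation for the sum of two independent exponentials later in this section.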

Example: Rayleigh Distribution from Two Gaussians

Let $X, Y$ be independent $\mathcal{N}(0, \sigma^2)$. Define the polar coordinates $R = \sqrt{X^2 + Y^2}$ and $\Theta = \operatorname{atan2}(Y, X)$, the two-argument arctangent that returns the full angle in $(-\pi, \pi]$. Find the joint PDF of $(R, \Theta)$ and the marginal PDF of $R$.
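A sketch of the solution, following the Jacobian recipe with the inverse map $x = r\cos\theta$, $y = r\sin\theta$ and the angle taken in $(-\pi, \pi]$:

```latex
\begin{aligned}
|J| &= \left|\det\begin{pmatrix} \cos\theta & -r\sin\theta \\ \sin\theta & r\cos\theta \end{pmatrix}\right| = r,\\[4pt]
f_{R,\Theta}(r,\theta) &= \frac{1}{2\pi\sigma^2}\,e^{-r^2/(2\sigma^2)} \cdot r,
\qquad r \ge 0,\ \theta \in (-\pi, \pi],\\[4pt]
f_R(r) &= \int_{-\pi}^{\pi} f_{R,\Theta}(r,\theta)\,d\theta
        = \frac{r}{\sigma^2}\,e^{-r^2/(2\sigma^2)} \quad \text{(Rayleigh)}.
\end{aligned}
```

Since the joint density factors into a function of $r$ times a constant in $\theta$, it also follows that $\Theta$ is uniform on $(-\pi, \pi]$ and independent of $R$.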

Theorem: Convolution Formula for Independent Sums

Let $X$ and $Y$ be independent continuous RVs with PDFs $f_X$ and $f_Y$. Then $Z = X + Y$ has PDF

$$f_Z(z) = (f_X * f_Y)(z) = \int_{-\infty}^{\infty} f_X(u)\,f_Y(z - u)\,du,$$

where $*$ denotes convolution.

To find the density of the sum at a point $z$, we sweep over all ways of decomposing $z = u + (z - u)$ and weight each decomposition by the product of the individual densities. The convolution integral performs this sweep.
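A minimal numerical illustration of this sweep (assuming NumPy): discretizing two $\text{Uniform}(0,1)$ densities and convolving them recovers the triangular density of their sum, which peaks at $z = 1$ with height 1.

```python
import numpy as np

dx = 0.001
x = np.arange(0, 1, dx)       # grid covering the Uniform(0,1) support
fX = np.ones_like(x)          # Uniform(0,1) density
fY = np.ones_like(x)

# Riemann-sum version of the convolution integral:
# f_Z(z) ~ sum over u of f_X(u) * f_Y(z - u) * dx
fZ = np.convolve(fX, fY) * dx
z = np.arange(len(fZ)) * dx   # grid for z in [0, 2)

peak = z[np.argmax(fZ)]
print(peak, fZ.max())         # peak near z = 1 with density near 1
```

The discrete convolution weights every decomposition $z = u + (z - u)$ exactly as the integral does, up to the grid spacing `dx`.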

Example: Sum of Two Independent Exponentials

Let $X_1 \sim \text{Exp}(\lambda)$ and $X_2 \sim \text{Exp}(\lambda)$ be independent. Find the PDF of $Z = X_1 + X_2$.
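A sketch of the computation: both densities vanish for negative arguments, so the convolution integral runs only over $0 \le u \le z$:

```latex
f_Z(z) = \int_0^z \lambda e^{-\lambda u}\,\lambda e^{-\lambda (z-u)}\,du
       = \lambda^2 e^{-\lambda z} \int_0^z du
       = \lambda^2 z\, e^{-\lambda z}, \qquad z \ge 0,
```

which is the Erlang density (a Gamma density with shape 2 and rate $\lambda$).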

Example: Sum of Independent Gaussians

Let $X \sim \mathcal{N}(\mu_1, \sigma_1^2)$ and $Y \sim \mathcal{N}(\mu_2, \sigma_2^2)$ be independent. Find the distribution of $Z = X + Y$.
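One can grind through the convolution integral by completing the square, but a shorter sketch uses characteristic functions: a Gaussian has $\varphi_X(t) = e^{i\mu t - \sigma^2 t^2/2}$, and independence makes characteristic functions multiply (the transform-domain counterpart of convolution):

```latex
\varphi_Z(t) = \varphi_X(t)\,\varphi_Y(t)
             = e^{i\mu_1 t - \sigma_1^2 t^2/2}\; e^{i\mu_2 t - \sigma_2^2 t^2/2}
             = e^{i(\mu_1 + \mu_2)t - (\sigma_1^2 + \sigma_2^2)t^2/2},
```

which is the characteristic function of $\mathcal{N}(\mu_1 + \mu_2,\ \sigma_1^2 + \sigma_2^2)$, so $Z \sim \mathcal{N}(\mu_1 + \mu_2,\ \sigma_1^2 + \sigma_2^2)$.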

Convolution of Two Densities

Choose two distribution families and watch the convolution integral sweep out the density of their sum. The sliding density $f_Y(z - u)$ is shown in red, and the area under the product gives $f_Z(z)$.


The Jacobian Method for Transformations

Step-by-step visualization of the Jacobian transformation applied to the polar coordinate transformation $(X, Y) \to (R, \Theta)$, showing how area elements are stretched by the factor $|J| = r$.

Convolution

The operation $(f_X * f_Y)(z) = \int f_X(u)\,f_Y(z - u)\,du$ that gives the PDF of the sum of two independent random variables.

Related: Joint probability density function, Independent random variables

Jacobian

The determinant of the matrix of partial derivatives of a transformation's inverse. It measures the local scaling of area (or volume) elements under the transformation.

Related: Convolution

Common Mistake: Forgetting the Jacobian

Mistake:

Writing $f_{U,V}(u,v) = f_{X,Y}(h_1(u,v), h_2(u,v))$ without the Jacobian factor.

Correction:

The Jacobian $|J_{g^{-1}}|$ is mandatory. Without it, the density does not integrate to 1. A useful sanity check: always verify that $\int\!\!\int f_{U,V}(u,v)\,du\,dv = 1$.
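A quick numerical version of this sanity check (a sketch, assuming NumPy), using the polar-coordinate example with $\sigma = 1$: the joint density with the Jacobian factor $r$ integrates to 1, while the version that forgets it does not.

```python
import numpy as np

sigma = 1.0
dr = 0.001
r = np.arange(dr / 2, 8.0, dr)        # radial grid (midpoint rule)
theta_range = 2 * np.pi               # theta contributes a constant factor

# Gaussian factor f_{X,Y} evaluated in polar coordinates
gauss = np.exp(-r**2 / (2 * sigma**2)) / (2 * np.pi * sigma**2)

with_jacobian = np.sum(gauss * r) * dr * theta_range     # correct density
without_jacobian = np.sum(gauss) * dr * theta_range      # Jacobian forgotten

print(round(with_jacobian, 3))        # 1.0
print(round(without_jacobian, 3))     # about 1.253, not 1
```

Forgetting $|J| = r$ leaves an integral proportional to $\int_0^\infty e^{-r^2/2}\,dr = \sqrt{\pi/2} \approx 1.2533$, which immediately fails the normalization check.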

🔧 Engineering Note

Computing Convolutions via FFT

In numerical simulations, evaluating the convolution integral directly is $O(N^2)$ for $N$ sample points. A much faster approach exploits the convolution theorem: $f_Z = f_X * f_Y$ corresponds to pointwise multiplication in the frequency domain. Using the FFT, one computes $\mathcal{F}\{f_X\} \cdot \mathcal{F}\{f_Y\}$ and then inverse-transforms, achieving $O(N \log N)$ complexity. This is the standard technique for computing the distribution of sums in signal processing and communications simulation.
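A minimal sketch of the FFT approach (assuming NumPy), recovering the Erlang density of the sum of two $\text{Exp}(1)$ variables. Zero-padding to at least $2N - 1$ points turns the FFT's circular convolution into the linear convolution we want:

```python
import numpy as np

lam, dx, N = 1.0, 0.001, 20000
x = np.arange(N) * dx
fX = lam * np.exp(-lam * x)      # Exp(lam) density sampled on the grid
fY = fX.copy()

# Convolution theorem: pad to >= 2N - 1, multiply spectra, invert.
L = 2 * N
fZ = np.fft.irfft(np.fft.rfft(fX, L) * np.fft.rfft(fY, L), L)[:N] * dx

exact = lam**2 * x * np.exp(-lam * x)   # Erlang(2, lam) density
print(np.max(np.abs(fZ - exact)))       # small discretization error, O(dx)
```

The `* dx` factor converts the discrete sum into a Riemann approximation of the convolution integral; halving `dx` (with a correspondingly larger `N`) roughly halves the error.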

Key Takeaway

The Jacobian transformation formula gives the joint density of any invertible function of two RVs. For sums of independent RVs, the density is the convolution of the individual densities: $f_{X+Y} = f_X * f_Y$. The Gaussian family is closed under convolution: sums of independent Gaussians are Gaussian.