Exercises

ex-ch11-01

Easy

Let $X_n = Y/n$ where $Y \sim \text{Exp}(1)$. Show that $X_n \xrightarrow{\text{a.s.}} 0$, $X_n \xrightarrow{P} 0$, and $X_n \xrightarrow{L^r} 0$ for all $r \geq 1$.

ex-ch11-02

Easy

Let $X_1, X_2, \ldots$ be i.i.d. $\text{Uniform}[0, 1]$. Use the WLLN to show that $\bar{X}_n \xrightarrow{P} 1/2$.
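The WLLN statement can be checked empirically. A minimal simulation sketch (not part of the proof), assuming Python with NumPy; the function name `sample_mean` is illustrative:

```python
import numpy as np

# Empirical check of the WLLN: the sample mean of Uniform[0,1] draws
# concentrates around 1/2 as n grows.
rng = np.random.default_rng(0)

def sample_mean(n, rng):
    """Mean of n i.i.d. Uniform[0,1] draws."""
    return rng.uniform(0.0, 1.0, size=n).mean()

for n in (10, 1_000, 100_000):
    print(n, sample_mean(n, rng))
```

The printed means should cluster ever more tightly around $1/2$, consistent with convergence in probability.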

ex-ch11-03

Easy

If $X_n \xrightarrow{L^2} X$, show that $\mathbb{E}[X_n] \to \mathbb{E}[X]$ and $\mathbb{E}[X_n^2] \to \mathbb{E}[X^2]$.

ex-ch11-04

Medium

Let $X_1, X_2, \ldots$ be i.i.d. with $\mathbb{P}(X_i = 1) = p$ and $\mathbb{P}(X_i = 0) = 1 - p$. Using the CLT, find an approximate expression for

$$\mathbb{P}\!\left(\sum_{i=1}^n X_i \leq k\right)$$

in terms of the $\Phi$ function. How many samples $n$ are needed so that $\mathbb{P}(|\bar{X}_n - p| \leq 0.01) \geq 0.99$ when $p = 0.5$?
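The sample-size part can be worked out numerically. A sketch using only the Python standard library, under the CLT approximation $\mathbb{P}(|\bar{X}_n - p| \leq \varepsilon) \approx 2\Phi(\varepsilon\sqrt{n}/\sigma) - 1$ with $\sigma = \sqrt{p(1-p)}$:

```python
from math import ceil
from statistics import NormalDist

# CLT sample-size calculation for p = 0.5 (the worst-case variance).
p, eps, level = 0.5, 0.01, 0.99
z = NormalDist().inv_cdf((1 + level) / 2)   # z_{0.995}, roughly 2.576
sigma = (p * (1 - p)) ** 0.5                # sqrt(p(1-p)) = 0.5
n = ceil((z * sigma / eps) ** 2)            # smallest n meeting the bound
print(n)
```

Solving $2\Phi(0.01\sqrt{n}/0.5) - 1 \geq 0.99$ gives $\sqrt{n} \geq 2.576 \cdot 50$, i.e. roughly $n \approx 16{,}588$.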

ex-ch11-05

Medium

Prove that convergence in probability implies the existence of a subsequence that converges almost surely. That is, if $X_n \xrightarrow{P} X$, then there exists a subsequence $\{X_{n_k}\}$ with $X_{n_k} \xrightarrow{\text{a.s.}} X$.

ex-ch11-06

Medium

Let $X_1, X_2, \ldots$ be i.i.d. $\text{Exp}(\lambda)$ with $\lambda = 1$. Use the CLT to approximate $\mathbb{P}(S_{100} > 110)$ where $S_n = \sum_{i=1}^n X_i$. Compare with the exact gamma CDF.
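Both numbers in this comparison can be computed with the standard library alone. A sketch, using the identity $\mathbb{P}(\text{Gamma}(n, 1) > x) = \mathbb{P}(\text{Poisson}(x) \leq n - 1)$ for integer shape $n$ to get the exact tail without special-function libraries:

```python
from math import exp
from statistics import NormalDist

# CLT approximation vs. exact value for P(S_100 > 110), where
# S_n ~ Gamma(n, 1) is a sum of n i.i.d. Exp(1) variables.
n, x = 100, 110.0

# CLT: (S_n - n)/sqrt(n) is approximately N(0, 1), so the tail is 1 - Phi(1).
clt = 1 - NormalDist().cdf((x - n) / n ** 0.5)

# Exact: P(Gamma(n,1) > x) = sum_{k=0}^{n-1} e^{-x} x^k / k!,
# accumulated iteratively to stay numerically stable.
term, exact = exp(-x), 0.0
for k in range(n):
    exact += term
    term *= x / (k + 1)

print(round(clt, 4), round(exact, 4))
```

The CLT value is $1 - \Phi(1) \approx 0.1587$; the exact gamma tail comes out close to it, which is the point of the comparison.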

ex-ch11-07

Medium

Let $X_1, \ldots, X_n$ be i.i.d. $\text{Uniform}[0, \theta]$ with $\theta > 0$ unknown. The MLE is $\hat{\theta}_n = \max(X_1, \ldots, X_n)$. Show that $n(\theta - \hat{\theta}_n) \xrightarrow{d} \text{Exp}(1/\theta)$. Why doesn't the CLT apply here?
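The non-Gaussian limit can be seen in simulation. A sketch assuming Python with NumPy; the choice $\theta = 2$ and the sample sizes are illustrative:

```python
import numpy as np

# Simulating the limit n*(theta - max(X_i)) -> Exp(rate 1/theta):
# the rescaled error should have mean approximately theta (mean of the
# exponential limit), not the sqrt(n) scaling of a CLT-type limit.
rng = np.random.default_rng(1)
theta, n, reps = 2.0, 200, 10_000

x = rng.uniform(0.0, theta, size=(reps, n))
err = n * (theta - x.max(axis=1))   # one draw of n*(theta - theta_hat) per row

print(err.mean())   # close to theta = 2
```

Note the rescaling is by $n$, not $\sqrt{n}$: $\hat{\theta}_n$ sits on the boundary of the support, so the max converges at rate $1/n$ and the CLT machinery does not apply.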

ex-ch11-08

Medium

(Delta method) Let $X_1, \ldots, X_n$ be i.i.d. with mean $\mu > 0$ and variance $\sigma^2$. Find the asymptotic distribution of $g(\bar{X}_n) = \log(\bar{X}_n)$.

ex-ch11-09

Hard

(Lindeberg CLT) Let $X_1, X_2, \ldots$ be independent (not necessarily identically distributed) with $\mathbb{E}[X_i] = 0$, $\text{Var}(X_i) = \sigma_i^2$, and $s_n^2 = \sum_{i=1}^n \sigma_i^2$. State the Lindeberg condition and show that it is sufficient for

$$\frac{S_n}{s_n} \xrightarrow{d} \mathcal{N}(0, 1).$$

ex-ch11-10

Hard

Let $X_1, X_2, \ldots$ be i.i.d. with mean $\mu$ and variance $\sigma^2$. Define $Y_n = (\bar{X}_n)^2$. Using the delta method, find the asymptotic distribution of $Y_n$ and compute the asymptotic relative efficiency of $Y_n$ as an estimator of $\mu^2$.

ex-ch11-11

Hard

(Slutsky application) Let $X_1, \ldots, X_n$ be i.i.d. with $\mathbb{E}[X_1^4] < \infty$. Define $\hat{\sigma}_n^2 = \frac{1}{n}\sum_{i=1}^n (X_i - \bar{X}_n)^2$. Show that the "studentized" statistic $T_n = \sqrt{n}(\bar{X}_n - \mu)/\hat{\sigma}_n$ satisfies $T_n \xrightarrow{d} \mathcal{N}(0, 1)$.

ex-ch11-12

Hard

(Berry-Esseen application) Let $X_1, \ldots, X_n$ be i.i.d. $\text{Bernoulli}(p)$. Using the Berry-Esseen theorem, find an upper bound on the error $|\mathbb{P}(\bar{X}_n \leq p + \delta) - \Phi(\delta\sqrt{n}/\sqrt{p(1-p)})|$. Evaluate for $p = 0.5$, $n = 100$, $\delta = 0.05$.
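The requested evaluation can be done with the standard library. A sketch of the bound $C\rho/(\sigma^3\sqrt{n})$ with $\rho = \mathbb{E}|X_1 - p|^3$; the universal constant is taken as $C = 0.4748$ (a known valid value, though the exercise's chapter may use a different $C$ such as $0.5$ or $0.8$):

```python
from statistics import NormalDist

# Berry-Esseen bound for Bernoulli(p) at p = 0.5, n = 100, delta = 0.05.
p, n, delta = 0.5, 100, 0.05

sigma = (p * (1 - p)) ** 0.5                 # sqrt(p(1-p)) = 0.5
rho = p * (1 - p) ** 3 + (1 - p) * p ** 3    # E|X - p|^3 = 0.125
C = 0.4748                                   # assumed universal constant
bound = C * rho / (sigma ** 3 * n ** 0.5)

phi_val = NormalDist().cdf(delta * n ** 0.5 / sigma)   # Phi(delta*sqrt(n)/sigma) = Phi(1)
print(round(bound, 5), round(phi_val, 4))
```

At $p = 0.5$ the ratio $\rho/\sigma^3$ equals $1$, so the bound reduces to $C/\sqrt{n} \approx 0.0475$, while the normal approximation itself is $\Phi(1) \approx 0.8413$.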


ex-ch11-13

Challenge

(Multivariate delta method) Let $\mathbf{X}_1, \ldots, \mathbf{X}_n$ be i.i.d. in $\mathbb{R}^2$ with $\mathbb{E}[\mathbf{X}_i] = (\mu_1, \mu_2)^\mathsf{T}$ and covariance matrix $\boldsymbol{\Sigma}$. Consider the ratio statistic $R_n = \bar{X}_{n,1}/\bar{X}_{n,2}$. Find its asymptotic distribution.

ex-ch11-14

Easy

Show that if $X_n \xrightarrow{d} c$ where $c$ is a constant, then $X_n \xrightarrow{P} c$.

ex-ch11-15

Medium

(Monte Carlo integration) To compute $I = \int_0^1 e^{x^2}\,dx$, we generate i.i.d. $U_1, \ldots, U_n \sim \text{Uniform}[0,1]$ and estimate $\hat{I}_n = \frac{1}{n}\sum_{i=1}^n e^{U_i^2}$.

(a) Justify why $\hat{I}_n \to I$ and identify the mode of convergence. (b) Use the CLT to find an approximate 95% confidence interval for $I$ when $n = 10{,}000$.
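Part (b) can be carried out directly. A sketch assuming Python with NumPy; the seed is arbitrary, and the standard error is estimated from the same sample, as the CLT-based interval prescribes:

```python
import numpy as np

# Monte Carlo estimate of I = integral of e^{x^2} over [0,1] (true value
# is about 1.4627), with a CLT-based 95% confidence interval.
rng = np.random.default_rng(42)
n = 10_000

y = np.exp(rng.uniform(0.0, 1.0, size=n) ** 2)
i_hat = y.mean()                       # the estimator I_hat_n
se = y.std(ddof=1) / n ** 0.5          # estimated standard error of the mean
ci = (i_hat - 1.96 * se, i_hat + 1.96 * se)

print(i_hat, ci)
```

The interval half-width is roughly $1.96 \cdot \hat{\sigma}/\sqrt{n} \approx 0.008$ here, illustrating the $1/\sqrt{n}$ rate of Monte Carlo error.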

ex-ch11-16

Challenge

(Convergence of moments under CLT) Let $X_1, X_2, \ldots$ be i.i.d. with $\mathbb{E}[|X_1|^{2+\delta}] < \infty$ for some $\delta > 0$. Let $Z_n = \sqrt{n}(\bar{X}_n - \mu)/\sigma$. Show that $\mathbb{E}[Z_n^2] \to 1$. Does $Z_n \xrightarrow{L^2} Z \sim \mathcal{N}(0,1)$?