Problems on Convergence in Probability - Solutions
1. Show that if $X_n \to c$ in probability, where $c$ is a constant, then $X_n \to c$ in
distribution.
If $X_n$ converges in probability to the constant $c$, then for any $\varepsilon>0$,
$P(|X_n-c|>\varepsilon)\to0$. Every $x\ne c$ is a continuity point of the limit distribution function. If $x<c$, then $F_{X_n}(x)=P(X_n\le x)\le P(|X_n-c|\ge c-x)\to0$; if $x>c$, then $F_{X_n}(x)\ge P(|X_n-c|\le x-c)\to1$. Hence the distribution functions converge at every continuity point to the degenerate distribution at $c$, so $X_n\to c$ in distribution.
2. Let $X_n$ be random variables such that $X_n \to X$ almost surely. Show that $X_n \to X$ in
probability.
If $X_n\to X$ a.s., then $P(\lim_{n\to\infty}|X_n-X|=0)=1$. Fix $\varepsilon>0$ and let $A_n=\{\sup_{m\ge n}|X_m-X|>\varepsilon\}$. The events $A_n$ decrease, and a.s. convergence gives $P(\bigcap_n A_n)=0$, so by continuity of measure $P(A_n)\to0$. Since $\{|X_n-X|>\varepsilon\}\subseteq A_n$, it follows that $P(|X_n-X|>\varepsilon)\to0$. Thus convergence in probability holds.
3. Give an example where $X_n \to X$ in probability but not almost surely.
Let $X_n$ be independent with $P(X_n=1)=1/n$ and $P(X_n=0)=1-1/n$. Then for any $\varepsilon\in(0,1)$, $P(|X_n-0|>\varepsilon)=1/n\to0$, so $X_n\to0$ in probability. But $\sum_n 1/n=\infty$, so by the second Borel-Cantelli lemma (which applies here because the $X_n$ are independent) we have $X_n=1$ for infinitely many $n$ with probability 1; hence $X_n$ does not converge to 0 almost surely.
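The counterexample can be checked numerically without simulation (a sketch in Python/NumPy; the variable names $K$, $N$ are our choices): for independent $X_n$ with $P(X_n=1)=1/n$, the probability of seeing no 1 among $n=K+1,\dots,N$ is $\prod_{n=K+1}^{N}(1-1/n)$, which telescopes to $K/N$ and vanishes as $N\to\infty$, matching the second Borel-Cantelli conclusion that ones keep occurring.

```python
import numpy as np

# Independent X_n with P(X_n = 1) = 1/n.  The probability of NO one among
# n = K+1, ..., N is prod_{n=K+1}^{N} (1 - 1/n) = prod (n-1)/n, which
# telescopes to K/N -- so for fixed K it tends to 0 as N grows.
K, N = 1_000, 100_000
p_no_late_one = float(np.prod(1.0 - 1.0 / np.arange(K + 1, N + 1)))
print(p_no_late_one)  # close to K/N = 0.01
```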
4. Suppose $E[|X_n-X|]\to0$. Show that $X_n \to X$ in probability.
Apply Markov's inequality: for any $\varepsilon>0$, $P(|X_n-X|>\varepsilon)\le
E[|X_n-X|]/\varepsilon\to0$. Hence convergence in probability.
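A concrete instance of the Markov bound (the exponential model is our illustrative choice, not part of the problem): take $|X_n-X|\sim\mathrm{Exponential}$ with mean $1/n$, so $E|X_n-X|=1/n\to0$ and the exact tail $P(|X_n-X|>\varepsilon)=e^{-n\varepsilon}$ always sits below the Markov bound $E|X_n-X|/\varepsilon$.

```python
import math

# Illustration: |X_n - X| ~ Exponential(mean 1/n), so E|X_n - X| = 1/n.
# Exact tail exp(-n*eps) vs. the Markov bound (1/n)/eps.
eps = 0.1
rows = []
for n in (10, 100, 1000):
    tail = math.exp(-n * eps)   # exact P(|X_n - X| > eps)
    bound = (1.0 / n) / eps     # Markov bound E|X_n - X| / eps
    rows.append((n, tail, bound))
    print(n, tail, bound)
```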
5. If $X_n \to X$ in probability, show that any subsequence $X_{n_k} \to X$ in probability.
Fix $\varepsilon>0$. The numbers $a_n=P(|X_n-X|>\varepsilon)$ form a sequence of reals converging to 0, and every subsequence of a convergent real sequence converges to the same limit. Hence $P(|X_{n_k}-X|>\varepsilon)=a_{n_k}\to0$, so $X_{n_k}\to X$ in probability.
6. Show that if $X_n \to X$ in distribution and $X$ is a constant $c$, then $X_n \to c$ in
probability.
Convergence in distribution to a constant means $F_{X_n}(x)\to0$ for $x<c$ and $F_{X_n}(x)\to1$ for $x>c$ (every $x\ne c$ is a continuity point of the degenerate limit). For any $\varepsilon>0$,
$P(|X_n-c|>\varepsilon)\le 1-F_{X_n}(c+\varepsilon)+F_{X_n}(c-\varepsilon)\to 1-1+0=0$. Thus convergence in probability.
7. Let $X_n \sim \mathrm{Bernoulli}(p_n)$. Find when $X_n \to 0$ in probability.
We have $P(|X_n-0|>\varepsilon)=P(X_n=1)=p_n$ for every $\varepsilon\in(0,1)$. So $X_n\to0$ in probability iff
$p_n\to0$.
8. Prove Weak Law of Large Numbers: $\overline{X}_n \to \mu$ in probability.
If the $X_i$ are i.i.d. with mean $\mu$ and variance $\sigma^2<\infty$, then $\mathrm{Var}(\overline{X}_n)=\sigma^2/n$. By
Chebyshev's inequality, for any $\varepsilon>0$, $P(|\overline{X}_n-\mu|>\varepsilon)\le
\sigma^2/(n\varepsilon^2)\to0$. Hence $\overline{X}_n\to\mu$ in probability.
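A Monte Carlo sketch of the Chebyshev bound (the Uniform(0,1) model, seed, and trial count are our choices): with $X_i\sim\mathrm{Uniform}(0,1)$ we have $\mu=1/2$ and $\sigma^2=1/12$, and the empirical tail $P(|\overline{X}_n-\mu|>\varepsilon)$ stays below $\sigma^2/(n\varepsilon^2)$.

```python
import numpy as np

# X_i ~ Uniform(0,1): mu = 1/2, sigma^2 = 1/12.  Compare the empirical
# tail P(|Xbar_n - mu| > eps) with the Chebyshev bound sigma^2/(n*eps^2).
rng = np.random.default_rng(1)
mu, var, eps, trials = 0.5, 1.0 / 12.0, 0.05, 2000
results = {}
for n in (100, 1000):
    means = rng.random((trials, n)).mean(axis=1)   # sample means, many paths
    emp = float(np.mean(np.abs(means - mu) > eps)) # empirical tail
    bound = var / (n * eps**2)                     # Chebyshev bound
    results[n] = (emp, bound)
    print(n, emp, bound)
```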
9. If $X_n \to X$ in probability and $g$ is continuous, show $g(X_n) \to g(X)$ in probability.
Fix $\varepsilon,\eta>0$. Choose $M$ with $P(|X|>M)<\eta$. On the compact set $[-M-1,M+1]$, $g$ is uniformly continuous, so there is $\delta\in(0,1)$ such that $|x|\le M$ and $|x-y|<\delta$ imply $|g(x)-g(y)|<\varepsilon$. Then
$P(|g(X_n)-g(X)|>\varepsilon)\le P(|X_n-X|\ge\delta)+P(|X|>M)\le P(|X_n-X|\ge\delta)+\eta$.
The first term tends to 0, and $\eta>0$ was arbitrary, so $P(|g(X_n)-g(X)|>\varepsilon)\to0$, giving the result. (If $g$ is uniformly continuous on all of $\mathbb{R}$, the truncation step is unnecessary.)
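A Monte Carlo sketch of the continuous mapping statement (the normal model, the map $g(x)=x^2$, and the seed are our choices): with $X\sim N(0,1)$ and $X_n=X+Z_n/\sqrt{n}$ we have $X_n\to X$ in probability, and the tail $P(|g(X_n)-g(X)|>0.1)$ shrinks as $n$ grows.

```python
import numpy as np

# X ~ N(0,1), X_n = X + Z_n/sqrt(n) with Z_n ~ N(0,1), so X_n -> X in
# probability.  For the continuous map g(x) = x^2, estimate the tail
# P(|g(X_n) - g(X)| > 0.1) for increasing n.
rng = np.random.default_rng(3)
trials = 100_000
x = rng.standard_normal(trials)
tails = []
for n in (10, 1_000, 1_000_000):
    xn = x + rng.standard_normal(trials) / np.sqrt(n)
    tails.append(float(np.mean(np.abs(xn**2 - x**2) > 0.1)))
print(tails)  # decreasing toward 0
```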
10. Let $X_n \sim N(0, 1/n)$. Show $X_n \to 0$ in probability.
For any $\varepsilon>0$, $P(|X_n|>\varepsilon)=2(1-\Phi(\varepsilon\sqrt{n}))\to0$, since $\varepsilon\sqrt{n}\to\infty$ and the normal tail $1-\Phi(t)\to0$ as $t\to\infty$.
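The tail formula can be evaluated exactly (a sketch using the identity $\Phi(t)=\tfrac12(1+\mathrm{erf}(t/\sqrt2))$; the helper name `phi` is ours): as $\varepsilon\sqrt{n}$ grows from $0.1$ to $10$, the tail $2(1-\Phi(\varepsilon\sqrt{n}))$ collapses toward 0.

```python
import math

def phi(t):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# X_n ~ N(0, 1/n): P(|X_n| > eps) = 2*(1 - Phi(eps*sqrt(n))).
eps = 0.1
tails = [2.0 * (1.0 - phi(eps * math.sqrt(n))) for n in (1, 100, 10_000)]
print(tails)  # eps*sqrt(n) = 0.1, 1.0, 10.0: tails drop toward 0
```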
11. Find $X_n$ such that $X_n \to 0$ in probability but $E[X_n]\not\to0$.
Take $X_n=n$ with probability $1/n$ and $X_n=0$ otherwise. Then for any $\varepsilon>0$, $P(|X_n-0|>\varepsilon)\le1/n\to0$, so $X_n\to0$ in
probability, but $E[X_n]=n\cdot(1/n)=1$ for all $n$, so the expectations do not converge to 0.
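A Monte Carlo sketch of this example (seed and trial count are our choices): at $n=1000$ the sample is almost always 0, yet the empirical mean stays near $E[X_n]=1$.

```python
import numpy as np

# X_n = n with probability 1/n, else 0: tiny tail, but E[X_n] = 1.
rng = np.random.default_rng(2)
n, trials = 1_000, 200_000
x = np.where(rng.random(trials) < 1.0 / n, n, 0)
emp_tail = float(np.mean(x > 0))   # ~ 1/n = 0.001
emp_mean = float(x.mean())         # ~ 1, not 0
print(emp_tail, emp_mean)
```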
12. Show convergence in probability does NOT imply convergence of expectation.
Use the example in (11): convergence in probability holds but expectations remain 1, not 0, so
expectation need not converge.
13. Let $X_n = X + \varepsilon_n$ with $\varepsilon_n \to 0$ in probability. Show $X_n \to X$ in
probability.
Since $X_n-X=\varepsilon_n$ exactly, for any $\varepsilon>0$ we have $P(|X_n-X|>\varepsilon)=P(|\varepsilon_n|>\varepsilon)\to0$, so $X_n\to X$ in
probability.
14. Show $\min(X_n, M) \to \min(X, M)$ in probability for any constant $M$.
The map $x\mapsto\min(x,M)$ is continuous (indeed 1-Lipschitz), so the continuous mapping theorem of Problem 9 applies: since $X_n\to X$ in
probability, $\min(X_n,M)\to\min(X,M)$ in probability.
15. Show convergence in $L^2$ implies convergence in probability.
If $E[(X_n-X)^2]\to0$, then by Chebyshev's inequality (Markov's inequality applied to $(X_n-X)^2$), for every $\varepsilon>0$, $P(|X_n-X|>\varepsilon)\le E[(X_n-X)^2]/\varepsilon^2\to0$.
Thus convergence in probability.
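A concrete instance of the $L^2$ bound (the uniform model is our illustrative choice): take $X_n-X\sim\mathrm{Uniform}(-a_n,a_n)$ with $a_n=1/\sqrt{n}$, so $E[(X_n-X)^2]=a_n^2/3=1/(3n)\to0$; the exact tail $P(|X_n-X|>\varepsilon)=\max(0,1-\varepsilon/a_n)$ always sits below the Chebyshev bound.

```python
# X_n - X ~ Uniform(-a_n, a_n) with a_n = 1/sqrt(n):
# E[(X_n - X)^2] = a_n^2/3, exact tail max(0, 1 - eps/a_n).
eps = 0.05
rows = []
for n in (10, 100, 10_000):
    a = n ** -0.5
    tail = max(0.0, 1.0 - eps / a)    # exact P(|X_n - X| > eps)
    bound = (a * a / 3.0) / eps**2    # Chebyshev bound E[(X_n-X)^2]/eps^2
    rows.append((tail, bound))
    print(n, tail, bound)
```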