# convergence in probability is preserved under continuous transformations

###### Theorem 1.

Let $g\colon\mathbb{R}^{k}\to\mathbb{R}^{l}$ be a continuous function. If $\{X_{n}\}$ are $\mathbb{R}^{k}$-valued random variables converging to $X$ in probability, then $\{g(X_{n})\}$ converges to $g(X)$ in probability as well.

###### Proof.

Suppose first that $g$ is uniformly continuous. Given $\epsilon>0$, there is $\delta>0$ such that $\lVert g(x)-g(y)\rVert<\epsilon$ whenever $\lVert x-y\rVert<\delta$. Consequently, the event $\{\lVert g(X_{n})-g(X)\rVert\geq\epsilon\}$ is contained in the event $\{\lVert X_{n}-X\rVert\geq\delta\}$, and therefore

 $\mathbb{P}\bigl(\lVert g(X_{n})-g(X)\rVert\geq\epsilon\bigr)\leq\mathbb{P}\bigl(\lVert X_{n}-X\rVert\geq\delta\bigr)\to 0$

as $n\to\infty$.

Now suppose $g$ is not necessarily uniformly continuous on $\mathbb{R}^{k}$. It is, however, uniformly continuous on every compact ball $\{x\in\mathbb{R}^{k}\colon\lVert x\rVert\leq m\}$, $m\geq 0$. Consequently, if $X_{n}$ and $X$ are bounded by $m$, the argument just given applies. We therefore reduce the general case to the case of bounded $X_{n}$ and $X$.

Let

 $f_{m}(x)=\begin{cases}x\,,&\lVert x\rVert\leq m\,,\\ mx/\lVert x\rVert\,,&\lVert x\rVert>m\,.\end{cases}$

Clearly, $f_{m}\colon\mathbb{R}^{k}\to\mathbb{R}^{k}$ is continuous; in fact, it can be verified that $f_{m}$ is uniformly continuous on $\mathbb{R}^{k}$. (This is geometrically obvious in the one-dimensional case.)
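As a minimal numerical sketch, the truncation map $f_{m}$ can be written out directly; the function name follows the proof, and the choice of NumPy is mine:

```python
import numpy as np

# Radial truncation: f_m(x) = x if ||x|| <= m, else m * x / ||x||.
def f_m(x, m):
    x = np.asarray(x, dtype=float)
    norm = np.linalg.norm(x)
    if norm <= m:
        return x
    return m * x / norm

# Inside the ball of radius m the map is the identity...
print(f_m([1.0, 2.0], 5))
# ...outside, it projects the point radially onto the sphere of radius m,
# so the image always has norm at most m.
print(np.linalg.norm(f_m([30.0, 40.0], 5)))
```

In particular, $f_{m}(x)$ is always bounded by $m$ in norm, which is exactly what the reduction below needs.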

Set $X_{n}^{m}=f_{m}(X_{n})$ and $X^{m}=f_{m}(X)$. Since $f_{m}$ is uniformly continuous, the first part of the proof shows that $X_{n}^{m}$ converges to $X^{m}$ in probability for each $m\geq 0$.

We now show that $g(X_{n})$ converges to $g(X)$ in probability by a four-part estimate. Let $\epsilon>0$ and $\delta>0$ be given. On the event $\{\lVert X_{n}\rVert<m\}\cap\{\lVert X\rVert<m\}$ we have $X_{n}^{m}=X_{n}$ and $X^{m}=X$; hence, for any $m\geq 0$ (to be chosen later),

 $\mathbb{P}\bigl(\lVert g(X_{n})-g(X)\rVert\geq\delta\bigr)\leq\mathbb{P}\bigl(\lVert g(X_{n}^{m})-g(X^{m})\rVert\geq\delta\bigr)+\mathbb{P}\bigl(\lVert X_{n}\rVert\geq m\bigr)+\mathbb{P}\bigl(\lVert X\rVert\geq m\bigr)\,.$

Choose $M$ such that for $m\geq M$,

 $\mathbb{P}\bigl(\lVert X\rVert\geq m\bigr)\leq\mathbb{P}\bigl(\lVert X\rVert\geq M\bigr)<\frac{\epsilon}{4}\,.$

(This is possible since, by continuity from above of the probability measure, $\lim_{m\to\infty}\mathbb{P}\bigl(\lVert X\rVert\geq m\bigr)=\mathbb{P}\bigl(\bigcap_{m=0}^{\infty}\{\lVert X\rVert\geq m\}\bigr)=\mathbb{P}(\emptyset)=0$.)

In particular, take $m=M+1$. Since $X_{n}^{m}$ converges in probability to $X^{m}$, and $X_{n}^{m}$, $X^{m}$ are bounded by $m$, $g(X_{n}^{m})$ converges in probability to $g(X^{m})$. Hence, for $n$ large enough,

 $\mathbb{P}\bigl(\lVert g(X_{n}^{m})-g(X^{m})\rVert\geq\delta\bigr)<\frac{\epsilon}{4}\,.$

Finally, since $\lVert X_{n}\rVert\leq\lVert X_{n}-X\rVert+\lVert X\rVert$, the event $\{\lVert X_{n}\rVert\geq M+1\}$ is contained in $\{\lVert X_{n}-X\rVert\geq 1\}\cup\{\lVert X\rVert\geq M\}$. As $X_{n}$ converges to $X$ in probability, we have

 $\mathbb{P}\bigl(\lVert X_{n}\rVert\geq m\bigr)=\mathbb{P}\bigl(\lVert X_{n}\rVert\geq M+1\bigr)\leq\mathbb{P}\bigl(\lVert X_{n}-X\rVert\geq 1\bigr)+\mathbb{P}\bigl(\lVert X\rVert\geq M\bigr)<\frac{\epsilon}{4}+\frac{\epsilon}{4}$

for large enough $n$.

Collecting the previous inequalities together, we have

 $\mathbb{P}\bigl(\lVert g(X_{n})-g(X)\rVert\geq\delta\bigr)<\frac{\epsilon}{4}+\frac{\epsilon}{4}+\frac{\epsilon}{4}+\frac{\epsilon}{4}=\epsilon$

for large enough $n$. ∎
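The theorem can be illustrated numerically. Below is a Monte Carlo sketch of my own (not part of the proof): $X$ is standard normal, $X_{n}=X+Z_{n}/n$ with independent standard normal noise $Z_{n}$, and $g(x)=x^{2}$, which is continuous but not uniformly continuous on the real line. The estimated probability $\mathbb{P}(\lvert g(X_{n})-g(X)\rvert\geq\epsilon)$ shrinks as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000          # number of sample points
eps = 0.1            # threshold in the convergence-in-probability statement
X = rng.standard_normal(N)

probs = {}
for n in (1, 10, 100):
    Xn = X + rng.standard_normal(N) / n          # X_n -> X in probability
    # Empirical estimate of P(|g(X_n) - g(X)| >= eps) with g(x) = x**2.
    probs[n] = np.mean(np.abs(Xn**2 - X**2) >= eps)
    print(f"n = {n:3d}:  P(|g(X_n) - g(X)| >= {eps}) ~ {probs[n]:.4f}")
```

The estimates decrease toward zero with $n$, as the theorem predicts, even though $g(x)=x^{2}$ fails to be uniformly continuous; this is exactly the case the truncation argument handles.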
