# convergence in probability

Let $\{X_{i}\}$ be a sequence of random variables defined on a probability space $(\Omega,\mathcal{F},P)$, taking values in a separable metric space $(Y,d)$, where $d$ is the metric. We say that the sequence $\{X_{i}\}$ converges in probability (or converges in measure) to a random variable $X$ if for every $\varepsilon>0$,

 $\lim_{i\rightarrow\infty}P(d(X_{i},X)\geq\varepsilon)=0.$

We denote convergence in probability of $X_{i}$ to $X$ by

 $X_{i}\xrightarrow{pr}X.$
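The definition can be illustrated numerically. As a hedged sketch (not part of the original article), the weak law of large numbers says that the sample mean of $n$ fair coin flips converges in probability to $1/2$, so the Monte Carlo estimate of $P(|S_{n}/n-1/2|\geq\varepsilon)$ below should shrink toward $0$ as $n$ grows; the function name and parameters are illustrative choices:

```python
import random

def prob_large_deviation(n, eps, trials=2000):
    """Monte Carlo estimate of P(|S_n/n - 1/2| >= eps), where S_n/n is
    the sample mean of n fair coin flips (a hypothetical helper used
    only to illustrate convergence in probability)."""
    count = 0
    for _ in range(trials):
        mean = sum(random.random() < 0.5 for _ in range(n)) / n
        if abs(mean - 0.5) >= eps:
            count += 1
    return count / trials

# By the weak law of large numbers, the sample mean converges in
# probability to 1/2, so these estimates should decrease toward 0.
for n in (10, 100, 1000):
    print(n, prob_large_deviation(n, eps=0.05))
```

Here $\varepsilon=0.05$ is fixed while $n$ increases, mirroring the order of quantifiers in the definition: for each fixed $\varepsilon$, the deviation probability tends to $0$.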

Equivalently, $X_{i}\xrightarrow{pr}X$ if and only if every subsequence of $\{X_{i}\}$ contains a further subsequence that converges to $X$ almost surely.

Remarks.

• Unlike ordinary convergence, limits in probability are unique only up to almost-sure equality: $X_{i}\xrightarrow{pr}X$ and $X_{i}\xrightarrow{pr}Y$ together imply only that $X=Y$ almost surely.

• Separability of $Y$ ensures that $d(X_{i},X)$ is measurable, and hence a random variable, for all random variables $X_{i}$ and $X$.

• Almost sure convergence implies convergence in probability, but the converse fails in general.
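A standard counterexample for the last remark (sketched here as a simulation; the example itself is classical, the code is illustrative): take independent $X_{n}$ with $P(X_{n}=1)=1/n$ and $X_{n}=0$ otherwise. Then $P(|X_{n}-0|\geq\varepsilon)=1/n\to 0$, so $X_{n}\xrightarrow{pr}0$; but since $\sum 1/n$ diverges and the $X_{n}$ are independent, the second Borel–Cantelli lemma gives $X_{n}=1$ infinitely often almost surely, so the sequence does not converge to $0$ almost surely.

```python
import random

random.seed(1)

# Independent X_n with P(X_n = 1) = 1/n, else 0.
# P(|X_n - 0| >= eps) = 1/n -> 0, so X_n -> 0 in probability; but
# sum(1/n) diverges, so by the second Borel-Cantelli lemma a sample
# path returns to 1 infinitely often almost surely.
def sample_path(N):
    return [1 if random.random() < 1.0 / n else 0 for n in range(1, N + 1)]

path = sample_path(100_000)
ones = [n + 1 for n, x in enumerate(path) if x == 1]
print("total ones:", len(ones))
print("latest one at index:", ones[-1])
```

The expected number of ones up to $N$ is $\sum_{n=1}^{N}1/n\approx\ln N$, so the count grows without bound even though each individual $P(X_{n}=1)$ is small, which is exactly the gap between the two modes of convergence.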
