convergence in probability
Let $\{X_i\}$ be a sequence of random variables defined on a probability space $(\Omega,\mathcal{F},P)$ taking values in a separable metric space $(Y,d)$, where $d$ is the metric. Then we say the sequence $X_i$ converges in probability, or converges in measure, to a random variable $X$ if for every $\epsilon >0$,
$$\lim_{i\to\infty} P(d(X_i,X)\ge \epsilon)=0.$$
We denote convergence in probability of $X_i$ to $X$ by
$$X_i \stackrel{pr}{\longrightarrow} X.$$
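As an illustrative sketch (not part of the formal definition), the tail probability $P(d(X_i,X)\ge\epsilon)$ can be estimated by Monte Carlo simulation. Here we take $X_i$ to be the mean of $i$ i.i.d. Uniform$(0,1)$ draws, which converges in probability to $X = 1/2$ by the weak law of large numbers; the function name and parameters below are illustrative choices, not from the entry itself.

```python
import random

def tail_probability(i, eps=0.1, trials=2000, seed=0):
    """Estimate P(|X_i - 1/2| >= eps) by Monte Carlo, where X_i is
    the sample mean of i iid Uniform(0,1) draws."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        x_i = sum(rng.random() for _ in range(i)) / i
        if abs(x_i - 0.5) >= eps:
            hits += 1
    return hits / trials

# The estimated tail probability shrinks as i grows, consistent with
# X_i -> 1/2 in probability (weak law of large numbers).
for i in (10, 100, 1000):
    print(i, tail_probability(i))
```

For fixed $\epsilon$, the printed estimates decrease toward $0$ as $i$ grows, which is exactly the limit appearing in the definition above.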
Equivalently, $X_i \stackrel{pr}{\longrightarrow} X$ iff every subsequence of $\{X_i\}$ contains a further subsequence which converges to $X$ almost surely.
Remarks.

•
Unlike ordinary convergence, $X_i \stackrel{pr}{\longrightarrow} X$ and $X_i \stackrel{pr}{\longrightarrow} Y$ imply only that $X=Y$ almost surely; limits in probability are unique only up to almost sure equality.

•
Separability of $Y$ is needed to ensure that $d(X_i,X)$ is measurable, hence a random variable, for all random variables $X_i$ and $X$.

•
Convergence almost surely implies convergence in probability but not conversely.
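The failure of the converse is usually shown with the standard counterexample of independent indicators $X_i$ with $P(X_i=1)=1/i$ and $P(X_i=0)=1-1/i$. The sketch below (an illustration, with hypothetical function names) estimates the tail probability by simulation; the comment records why almost sure convergence fails.

```python
import random

def indicator_tail(i, trials=5000, seed=0):
    """Estimate P(X_i >= 1/2) for X_i = 1 with probability 1/i, else 0.
    The exact value is 1/i, which tends to 0, so X_i -> 0 in probability."""
    rng = random.Random(seed)
    return sum(rng.random() < 1 / i for _ in range(trials)) / trials

# P(X_i >= eps) = 1/i -> 0, so X_i -> 0 in probability.  Yet if the X_i
# are independent, sum(1/i) diverges, so the second Borel-Cantelli lemma
# gives X_i = 1 infinitely often almost surely: the sequence does NOT
# converge to 0 almost surely.
for i in (10, 100, 1000):
    print(i, indicator_tail(i))
```

The simulation confirms the in-probability convergence; the almost sure failure is a theoretical fact (second Borel-Cantelli lemma) and cannot be read off a finite simulation.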
Title: convergence in probability
Canonical name: ConvergenceInProbability
Date of creation: 2013-03-22 15:01:05
Last modified on: 2013-03-22 15:01:05
Owner: CWoo (3771)
Last modified by: CWoo (3771)
Numerical id: 7
Author: CWoo (3771)
Entry type: Definition
Classification: msc 60B10
Synonyms: converge in probability; converges in measure; converge in measure; convergence in measure
Defines: converges in probability