convergence in probability
Let $(X_n)_{n \ge 1}$ be a sequence of random variables defined on a probability space $(\Omega, \mathcal{F}, P)$ taking values in a separable metric space $(S, d)$, where $d$ is the metric. Then we say the sequence $(X_n)$ *converges in probability* or *converges in measure* to a random variable $X$ if for every $\varepsilon > 0$,

$$\lim_{n \to \infty} P\bigl(d(X_n, X) \ge \varepsilon\bigr) = 0.$$
We denote convergence in probability of $X_n$ to $X$ by

$$X_n \xrightarrow{P} X.$$
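As a concrete illustration (not part of the original entry), take $X_n \sim \mathrm{Bernoulli}(1/n)$ with $S = \mathbb{R}$ and $d(x, y) = |x - y|$. For any $0 < \varepsilon \le 1$ we have $P(|X_n - 0| \ge \varepsilon) = 1/n \to 0$, so $X_n \xrightarrow{P} 0$. A minimal Python sketch estimating this tail probability by Monte Carlo:

```python
import random

def estimate_tail_prob(n, eps=0.5, trials=100_000, seed=0):
    """Monte Carlo estimate of P(|X_n - 0| >= eps) for X_n ~ Bernoulli(1/n).

    Since X_n takes only the values 0 and 1, the event {|X_n| >= eps}
    for 0 < eps <= 1 is exactly {X_n = 1}, whose probability is 1/n.
    """
    rng = random.Random(seed)
    hits = sum(rng.random() < 1.0 / n for _ in range(trials))
    return hits / trials

# The estimated tail probability shrinks like 1/n, witnessing
# convergence in probability of X_n to the constant 0.
for n in (10, 100, 1000):
    print(n, estimate_tail_prob(n))
```

The function name and parameters here are illustrative choices, not standard library API; any fixed $\varepsilon \in (0, 1]$ gives the same event because the variables are $\{0,1\}$-valued.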
Equivalently, $X_n \xrightarrow{P} X$ iff every subsequence of $(X_n)$ contains a further subsequence which converges to $X$ almost surely.
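One direction of this equivalence can be sketched as follows (a standard Borel–Cantelli argument, added here for the reader). If $X_n \xrightarrow{P} X$, then given any subsequence one may choose indices $n_1 < n_2 < \cdots$ along it with

$$P\bigl(d(X_{n_k}, X) \ge 2^{-k}\bigr) \le 2^{-k},$$

so that

$$\sum_{k=1}^{\infty} P\bigl(d(X_{n_k}, X) \ge 2^{-k}\bigr) < \infty.$$

By the Borel–Cantelli lemma, almost surely $d(X_{n_k}, X) < 2^{-k}$ for all but finitely many $k$, and hence $X_{n_k} \to X$ almost surely.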
Remarks.
- Unlike ordinary convergence, $X_n \xrightarrow{P} X$ and $X_n \xrightarrow{P} Y$ only imply that $X = Y$ almost surely.
- The need for separability of $S$ is to ensure that the metric, $d(X, Y)$, is a random variable for all $S$-valued random variables $X$ and $Y$.
- Convergence almost surely implies convergence in probability, but not conversely.
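The classical counterexample behind the last remark (an illustration added here, not from the original entry) is the "typewriter" sequence of sliding indicator blocks on $[0, 1)$ with Lebesgue measure: writing each index $m \ge 1$ as $m = 2^j + k$ with $0 \le k < 2^j$, set $X_m = \mathbf{1}_{[k 2^{-j}, (k+1) 2^{-j})}$. Then $P(X_m \ge \varepsilon) = 2^{-j} \to 0$, so $X_m \xrightarrow{P} 0$, yet for every $\omega$ the sequence $X_m(\omega)$ takes the value $1$ once in each dyadic generation, so it converges at no point. A minimal Python sketch:

```python
def typewriter(m, omega):
    """Value of the m-th 'typewriter' indicator at a point omega in [0, 1).

    Index m >= 1 is written as m = 2**j + k with 0 <= k < 2**j; the m-th
    function is the indicator of the dyadic block [k/2**j, (k+1)/2**j).
    """
    j = m.bit_length() - 1       # the unique j with 2**j <= m < 2**(j+1)
    k = m - 2 ** j
    block = 2.0 ** -j            # width of each block in generation j
    return 1 if k * block <= omega < (k + 1) * block else 0

# At a fixed point omega, exactly one index in each generation j hits it,
# so the sequence takes the value 1 infinitely often and cannot converge
# pointwise -- even though P(X_m = 1) = 2**(-j) -> 0 as m -> infinity.
omega = 0.3
hits = sum(typewriter(m, omega) for m in range(1, 2 ** 12))
print(hits)
```

For indices $m < 2^{12}$ there are twelve generations $j = 0, \dots, 11$, each contributing exactly one hit at any fixed $\omega$.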
| Title | convergence in probability |
| --- | --- |
| Canonical name | ConvergenceInProbability |
| Date of creation | 2013-03-22 15:01:05 |
| Last modified on | 2013-03-22 15:01:05 |
| Owner | CWoo (3771) |
| Last modified by | CWoo (3771) |
| Numerical id | 7 |
| Author | CWoo (3771) |
| Entry type | Definition |
| Classification | msc 60B10 |
| Synonym | converge in probability |
| Synonym | converges in measure |
| Synonym | converge in measure |
| Synonym | convergence in measure |
| Defines | converges in probability |