expected value
Let us first consider a discrete random variable $X$ with values in $\mathbb{R}$. Then $X$ has values in an at most countable set $\{x_i : i \in I\}$. For $i \in I$ denote the probability that $X = x_i$ by $p_i$. If

$$\sum_{i \in I} |x_i| p_i$$

converges, the sum

$$\sum_{i \in I} x_i p_i$$

is well-defined. Its value is called the expected value, expectation or mean of $X$. It is usually denoted by $E(X)$.
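For example, if $X$ is the outcome of a fair six-sided die, then $x_i = i$ and $p_i = 1/6$ for $i = 1, \dots, 6$, and

$$E(X) = \sum_{i=1}^{6} i \cdot \frac{1}{6} = \frac{21}{6} = 3.5.$$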
Taking this idea further, we can easily generalize to a continuous random variable $X$ with probability density $\rho$ by setting

$$E(X) = \int_{-\infty}^{\infty} x \rho(x) \, dx,$$

if this integral exists.
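For instance, if $X$ is uniformly distributed on $[0,1]$, so that $\rho(x) = 1$ for $x \in [0,1]$ and $\rho(x) = 0$ otherwise, then

$$E(X) = \int_0^1 x \, dx = \frac{1}{2}.$$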
From the above definition it is clear that the expectation is a linear function, i.e. for two random variables $X$ and $Y$ we have

$$E(aX + bY) = a\,E(X) + b\,E(Y)$$

for $a, b \in \mathbb{R}$.
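Notably, linearity requires no independence assumption. For example, if $X$ and $Y$ are the outcomes of two fair dice, even if thrown in a dependent manner, then $E(X + Y) = E(X) + E(Y) = 3.5 + 3.5 = 7$, without any computation involving the joint distribution.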
Note that the expectation does not always exist: if the corresponding sum or integral does not converge absolutely, the expectation is undefined. One example of this situation is the Cauchy random variable.
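Indeed, the Cauchy density is $\rho(x) = \frac{1}{\pi(1+x^2)}$, and

$$\int_{-\infty}^{\infty} \frac{|x|}{\pi(1+x^2)} \, dx = \frac{2}{\pi} \int_0^{\infty} \frac{x}{1+x^2} \, dx = \infty,$$

so the defining integral fails to converge absolutely.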
Using the measure-theoretic formulation of stochastics, we can give a more formal definition. Let $(\Omega, \mathcal{F}, P)$ be a probability space and $X \colon \Omega \to \mathbb{R}$ a random variable. We now define

$$E(X) = \int_{\Omega} X \, dP,$$

where the integral is understood as the Lebesgue integral with respect to the measure $P$.
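This definition contains the previous ones as special cases: for a discrete random variable taking the value $x_i$ exactly on the event $A_i = \{\omega \in \Omega : X(\omega) = x_i\}$, the Lebesgue integral reduces to

$$E(X) = \int_{\Omega} X \, dP = \sum_{i \in I} x_i P(A_i) = \sum_{i \in I} x_i p_i,$$

which is the sum from the discrete definition.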
| Title | expected value |
| --- | --- |
| Canonical name | ExpectedValue |
| Date of creation | 2013-03-22 11:53:42 |
| Last modified on | 2013-03-22 11:53:42 |
| Owner | mathwizard (128) |
| Last modified by | mathwizard (128) |
| Numerical id | 21 |
| Author | mathwizard (128) |
| Entry type | Definition |
| Classification | msc 60-00 |
| Classification | msc 46L05 |
| Classification | msc 82-00 |
| Classification | msc 83-00 |
| Classification | msc 81-00 |
| Synonym | mean |
| Synonym | expectation value |
| Synonym | expectation |
| Related topic | AverageValueOfFunction |