expected value


Let us first consider a discrete random variable $X$ with values in $\mathbb{R}$. Then $X$ has values in an at most countable set $\mathcal{X}$. For $x \in \mathcal{X}$, denote the probability that $X = x$ by $P_x$. If

$$\sum_{x \in \mathcal{X}} |x| \, P_x$$

converges, the sum

$$\sum_{x \in \mathcal{X}} x \, P_x$$

is well-defined. Its value is called the expected value, expectation, or mean of $X$. It is usually denoted by $E(X)$.
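For example, if $X$ is the outcome of rolling a fair six-sided die, then $\mathcal{X} = \{1, 2, \dots, 6\}$ and $P_x = 1/6$ for every $x$, so the sum above is finite and

$$E(X) = \sum_{x=1}^{6} x \cdot \frac{1}{6} = \frac{21}{6} = 3.5.$$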

Taking this idea further, we can easily generalize to a continuous random variable $X$ with probability density $\varrho$ by setting

$$E(X) = \int_{-\infty}^{\infty} x \, \varrho(x) \, dx,$$

if this integral exists.
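As a concrete illustration, consider the exponential density $\varrho(x) = \lambda e^{-\lambda x}$ for $x \ge 0$ (and $\varrho(x) = 0$ for $x < 0$), where $\lambda > 0$ is a parameter of this example. Integration by parts gives

$$E(X) = \int_{0}^{\infty} x \, \lambda e^{-\lambda x} \, dx = \frac{1}{\lambda},$$

so the integral exists and the expectation equals $1/\lambda$.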

From the above definition it is clear that the expectation is a linear function, i.e. for two random variables $X, Y$ we have

$$E(aX + bY) = a \, E(X) + b \, E(Y)$$

for $a, b \in \mathbb{R}$.
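In the discrete case this can be checked directly from the definition, provided both expectations exist so that the sums may be rearranged. Writing $P_{x,y}$ for the joint probability $P(X = x, Y = y)$ (a notation introduced only for this sketch),

$$E(aX + bY) = \sum_{x,y} (ax + by) \, P_{x,y} = a \sum_{x} x \sum_{y} P_{x,y} + b \sum_{y} y \sum_{x} P_{x,y} = a \, E(X) + b \, E(Y).$$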

Note that the expectation does not always exist: if the corresponding sum or integral does not converge, the expectation is undefined. One example of this situation is the Cauchy random variable.
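Indeed, for the standard Cauchy density $\varrho(x) = \frac{1}{\pi(1 + x^2)}$ the defining integral fails to converge absolutely:

$$\int_{-\infty}^{\infty} |x| \, \varrho(x) \, dx = \frac{2}{\pi} \int_{0}^{\infty} \frac{x}{1 + x^2} \, dx = \frac{1}{\pi} \lim_{R \to \infty} \ln(1 + R^2) = \infty,$$

so no expected value can be assigned.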

Using the measure-theoretic formulation of probability, we can give a more formal definition. Let $(\Omega, \mathcal{A}, P)$ be a probability space and $X \colon \Omega \to \mathbb{R}$ a random variable. We now define

$$E(X) = \int_{\Omega} X \, dP,$$

where the integral is understood as the Lebesgue integral with respect to the measure $P$.
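For a discrete random variable as above, this Lebesgue integral reduces to the first definition: the sample space splits into the events $\{X = x\}$ for $x \in \mathcal{X}$, and

$$\int_{\Omega} X \, dP = \sum_{x \in \mathcal{X}} x \, P(X = x) = \sum_{x \in \mathcal{X}} x \, P_x,$$

again provided the sum converges absolutely.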
