# expected value

Let us first consider a discrete random variable $X$ with values in $\mathbb{R}$. Then $X$ takes values in an at most countable set $\mathcal{X}$. For $x\in\mathcal{X}$, denote the probability that $X=x$ by $P_{x}$. If

$\sum_{x\in\mathcal{X}}|x|P_{x}<\infty,$

then the sum

$\sum_{x\in\mathcal{X}}xP_{x}$

is well-defined. Its value is called the *expected value*, *expectation* or *mean* of $X$. It is usually denoted by $E(X)$.
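The discrete definition is easy to check numerically. Below is a minimal Python sketch (the helper name `expected_value` is mine, not part of the entry) computing the mean of a fair six-sided die:

```python
from fractions import Fraction

def expected_value(pmf):
    """Expected value of a discrete random variable given as {x: P(X = x)}."""
    return sum(x * p for x, p in pmf.items())

# Fair six-sided die: each outcome 1..6 has probability 1/6.
die = {x: Fraction(1, 6) for x in range(1, 7)}
print(expected_value(die))  # 7/2
```

Using exact `Fraction` arithmetic gives $E(X)=\sum_{x=1}^{6}x\cdot\frac{1}{6}=\frac{7}{2}$, as expected.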

Taking this idea further, we can easily generalize to a continuous random variable $X$ with probability density $\varrho$ by setting

$E(X)=\int_{-\infty}^{\infty}x\varrho(x)\,dx,$

if this integral exists.
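For a concrete continuous case, the defining integral can be approximated numerically. The sketch below (a plain midpoint rule, chosen only for illustration) uses the exponential density $\varrho(x)=\lambda e^{-\lambda x}$ on $[0,\infty)$, whose exact mean is $1/\lambda$:

```python
import math

def expectation(density, a, b, n=200_000):
    """Approximate E(X) = integral of x * density(x) over [a, b], midpoint rule."""
    h = (b - a) / n
    return sum((a + (i + 0.5) * h) * density(a + (i + 0.5) * h)
               for i in range(n)) * h

# Exponential density with rate lam = 2; the exact mean is 1/lam = 0.5.
lam = 2.0
rho = lambda x: lam * math.exp(-lam * x)
print(expectation(rho, 0.0, 50.0))  # ≈ 0.5
```

Truncating the integral at $b=50$ is harmless here because the density decays exponentially; the tail contributes on the order of $e^{-100}$.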

From the above definition it is clear that the expectation is linear, i.e. for two random variables $X,Y$ we have

$E(aX+bY)=aE(X)+bE(Y)$

for $a,b\in\mathbb{R}$.
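Note that linearity holds even when $X$ and $Y$ are dependent, since it comes from linearity of sums and integrals. A quick exact check on a small joint distribution (invented here purely for illustration):

```python
from fractions import Fraction

# Joint distribution of a dependent pair (X, Y); probabilities sum to 1.
joint = {(0, 1): Fraction(1, 4), (1, 1): Fraction(1, 4), (1, 3): Fraction(1, 2)}

a, b = 3, -2
lhs = sum((a * x + b * y) * p for (x, y), p in joint.items())  # E(aX + bY)
ex = sum(x * p for (x, y), p in joint.items())                 # E(X)
ey = sum(y * p for (x, y), p in joint.items())                 # E(Y)
print(lhs == a * ex + b * ey)  # True
```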

Note that the expectation does not always exist: if the corresponding sum or integral does not converge absolutely, the expectation is undefined. A standard example of this situation is the Cauchy random variable.
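For instance, for the standard Cauchy density $\varrho(x)=\frac{1}{\pi(1+x^{2})}$, the absolute-moment integral already diverges:

$\int_{-\infty}^{\infty}|x|\,\frac{1}{\pi(1+x^{2})}\,dx=\frac{2}{\pi}\int_{0}^{\infty}\frac{x}{1+x^{2}}\,dx=\frac{1}{\pi}\Big[\ln(1+x^{2})\Big]_{0}^{\infty}=\infty,$

so $E(X)$ does not exist in this case.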

Using the measure theoretical formulation of stochastics, we can give a more formal definition. Let $(\Omega,\mathcal{A},P)$ be a probability space and $X:\Omega\to\mathbb{R}$ a random variable. We now define

$E(X)=\int_{\Omega}X\,dP,$

where the integral is understood as the Lebesgue integral with respect to the measure $P$.
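As a sanity check, this recovers the discrete definition above: if $X$ takes values in an at most countable set $\mathcal{X}$, then $X$ is constant on each event $\{X=x\}$, and the Lebesgue integral reduces to the sum

$E(X)=\int_{\Omega}X\,dP=\sum_{x\in\mathcal{X}}x\,P(X=x)=\sum_{x\in\mathcal{X}}xP_{x}.$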

## Mathematics Subject Classification

60-00, 46L05, 82-00, 83-00, 81-00
