# existence of the conditional expectation

Let $(\Omega,\mathcal{F},\mathbb{P})$ be a probability space and $X$ be a random variable. For any $\sigma$-algebra $\mathcal{G}\subseteq\mathcal{F}$, we show the existence of the conditional expectation $\mathbb{E}[X\mid\mathcal{G}]$. Although it is possible to do this using the Radon-Nikodym theorem, a different approach is used here which relies on the completeness of the vector space $L^2$.
The defining property of the conditional expectation $Y=\mathbb{E}[X\mid \mathcal{G}]$ is

$$\mathbb{E}[1_GY]=\mathbb{E}[1_GX]\qquad(1)$$

for sets $G\in \mathcal{G}$. We shall prove the existence of the conditional expectation for all nonnegative random variables and, more generally, whenever $\mathbb{E}[|X|\mid \mathcal{G}]$ is almost surely finite.
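The defining property (1) can be checked concretely on a finite probability space, where $\mathcal{G}$ is generated by a partition and the conditional expectation is the atom-wise average. The following is a minimal sketch with illustrative values not taken from the text, assuming the uniform measure:

```python
from fractions import Fraction

# Illustrative finite example: Omega = {0,...,5} with the uniform
# measure, and G generated by the partition {0,1}, {2,3}, {4,5}.
p = Fraction(1, 6)
X = {0: 3, 1: 1, 2: 4, 3: 2, 4: 0, 5: 6}
partition = [{0, 1}, {2, 3}, {4, 5}]

# On a finite space, E[X | G] is constant on each atom of the partition,
# equal to the average of X over that atom (under the uniform measure).
Y = {}
for atom in partition:
    avg = Fraction(sum(X[w] for w in atom), len(atom))
    for w in atom:
        Y[w] = avg

def expect_on(G, Z):
    """E[1_G Z] with respect to the uniform measure."""
    return sum(p * Z[w] for w in G)

# The defining property (1): E[1_G Y] = E[1_G X]. Checking it on the
# atoms suffices, since every set in G is a union of atoms.
for atom in partition:
    assert expect_on(atom, Y) == expect_on(atom, X)
```

Exact rational arithmetic via `Fraction` keeps the equalities in (1) exact rather than approximate.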

First, the conditional expectation of every square-integrable random variable exists.

###### Theorem 1.

Suppose that $\mathbb{E}[X^2]<\infty$. Then there is a $\mathcal{G}$-measurable random variable $Y$ satisfying $\mathbb{E}[Y^2]<\infty$ such that equation (1) is satisfied for all $G\in\mathcal{G}$.

###### Proof.

Consider the norm $\|Y\|_2\equiv\mathbb{E}[Y^2]^{1/2}$ on the vector space $V=L^2(\Omega,\mathcal{F},\mathbb{P})$ of real-valued random variables $Y$ satisfying $\mathbb{E}[Y^2]<\infty$ (up to $\mathbb{P}$-almost-everywhere equivalence). This norm is given by the inner product

$$\langle Y_1,Y_2\rangle\equiv\mathbb{E}[Y_1Y_2].$$

As $L^p$-spaces are complete, this makes $V$ into a Hilbert space (see also $L^2$-spaces are Hilbert spaces (http://planetmath.org/L2SpacesAreHilbertSpaces)). Then, $U\equiv L^2(\Omega,\mathcal{G},\mathbb{P})$ is a complete, and hence closed, subspace of $V$.

By the existence of orthogonal projections (http://planetmath.org/ProjectionsAndClosedSubspaces) onto closed subspaces of Hilbert spaces, there is an orthogonal projection $\pi\colon V\to U$. In particular, $\langle\pi Y-Y,Z\rangle=0$ for all $Y\in V$ and $Z\in U$. Setting $Y=\pi X$, which is $\mathcal{G}$-measurable, and taking $Z=1_G$ for $G\in\mathcal{G}$ gives

$$\mathbb{E}[1_GY]-\mathbb{E}[1_GX]=\langle 1_G,\pi X-X\rangle=0$$

as required. ∎
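On a finite probability space the projection $\pi$ in this proof can be computed directly, and the orthogonality relation verified by hand. The following is a minimal sketch under assumed illustrative data (uniform measure, four points), not an implementation from the text:

```python
# Sketch of the orthogonal projection pi : V -> U on a finite space
# (uniform measure on 4 points; G generated by the atoms {0,1}, {2,3}).
X = [5.0, 1.0, 2.0, 4.0]
atoms = [(0, 1), (2, 3)]

def inner(u, v):
    """<U, V> = E[U V] under the uniform measure."""
    return sum(a * b for a, b in zip(u, v)) / len(u)

# pi X averages X over each atom: the projection onto U = L^2(Omega, G, P).
piX = [0.0] * len(X)
for atom in atoms:
    avg = sum(X[w] for w in atom) / len(atom)
    for w in atom:
        piX[w] = avg

# Orthogonality: <pi X - X, Z> = 0 for every G-measurable Z; the atom
# indicators span U, so checking them is enough.
resid = [p - x for p, x in zip(piX, X)]
for atom in atoms:
    Z = [1.0 if w in atom else 0.0 for w in range(len(X))]
    assert abs(inner(resid, Z)) < 1e-12
```

Since the atom indicators span $U$, orthogonality of the residual to them is exactly the projection property used in the proof.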

We can now prove the existence of conditional expectations of nonnegative random variables. Note that here there are no integrability conditions on $X$.

###### Theorem 2.

Let $X$ be a nonnegative random variable taking values in $\mathbb{R}\cup\{\infty\}$. Then, there exists a nonnegative $\mathcal{G}$-measurable random variable $Y$ taking values in $\mathbb{R}\cup\{\infty\}$ and satisfying (1) for all $G\in\mathcal{G}$. Furthermore, $Y$ is uniquely defined $\mathbb{P}$-almost everywhere (http://planetmath.org/AlmostSurely).

###### Proof.

First, let $X_n=\min(n,X)$. As this is bounded, Theorem 1 says that the conditional expectations $Y_n=\mathbb{E}[X_n\mid\mathcal{G}]$ exist. Furthermore, as $X_0=0$, we may take $Y_0=0$.
For any $n$, setting $G=\{Y_n>Y_{n+1}\}\in\mathcal{G}$ gives

$$\mathbb{E}[1_G(Y_n-Y_{n+1})]=\mathbb{E}[1_G(X_n-X_{n+1})]\le 0.$$

So $1_G(Y_n-Y_{n+1})$ is a nonnegative random variable with nonpositive expectation, hence is almost surely equal to zero. Therefore, $Y_{n+1}\ge Y_n$ (almost surely) and, by replacing $Y_n$ with the maximum of $Y_1,\dots,Y_n$, we may suppose that $(Y_n)$ is an increasing sequence of random variables. Setting $Y=\sup_nY_n$, the monotone convergence theorem gives

$$\mathbb{E}[1_GY]=\lim_{n\to\infty}\mathbb{E}[1_GY_n]=\lim_{n\to\infty}\mathbb{E}[1_GX_n]=\mathbb{E}[1_GX]$$

as required.

Finally, suppose that $\tilde{Y}$ is also a nonnegative $\mathcal{G}$-measurable random variable satisfying (1). For any $x\in\mathbb{R}$, setting $G=\{\tilde{Y}>Y,\ x>Y\}$, then $1_GY$ is bounded and,

$$\mathbb{E}[1_G(\tilde{Y}-Y)]=\mathbb{E}[1_GX]-\mathbb{E}[1_GX]=0,$$

showing that $\mathbb{P}(G)=0$. Letting $x$ increase to infinity gives $\tilde{Y}\le Y$ (almost surely) and, similarly, $Y\le\tilde{Y}$, so that $Y=\tilde{Y}$ almost surely. ∎
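The truncation argument of this proof can be watched numerically: the truncated conditional expectations increase with $n$, and their supremum diverges exactly where $X$ is infinite. A minimal sketch with assumed illustrative data (uniform measure on four points):

```python
import math

# Sketch of the truncation argument on a finite space (uniform measure
# on 4 points, G generated by {0,1} and {2,3}); illustrative values only.
# X is allowed to take the value +infinity, as in Theorem 2.
X = [1.0, math.inf, 2.0, 4.0]
atoms = [(0, 1), (2, 3)]

def cond_exp(Z):
    """E[Z | G] computed as the average of Z over each atom."""
    out = [0.0] * len(Z)
    for atom in atoms:
        avg = sum(Z[w] for w in atom) / len(atom)
        for w in atom:
            out[w] = avg
    return out

# X_n = min(n, X) is bounded, Y_n = E[X_n | G], and (Y_n) increases;
# Y = sup_n Y_n diverges exactly on the atom where X is infinite.
Y_prev = [0.0] * len(X)
for n in range(1, 101):
    Y_n = cond_exp([min(n, x) for x in X])
    assert all(a <= b for a, b in zip(Y_prev, Y_n))  # monotone in n
    Y_prev = Y_n
```

On the atom containing the infinite value, $Y_n=(1+n)/2$ grows without bound, while on the other atom $Y_n$ stabilizes at $3$ once $n\ge 4$, matching the monotone convergence step.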

Finally, we show existence of the conditional expectation of every random variable $X$ satisfying $\mathbb{E}[|X|\mid\mathcal{G}]<\infty$ almost surely. Note, in particular, that this is satisfied whenever $X$ is integrable, as

$$\mathbb{E}\big[\mathbb{E}[|X|\mid\mathcal{G}]\big]=\mathbb{E}[|X|]<\infty.$$

###### Theorem 3.

Let $X$ be a random variable such that $\mathbb{E}[|X|\mid\mathcal{G}]<\infty$ almost surely. Then, there exists a $\mathcal{G}$-measurable random variable $Y$ such that $|Y|<\infty$ almost surely and (1) is satisfied for every $G\in\mathcal{G}$ with $\mathbb{E}[1_G|X|]<\infty$.

Furthermore, $Y$ is uniquely defined up to $\mathbb{P}$-a.e. equivalence.

###### Proof.

The positive and negative parts ${X}_{+},{X}_{-}$ of $X$ satisfy

$$\mathbb{E}[X_{\pm}\mid\mathcal{G}]\le\mathbb{E}[|X|\mid\mathcal{G}]<\infty$$

almost surely. We can therefore set ${Y}_{\pm}\equiv \mathbb{E}[{X}_{\pm}\mid \mathcal{G}]$ and $Y={Y}_{+}-{Y}_{-}$.

If $G\in\mathcal{G}$ satisfies $\mathbb{E}[1_G|X|]<\infty$, then $\mathbb{E}[1_GX_{\pm}]<\infty$, so $\mathbb{E}[1_GY_{\pm}]<\infty$ and,

$$\mathbb{E}[1_GY]=\mathbb{E}[1_GY_+]-\mathbb{E}[1_GY_-]=\mathbb{E}[1_GX_+]-\mathbb{E}[1_GX_-]=\mathbb{E}[1_GX]$$

as required.

Finally, suppose that $\tilde{Y}$ satisfies the same conditions as $Y$. For any $x\ge 0$ set $G=\{Y_++Y_-\le x,\ \tilde{Y}>Y\}\in\mathcal{G}$. Then,

$$\mathbb{E}[1_G|X|]=\mathbb{E}[1_GX_+]+\mathbb{E}[1_GX_-]=\mathbb{E}[1_GY_+]+\mathbb{E}[1_GY_-]\le x<\infty.$$

So, $\mathbb{E}[1_G|Y|]$ and $\mathbb{E}[1_G|\tilde{Y}|]$ are finite, hence (1) gives

$$\mathbb{E}[1_G(\tilde{Y}-Y)]=\mathbb{E}[1_GX]-\mathbb{E}[1_GX]=0.$$

So $\mathbb{P}(G)=0$ and, letting $x$ increase to infinity, $\tilde{Y}\le Y$ almost surely. Similarly, $Y\le\tilde{Y}$ and therefore $\tilde{Y}=Y$ almost surely. ∎
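The construction in this last proof decomposes $X$ into its positive and negative parts and subtracts their conditional expectations. A minimal sketch on an assumed finite example (uniform measure, illustrative values):

```python
# Sketch of Theorem 3's construction Y = E[X+ | G] - E[X- | G] on a
# finite space (uniform measure, G generated by {0,1} and {2,3}).
X = [3.0, -1.0, -2.0, 6.0]
atoms = [(0, 1), (2, 3)]

def cond_exp(Z):
    """E[Z | G] computed as the average of Z over each atom."""
    out = [0.0] * len(Z)
    for atom in atoms:
        avg = sum(Z[w] for w in atom) / len(atom)
        for w in atom:
            out[w] = avg
    return out

Xp = [max(x, 0.0) for x in X]   # positive part X+
Xm = [max(-x, 0.0) for x in X]  # negative part X-
Y = [a - b for a, b in zip(cond_exp(Xp), cond_exp(Xm))]

# Equation (1) on each atom G: E[1_G Y] = E[1_G X].
n = len(X)
for atom in atoms:
    lhs = sum(Y[w] for w in atom) / n
    rhs = sum(X[w] for w in atom) / n
    assert abs(lhs - rhs) < 1e-12
```

Because conditional expectation is linear on each atom, subtracting the two nonnegative conditional expectations reproduces the atom-wise average of $X$ itself.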

| Title | existence of the conditional expectation |
|---|---|
| Canonical name | ExistenceOfTheConditionalExpectation |
| Date of creation | 2013-03-22 18:39:28 |
| Last modified on | 2013-03-22 18:39:28 |
| Owner | gel (22282) |
| Last modified by | gel (22282) |
| Numerical id | 5 |
| Author | gel (22282) |
| Entry type | Theorem |
| Classification | msc 60A10 |