# proof of Kolmogorov’s strong law for IID random variables

Kolmogorov’s strong law for square integrable random variables states that if $X_{1},X_{2},\ldots$ is a sequence of independent random variables with $\sum_{n}\operatorname{Var}[X_{n}]/n^{2}<\infty$, then $n^{-1}\sum_{k=1}^{n}(X_{k}-\mathbb{E}[X_{k}])$ converges to zero with probability one as $n\rightarrow\infty$ (see the martingale proof of Kolmogorov’s strong law for square integrable variables). We show that the following version of the strong law for IID random variables follows from this.

###### Theorem (Kolmogorov).

Let $X_{1},X_{2},\ldots$ be independent and identically distributed random variables with $\mathbb{E}[|X_{n}|]<\infty$. Then, $n^{-1}\sum_{k=1}^{n}(X_{k}-\mathbb{E}[X_{k}])\rightarrow 0$ as $n\rightarrow\infty$, with probability one.
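As an illustrative aside (not part of the proof), the theorem can be checked numerically. The sketch below uses heavy-tailed Pareto samples, which have finite mean but infinite variance; the distribution, seed, and sample sizes are arbitrary illustrative choices.

```python
import numpy as np

# Monte Carlo sketch of Kolmogorov's strong law.  Generator.pareto draws
# from the Lomax (Pareto II) distribution with density a/(1+x)^(a+1),
# which for shape a = 1.5 has finite mean 1/(a-1) = 2 but infinite
# variance, so the square integrable law does not apply directly.
rng = np.random.default_rng(0)
a = 1.5
mean = 1.0 / (a - 1.0)  # E[X_k] = 2 for this choice of a

n = 10**6
x = rng.pareto(a, size=n)
# running averages n^{-1} sum_{k<=n} (X_k - E[X_k])
partial_means = np.cumsum(x - mean) / np.arange(1, n + 1)

for k in (10**3, 10**4, 10**5, 10**6):
    print(k, partial_means[k - 1])
```

The printed averages drift toward zero, though convergence is slow for such heavy tails.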

Note that here, the random variables $X_{n}$ are not necessarily square integrable. Let us set $\tilde{X}_{n}=X_{n}-\mathbb{E}[X_{n}]$, so that $\tilde{X}_{n}$ are IID random variables with $\mathbb{E}[\tilde{X}_{n}]=0$. Then, set

 $Y_{n}=\left\{\begin{array}[]{ll}\tilde{X}_{n},&\textrm{if }\left|\tilde{X}_{n}\right|<n,\\ 0,&\textrm{otherwise.}\end{array}\right.$
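As a brief computational aside (not part of the argument), the truncation can be sketched as follows; the centred Pareto distribution and sample size are arbitrary illustrative choices.

```python
import numpy as np

# Sketch of the truncation in the proof: Y_n = X~_n if |X~_n| < n, else 0.
rng = np.random.default_rng(2)
n_max = 10**5
xt = rng.pareto(1.5, size=n_max) - 2.0   # centred IID samples: mean 0
idx = np.arange(1, n_max + 1)
y = np.where(np.abs(xt) < idx, xt, 0.0)  # truncated sequence Y_n
print("terms altered by truncation:", np.sum(y != xt))
```

Only a handful of early terms are altered, anticipating the final step of the proof, where $\tilde{X}_{n}=Y_{n}$ for all large $n$ almost surely.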

Using the fact that $\tilde{X}_{n}$ has the same distribution as $\tilde{X}_{1}$ gives

 $\begin{split}\displaystyle\sum_{n}\mathbb{E}[Y_{n}^{2}]/n^{2}&\displaystyle=\sum_{n}\mathbb{E}\left[1_{\{|\tilde{X}_{n}|<n\}}\tilde{X}_{n}^{2}\right]/n^{2}\\ &\displaystyle=\mathbb{E}\left[\sum_{n}1_{\{|\tilde{X}_{1}|<n\}}\tilde{X}_{1}^{2}/n^{2}\right].\end{split}$ (1)

Letting $N$ be the smallest integer greater than $|\tilde{X}_{1}|$,

 $\begin{split}\displaystyle\sum_{n}1_{\{|\tilde{X}_{1}|<n\}}\tilde{X}_{1}^{2}/n^{2}&\displaystyle=\tilde{X}_{1}^{2}\sum_{n\geq N}n^{-2}\\ &\displaystyle\leq\tilde{X}_{1}^{2}\,\frac{2}{N}\leq 2|\tilde{X}_{1}|,\end{split}$

where we used the bound $\sum_{n\geq N}n^{-2}\leq N^{-2}+\int_{N}^{\infty}x^{-2}\,dx\leq 2/N$ and the inequality $|\tilde{X}_{1}|\leq N$.
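The elementary tail bound $\sum_{n\geq N}n^{-2}\leq 2/N$ used in this step can be checked numerically (a sketch; the finite cutoff standing in for the infinite sum is an arbitrary choice):

```python
import numpy as np

def tail_sum(N, cutoff=10**7):
    # partial sum of 1/n^2 over N <= n <= cutoff;
    # the neglected tail beyond the cutoff is smaller than 1/cutoff
    n = np.arange(N, cutoff + 1, dtype=float)
    return np.sum(1.0 / n**2)

for N in (1, 2, 5, 10, 100):
    print(N, tail_sum(N), 2.0 / N)
```

For $N=1$ the sum is $\pi^{2}/6\approx 1.645\leq 2$, and the bound holds for every $N$ printed.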

So, putting this into equation (1),

 $\sum_{n}\operatorname{Var}[Y_{n}]/n^{2}\leq\sum_{n}\mathbb{E}[Y_{n}^{2}]/n^{2}\leq\mathbb{E}[2|\tilde{X}_{1}|]<\infty.$

Therefore, the $Y_{n}$ are independent (each being a function of $X_{n}$ alone) and satisfy the required properties to apply the strong law for square integrable random variables,

 $n^{-1}\sum_{k=1}^{n}(Y_{k}-\mathbb{E}[Y_{k}])\rightarrow 0$ (2)

as $n\rightarrow\infty$, with probability one. Also,

 $\mathbb{E}[Y_{n}]=\mathbb{E}[Y_{n}-\tilde{X}_{n}]=-\mathbb{E}[1_{\{|\tilde{X}_{n}|\geq n\}}\tilde{X}_{n}]=-\mathbb{E}[1_{\{|\tilde{X}_{1}|\geq n\}}\tilde{X}_{1}]$

converges to $0$ as $n\rightarrow\infty$, by the dominated convergence theorem, since $|1_{\{|\tilde{X}_{1}|\geq n\}}\tilde{X}_{1}|\leq|\tilde{X}_{1}|$ is integrable. So, by Cesàro convergence, $n^{-1}\sum_{k=1}^{n}\mathbb{E}[Y_{k}]\rightarrow 0$, and the $\mathbb{E}[Y_{k}]$ terms in (2) vanish in the limit, giving

 $n^{-1}\sum_{k=1}^{n}Y_{k}\rightarrow 0$ (3)

as $n\rightarrow\infty$ with probability one.
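As another illustrative aside (again with arbitrary choices of distribution, seed, and sample size), the truncation error $\mathbb{E}[1_{\{|\tilde{X}_{1}|\geq n\}}\tilde{X}_{1}]$ appearing in the dominated convergence step can be estimated by Monte Carlo:

```python
import numpy as np

# Monte Carlo sketch of the dominated convergence step: estimates of the
# truncation error E[1_{|X~_1| >= n} X~_1] as n grows.  Centred Pareto
# samples (mean 1/(1.5-1) = 2 before centring) are an arbitrary choice.
rng = np.random.default_rng(1)
xt = rng.pareto(1.5, size=10**6) - 2.0   # centred samples: E[X~_1] = 0
for n in (1, 10, 100, 1000):
    est = np.mean(np.where(np.abs(xt) >= n, xt, 0.0))
    print(n, est)
```

The estimates are increasingly dominated by a shrinking set of large samples; for heavy-tailed distributions the decay is visible but slow.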

We finally note that

 $\mathbb{E}\left[\sum_{n}1_{\{\tilde{X}_{n}\not=Y_{n}\}}\right]=\mathbb{E}\left[\sum_{n}1_{\{|\tilde{X}_{1}|\geq n\}}\right]\leq\mathbb{E}[|\tilde{X}_{1}|]<\infty,$

so $\sum_{n}1_{\{\tilde{X}_{n}\not=Y_{n}\}}<\infty$ and hence $\tilde{X}_{n}=Y_{n}$ for all large $n$, with probability one. Consequently, $n^{-1}\sum_{k=1}^{n}(\tilde{X}_{k}-Y_{k})$ is eventually $n^{-1}$ times a fixed finite sum, so it converges to zero. Thus, $Y_{k}$ can be replaced by $\tilde{X}_{k}$ in (3), giving the result.
