# Chernoff-Cramér bound

The Chernoff-Cramér inequality is a very general and powerful way of bounding the tail of a random variable. Compared with the well-known Chebyshev bound, which yields inverse polynomial decay inequalities, the Chernoff-Cramér method yields exponential decay inequalities, at the cost of a few more hypotheses on the random variables.

Theorem (Chernoff-Cramér inequality):

Let $\{X_i\}_{i=1}^{n}$ be a collection of independent random variables such that $E\left[e^{tX_i}\right]<\infty$ $\forall i$ for any $t$ in a right neighborhood of $t=0$, i.e. for any $t\in (0,c)$ with $c>0$ (Cramér condition).

Then there exists a function $\Psi(x):[0,\infty)\to\mathbb{R}^{+}$, vanishing at $x=0$, positive, strictly increasing, and strictly convex, such that:

$$\Pr\left\{\sum_{i=1}^{n}\left(X_i-E[X_i]\right)>\epsilon\right\}\le \exp\left(-\Psi(\epsilon)\right)\qquad \forall \epsilon \ge 0$$

Namely, one has:

$$\Psi(x)=\sup_{t\in(0,c)}\left(tx-\psi(t)\right)$$

$$\psi(t)=\sum_{i=1}^{n}\ln E\left[e^{t\left(X_i-E[X_i]\right)}\right]=\sum_{i=1}^{n}\left(\ln E\left[e^{tX_i}\right]-tE\left[X_i\right]\right)$$

that is, $\Psi(x)$ is the Legendre transform of the cumulant generating function of the random variable $\sum_{i=1}^{n}\left(X_i-E[X_i]\right)$.
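As a concrete illustration (not part of the original entry), the bound can be checked numerically. The sketch below assumes i.i.d. Bernoulli($p$) variables, computes $\Psi$ by a crude grid search over $t$, and compares the resulting bound with a Monte Carlo estimate of the tail probability; all function names and parameter choices here are illustrative assumptions.

```python
import math
import random

# Illustrative sketch (an assumption, not from the entry): Chernoff-Cramer
# bound for n i.i.d. Bernoulli(p) variables.
# psi(t) = sum_i (ln E[e^{t X_i}] - t E[X_i]) = n * (ln(1 - p + p e^t) - t p)
# Psi(eps) = sup_{t in (0,c)} (t*eps - psi(t)), approximated by a grid search.

def psi(t, n, p):
    """Cumulant generating function of the centered sum of n Bernoulli(p)."""
    return n * (math.log(1 - p + p * math.exp(t)) - t * p)

def Psi(eps, n, p):
    """Legendre transform sup_t (t*eps - psi(t)) via a crude grid search."""
    t_grid = [i * 0.01 for i in range(1, 1000)]
    return max(t * eps - psi(t, n, p) for t in t_grid)

n, p, eps = 100, 0.5, 10.0
bound = math.exp(-Psi(eps, n, p))

# Monte Carlo estimate of Pr{ sum_i (X_i - E[X_i]) > eps }
random.seed(0)
trials = 20000
hits = sum(
    1 for _ in range(trials)
    if sum(random.random() < p for _ in range(n)) - n * p > eps
)

print(f"Chernoff-Cramer bound: {bound:.4f}")
print(f"Monte Carlo tail prob: {hits / trials:.4f}")
```

The bound is loose here but, as expected, dominates the empirical tail probability while still decaying exponentially in $\Psi(\epsilon)$.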

Remarks:

1) Besides its importance for theoretical questions, the Chernoff-Cramér bound is also the starting point for deriving many deviation
or concentration inequalities, among which the Bernstein (http://planetmath.org/BernsteinInequality), Kolmogorov, Prohorov (http://planetmath.org/ProhorovInequality), Bennett (http://planetmath.org/BennettInequality),
Hoeffding and Chernoff inequalities are worth mentioning. All of these
inequalities are obtained by imposing various further conditions on the random variables $X_i$, which affect the general form of the cumulant generating function $\psi(t)$.

2) Sometimes, instead of bounding the sum of $n$ independent random variables, one needs to estimate their empirical mean, i.e. the quantity $\frac{1}{n}\sum_{i=1}^{n}X_i$; in order to reuse the Chernoff-Cramér bound, it's enough to note that

$$\Pr\left\{\frac{1}{n}\sum_{i=1}^{n}\left(X_i-E[X_i]\right)>\epsilon'\right\}=\Pr\left\{\sum_{i=1}^{n}\left(X_i-E[X_i]\right)>n\epsilon'\right\}$$

so that one has only to replace, in the above stated inequality, $\epsilon$ with $n\epsilon'$.
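To make the substitution concrete, here is a hypothetical helper (the function names and the quadratic rate function are illustrative assumptions, not from the entry) that turns a rate function $\Psi$ for the sum into a bound on the deviation of the empirical mean:

```python
import math

# Hypothetical illustration: bounding the empirical mean reuses the sum
# bound with eps replaced by n * eps_prime.
def mean_deviation_bound(Psi, n, eps_prime):
    """Pr{ (1/n) sum_i (X_i - E[X_i]) > eps_prime } <= exp(-Psi(n * eps_prime))."""
    return math.exp(-Psi(n * eps_prime))

# Example with a quadratic rate function Psi(x) = x^2 / (2n), a
# sub-Gaussian-style placeholder chosen purely for illustration:
n = 100
bound = mean_deviation_bound(lambda x: x ** 2 / (2 * n), n, eps_prime=0.5)
print(bound)
```

Note how the rescaling makes the exponent grow with $n$: for a fixed $\epsilon'$, the bound on the mean's deviation decays exponentially in $n$.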

3) It turns out that the Chernoff-Cramér bound is asymptotically sharp, as Cramér's theorem shows.

| Title | Chernoff-Cramer bound |
|---|---|
| Canonical name | ChernoffCramerBound |
| Date of creation | 2013-03-22 16:09:02 |
| Last modified on | 2013-03-22 16:09:02 |
| Owner | Andrea Ambrosio (7332) |
| Last modified by | Andrea Ambrosio (7332) |
| Numerical id | 46 |
| Author | Andrea Ambrosio (7332) |
| Entry type | Theorem |
| Classification | msc 60E15 |
| Related topic | HoeffdingInequalityForBoundedIndependentRandomVariables |