# Chernoff-Cramér bound

The Chernoff-Cramér inequality is a very general and powerful way of bounding random variables. Compared with the famous Chebyshev bound, which yields inverse-polynomial decay inequalities, the Chernoff-Cramér method yields exponential decay inequalities, at the cost of a few more hypotheses on the random variables.

Theorem: (Chernoff-Cramér inequality)

Let $\{X_{i}\}_{i=1}^{n}$ be a collection of independent random variables such that $E[\exp\left(tX_{i}\right)]<+\infty$ for all $i$ and for all $t$ in a right neighborhood of $t=0$, i.e. for any $t\in(0,c)$ with some $c>0$ (Cramér condition).

Then there exists a function $\Psi(x):[0,\infty)\to\mathbb{R}^{+}$, vanishing at $x=0$, positive for $x>0$, strictly increasing, and strictly convex, such that:

 $\Pr\left\{\sum_{i=1}^{n}\left(X_{i}-E[X_{i}]\right)>\varepsilon\right\}\leq\exp\left(-\Psi(\varepsilon)\right)\ \forall\varepsilon\geq 0$
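This bound follows from the standard Chernoff argument: writing $S=\sum_{i=1}^{n}\left(X_{i}-E[X_{i}]\right)$ and applying Markov's inequality to $e^{tS}$, one gets, for any $t\in(0,c)$,

 $\Pr\left\{S>\varepsilon\right\}=\Pr\left\{e^{tS}>e^{t\varepsilon}\right\}\leq e^{-t\varepsilon}E\left[e^{tS}\right]=\exp\left(-\left(t\varepsilon-\psi(t)\right)\right)$

where $\psi(t)=\ln E\left[e^{tS}\right]$, which by independence factors into a sum over the $X_{i}$'s; taking the supremum over $t\in(0,c)$ then yields the stated bound.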

Namely, one has:

 $\displaystyle\Psi(x)=\sup_{0<t<c}\left(tx-\psi(t)\right)$

 $\displaystyle\psi(t)=\sum_{i=1}^{n}\ln E\left[e^{t\left(X_{i}-EX_{i}\right)}\right]=\sum_{i=1}^{n}\left(\ln E\left[e^{tX_{i}}\right]-tE\left[X_{i}\right]\right)$

that is, $\Psi(x)$ is the Legendre transform of the cumulant generating function of the random variable $\sum_{i=1}^{n}\left(X_{i}-E[X_{i}]\right)$.
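As a concrete numerical illustration (a sketch with an assumed Bernoulli($p$) distribution, which satisfies the Cramér condition; this choice is not part of the theorem), one can write $\psi(t)$ in closed form, approximate the Legendre transform $\Psi$ on a grid of $t$ values, and verify that $\exp(-\Psi(\varepsilon))$ dominates the exact binomial tail probability:

```python
import math

# Numerical check of the Chernoff-Cramér bound for a sum of n independent
# Bernoulli(p) variables (an illustrative choice of distribution).
n, p = 100, 0.3

def psi(t):
    # Cumulant generating function of sum_i (X_i - E[X_i]) for X_i ~ Bernoulli(p):
    # psi(t) = n * (ln(1 - p + p*e^t) - t*p)
    return n * (math.log(1 - p + p * math.exp(t)) - t * p)

def Psi(eps):
    # Legendre transform Psi(eps) = sup_{t>0} (t*eps - psi(t)),
    # approximated here by a maximum over a grid of t values.
    return max(t * eps - psi(t) for t in (0.001 * k for k in range(1, 5000)))

def exact_tail(eps):
    # Pr{ S - n*p > eps } for S ~ Binomial(n, p), computed exactly.
    k_min = math.floor(n * p + eps) + 1
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(k_min, n + 1))

for eps in (5, 10, 15):
    bound = math.exp(-Psi(eps))
    tail = exact_tail(eps)
    assert tail <= bound  # the Chernoff-Cramér bound dominates the true tail
    print(f"eps={eps:2d}  exact tail={tail:.3e}  bound={bound:.3e}")
```

Since the grid supremum is never larger than the true supremum, the computed bound is (slightly) conservative, so the comparison with the exact tail remains valid.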

Remarks:

1) Besides its importance for theoretical questions, the Chernoff-Cramér bound is also the starting point to derive many deviation or concentration inequalities, among which the Bernstein (http://planetmath.org/BernsteinInequality), Kolmogorov, Prohorov (http://planetmath.org/ProhorovInequality), Bennett (http://planetmath.org/BennettInequality), Hoeffding and Chernoff ones are worth mentioning. All of these inequalities are obtained by imposing various further conditions on the $X_{i}$ random variables, which affect the general form of the cumulant generating function $\psi(t)$.
2) Sometimes, instead of bounding the sum of $n$ independent random variables, one needs to estimate their empirical mean, i.e. the quantity $\frac{1}{n}\sum_{i=1}^{n}X_{i}$; in order to reuse the Chernoff-Cramér bound, it's enough to note that

 $\Pr\left\{\frac{1}{n}\sum_{i=1}^{n}\left(X_{i}-E[X_{i}]\right)>\varepsilon^{\prime}\right\}=\Pr\left\{\sum_{i=1}^{n}\left(X_{i}-E[X_{i}]\right)>n\varepsilon^{\prime}\right\}$

so that one has only to replace, in the above stated inequality, $\varepsilon$ with $n\varepsilon^{\prime}$.
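The substitution can be sketched numerically as follows (again with an assumed Bernoulli($p$) distribution, chosen only for illustration): a deviation $\varepsilon^{\prime}$ of the empirical mean is bounded by evaluating $\Psi$ at $n\varepsilon^{\prime}$.

```python
import math

# Sketch: deviation bound for the empirical mean, obtained from the
# Chernoff-Cramér bound by replacing eps with n*eps'.
# Illustrative Bernoulli(p) example (an assumed choice of distribution).
n, p = 100, 0.3

def psi(t):
    # cgf of the centered sum of n Bernoulli(p) variables
    return n * (math.log(1 - p + p * math.exp(t)) - t * p)

def Psi(eps):
    # grid approximation of the Legendre transform sup_{t>0}(t*eps - psi(t))
    return max(t * eps - psi(t) for t in (0.001 * k for k in range(1, 5000)))

eps_mean = 0.05  # deviation of the empirical mean
# Pr{ (1/n) sum_i (X_i - p) > eps_mean } <= exp(-Psi(n * eps_mean))
bound = math.exp(-Psi(n * eps_mean))
print(f"mean-deviation bound for eps'={eps_mean}: {bound:.3e}")
```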
3) It turns out that the Chernoff-Cramér bound is asymptotically sharp, as Cramér's theorem shows.

Title: Chernoff-Cramer bound
Canonical name: ChernoffCramerBound
Date of creation: 2013-03-22 16:09:02
Last modified on: 2013-03-22 16:09:02
Owner: Andrea Ambrosio (7332)
Last modified by: Andrea Ambrosio (7332)
Numerical id: 46
Author: Andrea Ambrosio (7332)
Entry type: Theorem
Classification: msc 60E15
Related topic: HoeffdingInequalityForBoundedIndependentRandomVariables