# proof of Bennett inequality

By the Chernoff-Cramér inequality (http://planetmath.org/ChernoffCramerBound), we have:

 $\Pr\left\{\sum_{i=1}^{n}\left(X_{i}-E[X_{i}]\right)>\varepsilon\right\}\leq\exp\left[-\sup_{t\geq 0}\left(t\varepsilon-\psi(t)\right)\right]$

where

 $\psi(t)=\sum_{i=1}^{n}\left(\ln E\left[e^{tX_{i}}\right]-tE\left[X_{i}\right]\right).$

Keeping in mind that the condition

 $\Pr\left\{\left|X_{i}\right|\leq M\right\}=1\quad\forall i$

implies that, for all $i$,

 $E\left[\left|X_{i}\right|^{k}\right]\leq M^{k}\quad\forall k\geq 0$

(see here (http://planetmath.org/RelationBetweenAlmostSurelyAbsolutelyBoundedRandomVariablesAndTheirAbsoluteMoments) for a proof), that $\ln x\leq x-1$ for all $x>0$, and that

 $E\left[\left|X\right|^{k}\right]\leq M^{k}\ \Longrightarrow\ E\left[\left|X\right|^{k}\right]\leq E\left[X^{2}\right]M^{k-2}\quad\forall k\geq 2,\ k\in\mathbb{N}$

(see here (http://planetmath.org/AbsoluteMomentsBoundingNecessaryAndSufficientCondition) for a proof), one has:

 $\begin{aligned}
\psi(t)&=\sum_{i=1}^{n}\left(\ln E\left[e^{tX_{i}}\right]-tE\left[X_{i}\right]\right)\\
&\leq\sum_{i=1}^{n}\left(E\left[e^{tX_{i}}\right]-tE\left[X_{i}\right]-1\right)\\
&=\sum_{i=1}^{n}\left(E\left[\sum_{k=0}^{\infty}\frac{\left(tX_{i}\right)^{k}}{k!}\right]-tE\left[X_{i}\right]-1\right)\\
&=\sum_{i=1}^{n}\left(\sum_{k=0}^{\infty}\frac{t^{k}E\left[X_{i}^{k}\right]}{k!}-tE\left[X_{i}\right]-1\right)\\
&=\sum_{i=1}^{n}\sum_{k=2}^{\infty}\frac{t^{k}E\left[X_{i}^{k}\right]}{k!}\\
&\leq\sum_{i=1}^{n}\sum_{k=2}^{\infty}\frac{t^{k}E\left[\left|X_{i}\right|^{k}\right]}{k!}\\
&\leq\sum_{i=1}^{n}\sum_{k=2}^{\infty}\frac{t^{k}E\left[X_{i}^{2}\right]M^{k-2}}{k!}\\
&=\sum_{k=2}^{\infty}\frac{t^{k}M^{k-2}\sum_{i=1}^{n}E\left[X_{i}^{2}\right]}{k!}\\
&=\frac{v^{2}}{M^{2}}\sum_{k=2}^{\infty}\frac{\left(tM\right)^{k}}{k!}\\
&=\frac{v^{2}}{M^{2}}\left[\exp\left(tM\right)-tM-1\right]
\end{aligned}$

where $v^{2}=\sum_{i=1}^{n}E\left[X_{i}^{2}\right]$.
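As a numerical sanity check (not part of the original proof), the bound $\psi(t)\leq\frac{v^{2}}{M^{2}}\left(e^{tM}-tM-1\right)$ can be verified for a simple bounded distribution. Here the $X_{i}$ are assumed to be i.i.d. symmetric $\pm M$ coin flips, so that $E\left[e^{tX_{i}}\right]=\cosh(tM)$, $E[X_{i}]=0$, $E[X_{i}^{2}]=M^{2}$, and hence $\psi(t)=n\ln\cosh(tM)$ with $v^{2}=nM^{2}$:

```python
import math

# Assumed example: X_i i.i.d. symmetric +/-M coin flips.
# Then psi(t) = n * ln(cosh(t*M)) and v^2 = sum_i E[X_i^2] = n * M^2.
n, M = 10, 2.0
v2 = n * M * M

def psi(t):
    # Exact log-moment expression for this distribution.
    return n * math.log(math.cosh(t * M))

def upper_bound(t):
    # The bound derived above: (v^2 / M^2) * (e^{tM} - tM - 1).
    return (v2 / M**2) * (math.exp(t * M) - t * M - 1)

# The inequality should hold for every t >= 0.
for t in [0.01, 0.1, 0.5, 1.0, 2.0]:
    assert psi(t) <= upper_bound(t) + 1e-12
```

The same check passes for any bounded distribution whose moment generating function is available in closed form; the coin-flip case was chosen only because $\psi(t)$ is exact and elementary.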

One can now write

 $\sup_{t\geq 0}\left(t\varepsilon-\psi(t)\right)\geq\sup_{t\geq 0}\left(t\varepsilon-\frac{v^{2}}{M^{2}}\left(e^{tM}-tM-1\right)\right)=\sup_{t>0}\left[\frac{v^{2}}{M^{2}}\left(\frac{M^{2}\varepsilon}{v^{2}}t-\left(e^{tM}-tM-1\right)\right)\right].$

By elementary calculus, we obtain the value of $t$ that maximizes the expression in round brackets:

 $t_{opt}=\frac{1}{M}\ln\left(1+\frac{M\varepsilon}{v^{2}}\right)$

which, once plugged into the bound, yields

 $\Pr\left\{\sum_{i=1}^{n}\left(X_{i}-E[X_{i}]\right)>\varepsilon\right\}\leq\exp\left[-\frac{v^{2}}{M^{2}}\left(\left(1+\frac{M\varepsilon}{v^{2}}\right)\ln\left(1+\frac{M\varepsilon}{v^{2}}\right)-\frac{M\varepsilon}{v^{2}}\right)\right].$
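For completeness, the elementary-calculus step and the back-substitution can be written out explicitly. With the shorthand $u=\frac{M\varepsilon}{v^{2}}$ (not used in the original text), differentiating in $t$ and setting the derivative to zero gives:

```latex
\begin{aligned}
\frac{d}{dt}\left(t\varepsilon-\frac{v^{2}}{M^{2}}\left(e^{tM}-tM-1\right)\right)
  &=\varepsilon-\frac{v^{2}}{M}\left(e^{tM}-1\right)=0
  \;\Longrightarrow\; e^{t_{opt}M}=1+u,
  \quad t_{opt}=\frac{1}{M}\ln\left(1+u\right),\\
t_{opt}\varepsilon-\frac{v^{2}}{M^{2}}\left(e^{t_{opt}M}-t_{opt}M-1\right)
  &=\frac{v^{2}}{M^{2}}\Bigl(u\ln\left(1+u\right)-\bigl(u-\ln\left(1+u\right)\bigr)\Bigr)
  =\frac{v^{2}}{M^{2}}\bigl(\left(1+u\right)\ln\left(1+u\right)-u\bigr).
\end{aligned}
```

The second derivative is $-v^{2}Me^{tM}<0$, so $t_{opt}$ is indeed a maximum, and the last expression is exactly the exponent appearing in the bound above.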

Observing that $\left(1+x\right)\ln\left(1+x\right)-x\geq\frac{x}{2}\ln\left(1+x\right)$ $\forall x\geq 0$ (see here (http://planetmath.org/ASimpleMethodForComparingRealFunctions)), one gets the sub-optimal yet more easily manageable formula:

 $\Pr\left\{\sum_{i=1}^{n}\left(X_{i}-E[X_{i}]\right)>\varepsilon\right\}\leq\exp\left[-\frac{\varepsilon}{2M}\ln\left(1+\frac{\varepsilon M}{v^{2}}\right)\right].$
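The claimed inequality $\left(1+x\right)\ln\left(1+x\right)-x\geq\frac{x}{2}\ln\left(1+x\right)$ for $x\geq 0$, which guarantees that this last bound is valid (though weaker than the optimal one), can be spot-checked numerically; this sketch is illustrative only, not a substitute for the linked proof:

```python
import math

def bennett_exponent(x):
    # Optimal exponent factor: (1+x) ln(1+x) - x, with x = M*eps/v^2.
    return (1 + x) * math.log(1 + x) - x

def simplified_exponent(x):
    # Simplified exponent factor: (x/2) ln(1+x).
    return 0.5 * x * math.log(1 + x)

# The optimal exponent dominates the simplified one for all x >= 0,
# so exp(-optimal) <= exp(-simplified): the final bound is weaker but valid.
for x in [0.0, 0.1, 1.0, 10.0, 100.0]:
    assert bennett_exponent(x) >= simplified_exponent(x) - 1e-12
```

Since the exponent only shrinks, the probability bound only grows: the simplified formula trades tightness for a closed form that is easier to invert for $\varepsilon$.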
Title: proof of Bennett inequality (ProofOfBennettInequality). Author: Andrea Ambrosio (7332). Date: 2013-03-22 16:12:27. Type: Proof. Classification: msc 60E15.