# Lehmann-Scheffé theorem

A statistic $S(\boldsymbol{X})$ on a random sample of data $\boldsymbol{X}=(X_{1},\ldots,X_{n})$ is said to be a complete statistic if for any Borel measurable function $g$,

 $E(g(S))=0\quad\mbox{implies}\quad P(g(S)=0)=1.$

In other words, $g(S)=0$ almost everywhere whenever the expected value of $g(S)$ is $0$. If $S(\boldsymbol{X})$ is associated with a family $f(x\mid\theta)$ of probability density functions (or mass functions in the discrete case), then completeness of $S$ means that $g(S)=0$ almost everywhere whenever $E_{\theta}(g(S))=0$ for every $\theta$.
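A standard worked example (not part of the original statement, but a textbook fact) is the completeness of the sum of i.i.d. Bernoulli observations:

```latex
% For $X_1,\dots,X_n$ i.i.d. Bernoulli$(\theta)$ with $\theta\in(0,1)$,
% the statistic $S=\sum_{i=1}^n X_i$ has a Binomial$(n,\theta)$ distribution, so
\begin{align*}
E_{\theta}(g(S)) &= \sum_{k=0}^{n} g(k)\binom{n}{k}\theta^{k}(1-\theta)^{n-k} \\
 &= (1-\theta)^{n}\sum_{k=0}^{n} g(k)\binom{n}{k}\left(\frac{\theta}{1-\theta}\right)^{k}.
\end{align*}
% If $E_{\theta}(g(S))=0$ for every $\theta\in(0,1)$, then this polynomial in
% $\theta/(1-\theta)$ vanishes identically, forcing $g(k)=0$ for $k=0,\dots,n$.
% Hence $P_{\theta}(g(S)=0)=1$ for every $\theta$, i.e. $S$ is complete.
```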

###### Theorem 1 (Lehmann-Scheffé).

If $S(\boldsymbol{X})$ is a complete sufficient statistic and $h(\boldsymbol{X})$ is an unbiased estimator for $\theta$, then, given

 $h_{0}(s)=E(h(\boldsymbol{X})|S(\boldsymbol{X})=s),$

$h_{0}(S)=h_{0}(S(\boldsymbol{X}))$ is a uniformly minimum variance unbiased estimator (UMVUE) of $\theta$. Furthermore, $h_{0}(S)$ is unique almost everywhere (with respect to $P_{\theta}$) for every $\theta$.
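The theorem can be illustrated with a small Monte Carlo sketch (the function and variable names below are my own, not from the source). For $X_1,\ldots,X_n$ i.i.d. Bernoulli$(\theta)$, the estimator $h(\boldsymbol{X})=X_1$ is unbiased for $\theta$, and conditioning it on the complete sufficient statistic $S=\sum_i X_i$ gives, by symmetry, $h_0(S)=E(X_1\mid S)=S/n$, the sample mean. The simulation checks that both estimators are unbiased while the Rao-Blackwellized one has much smaller variance:

```python
import random

def simulate(theta=0.3, n=10, reps=20000, seed=0):
    """Compare h(X) = X_1 with h_0(S) = S/n for Bernoulli(theta) samples.

    Returns (mean of h, mean of h_0, variance of h, variance of h_0)
    over `reps` simulated samples of size n.
    """
    rng = random.Random(seed)
    raw, rb = [], []
    for _ in range(reps):
        x = [1 if rng.random() < theta else 0 for _ in range(n)]
        raw.append(x[0])       # h(X) = X_1, unbiased but crude
        rb.append(sum(x) / n)  # h_0(S) = E(X_1 | S) = S/n, the sample mean
    def mean(v):
        return sum(v) / len(v)
    def var(v):
        m = mean(v)
        return sum((t - m) ** 2 for t in v) / len(v)
    return mean(raw), mean(rb), var(raw), var(rb)
```

Both sample means should sit near $\theta=0.3$, while the variance of $S/n$ is roughly $\theta(1-\theta)/n$, an $n$-fold reduction from the variance $\theta(1-\theta)$ of $X_1$ alone.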

Title: Lehmann-Scheffé theorem (LehmannScheffeTheorem) · Author: CWoo (3771) · Date: 2013-03-22 · Type: Theorem · MSC: 62F10 · Synonym: Lehmann-Scheffe theorem · Related: complete statistic