# Lehmann-Scheffé theorem

A statistic $S(\bm{X})$ on a random sample of data $\bm{X}=(X_{1},\dots,X_{n})$ is said to be a *complete statistic* if, for any Borel measurable function $g$,

$$E(g(S))=0\quad\text{implies}\quad P(g(S)=0)=1.$$

In other words, $g(S)=0$ almost everywhere whenever the expected value of $g(S)$ is $0$. If $S(\bm{X})$ is associated with a family $f(x\mid\theta)$ of probability density functions (or mass functions in the discrete case), then completeness of $S$ means that $g(S)=0$ almost everywhere whenever $E_{\theta}(g(S))=0$ for every $\theta$.
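As a standard illustration (not part of the original entry), take $X_{1},\dots,X_{n}$ i.i.d. Bernoulli$(\theta)$ with $\theta\in(0,1)$, and let $S=\sum_{i}X_{i}$, so that $S$ is Binomial$(n,\theta)$. Then

$$E_{\theta}(g(S))=\sum_{s=0}^{n}g(s)\binom{n}{s}\theta^{s}(1-\theta)^{n-s},$$

which is a polynomial in $\theta$. If this polynomial vanishes for every $\theta\in(0,1)$, all of its coefficients must vanish, forcing $g(s)=0$ for $s=0,\dots,n$. Hence $S$ is complete.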

###### Theorem 1 (Lehmann-Scheffé).

If $S(\bm{X})$ is a complete sufficient statistic and $h(\bm{X})$ is an unbiased estimator for $\theta$, then, given

$$h_{0}(s)=E(h(\bm{X})\mid S(\bm{X})=s),$$

$h_{0}(S)=h_{0}(S(\bm{X}))$ is a uniformly minimum variance unbiased estimator (UMVUE) of $\theta$. Furthermore, $h_{0}(S)$ is unique almost everywhere for every $\theta$.
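To make the construction concrete, here is a small simulation sketch (not from the original entry; the Bernoulli model, sample size, and seed are illustrative assumptions). For $X_{i}$ i.i.d. Bernoulli$(p)$, $S=\sum_{i}X_{i}$ is complete sufficient and $h(\bm{X})=X_{1}$ is unbiased for $p$; by symmetry, $h_{0}(s)=E(X_{1}\mid S=s)=s/n$, i.e. the sample mean, which the theorem identifies as the UMVUE.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, reps = 0.3, 20, 100_000  # illustrative choices

# reps independent samples, each of size n, from Bernoulli(p)
X = rng.binomial(1, p, size=(reps, n))

# Crude unbiased estimator: h(X) = X_1
h = X[:, 0].astype(float)

# Conditioning on the complete sufficient statistic S = sum(X_i)
# gives h_0(S) = E(X_1 | S) = S/n, the sample mean (the UMVUE)
h0 = X.sum(axis=1) / n

print(h.mean(), h0.mean())  # both close to p: unbiasedness is preserved
print(h.var(), h0.var())    # h0 has much smaller variance than h
```

Both estimators are unbiased, but conditioning on $S$ can only reduce variance (Rao-Blackwell), and completeness guarantees the result is the unique UMVUE.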

| Title | Lehmann-Scheffé theorem |
|---|---|
| Canonical name | LehmannScheffeTheorem |
| Date of creation | 2013-03-22 16:31:59 |
| Last modified on | 2013-03-22 16:31:59 |
| Owner | CWoo (3771) |
| Last modified by | CWoo (3771) |
| Numerical id | 13 |
| Author | CWoo (3771) |
| Entry type | Theorem |
| Classification | msc 62F10 |
| Synonym | Lehmann-Scheffe theorem |
| Defines | complete statistic |