Lehmann–Scheffé theorem

In statistics, the Lehmann–Scheffé theorem is a prominent result tying together the ideas of completeness, sufficiency, uniqueness, and best unbiased estimation. The theorem states that any estimator that is unbiased for a given unknown quantity and depends on the data only through a complete, sufficient statistic is the unique best unbiased estimator of that quantity. It is named after Erich Leo Lehmann and Henry Scheffé, who established it in two papers published in 1950 and 1955.


If $T$ is a complete sufficient statistic for $\theta$ and $\operatorname{E}[g(T)] = \tau(\theta)$, then $g(T)$ is the uniformly minimum-variance unbiased estimator (UMVUE) of $\tau(\theta)$.

Statement

Let $\vec{X} = X_1, X_2, \ldots, X_n$ be a random sample from a distribution that has p.d.f. (or p.m.f. in the discrete case) $f(x : \theta)$, where $\theta \in \Omega$ is a parameter in the parameter space. Suppose $Y = u(\vec{X})$ is a sufficient statistic for $\theta$, and let $\{ f_Y(y : \theta) : \theta \in \Omega \}$ be a complete family. If $\varphi$ satisfies $\operatorname{E}[\varphi(Y)] = \theta$, then $\varphi(Y)$ is the unique MVUE of $\theta$.
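
For a concrete illustration of how the statement is applied (a standard textbook instance, not part of the original statement): let $X_1, \ldots, X_n$ be i.i.d. $\mathrm{Bernoulli}(\theta)$. Then $Y = \sum_{i=1}^n X_i$ is a complete sufficient statistic for $\theta$ (as a full-rank exponential family), and

\operatorname{E}\!\left[ \frac{Y}{n} \right] = \theta,

so taking $\varphi(Y) = Y/n = \bar{X}$, the theorem gives that the sample mean is the unique MVUE of $\theta$.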

Proof

By the Rao–Blackwell theorem, if $Z$ is an unbiased estimator of $\theta$, then $\varphi(Y) := \operatorname{E}[Z \mid Y]$ defines an unbiased estimator of $\theta$ with the property that its variance is not greater than that of $Z$.

Now we show that this function is unique. Suppose $W$ is another candidate MVUE of $\theta$. Then again $\psi(Y) := \operatorname{E}[W \mid Y]$ defines an unbiased estimator of $\theta$ with variance not greater than that of $W$. Since both $\varphi(Y)$ and $\psi(Y)$ are unbiased for $\theta$,

\operatorname{E}[\varphi(Y) - \psi(Y)] = 0, \quad \theta \in \Omega.

Since $\{ f_Y(y : \theta) : \theta \in \Omega \}$ is a complete family,

\operatorname{E}[\varphi(Y) - \psi(Y)] = 0 \implies \varphi(y) - \psi(y) = 0 \text{ (almost everywhere)}, \quad \theta \in \Omega,

and therefore $\varphi = \psi$. It follows that $\varphi$ is the unique function of $Y$ that is unbiased for $\theta$, and its variance is not greater than that of any other unbiased estimator. We conclude that $\varphi(Y)$ is the MVUE.

Example with a minimal sufficient statistic that is not complete

An example of an improvable Rao–Blackwell improvement, when using a minimal sufficient statistic that is not complete, was provided by Galili and Meilijson in 2016. Let $X_1, \ldots, X_n$ be a random sample from a scale-uniform distribution $X \sim U((1-k)\theta, (1+k)\theta)$ with unknown mean $\operatorname{E}[X] = \theta$ and known design parameter $k \in (0, 1)$. In the search for the "best" possible unbiased estimator of $\theta$, it is natural to consider $X_1$ as an initial (crude) unbiased estimator for $\theta$ and then try to improve it. Since $X_1$ is not a function of $T = (X_{(1)}, X_{(n)})$, the minimal sufficient statistic for $\theta$ (where $X_{(1)} = \min_i X_i$ and $X_{(n)} = \max_i X_i$), it may be improved using the Rao–Blackwell theorem as follows:

\hat{\theta}_{RB} = \operatorname{E}_\theta[X_1 \mid X_{(1)}, X_{(n)}] = \frac{X_{(1)} + X_{(n)}}{2}.
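
This conditional expectation can be checked directly (a routine step, filled in here for completeness): by exchangeability, $X_1$ is equally likely to be any of the $n$ order statistics, and for a uniform parent the middle observations are conditionally i.i.d. uniform on $(X_{(1)}, X_{(n)})$, so

\operatorname{E}[X_1 \mid X_{(1)}, X_{(n)}] = \frac{1}{n} X_{(1)} + \frac{1}{n} X_{(n)} + \frac{n-2}{n} \cdot \frac{X_{(1)} + X_{(n)}}{2} = \frac{X_{(1)} + X_{(n)}}{2}.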

However, the following unbiased estimator can be shown to have lower variance:

\hat{\theta}_{LV} = \frac{1}{k^2 \frac{n-1}{n+1} + 1} \cdot \frac{(1-k) X_{(1)} + (1+k) X_{(n)}}{2}.

In fact, it can be improved even further by using the following estimator:

\hat{\theta}_{\text{BAYES}} = \frac{n+1}{n} \left[ 1 - \frac{\dfrac{X_{(1)}(1+k)}{X_{(n)}(1-k)} - 1}{\left( \dfrac{X_{(1)}(1+k)}{X_{(n)}(1-k)} \right)^{n+1} - 1} \right] \frac{X_{(n)}}{1+k}.
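
The variance ordering can be checked numerically. The following is a minimal Monte Carlo sketch in Python (assuming NumPy is available; the parameter values $\theta = 1$, $k = 0.5$, $n = 10$ and all variable names are illustrative choices, not from the original example):

import numpy as np

rng = np.random.default_rng(0)

def estimators(x, k):
    # Compute the three unbiased estimators from one sample x.
    n = x.size
    x1, xn = x.min(), x.max()  # X_(1) and X_(n)
    rb = (x1 + xn) / 2  # Rao-Blackwell midrange estimator
    lv = ((1 - k) * x1 + (1 + k) * xn) / 2 / (k**2 * (n - 1) / (n + 1) + 1)
    r = (x1 * (1 + k)) / (xn * (1 - k))  # ratio appearing in the Bayes-type estimator
    bayes = (n + 1) / n * (1 - (r - 1) / (r ** (n + 1) - 1)) * xn / (1 + k)
    return rb, lv, bayes

theta, k, n, reps = 1.0, 0.5, 10, 100_000
samples = rng.uniform((1 - k) * theta, (1 + k) * theta, size=(reps, n))
results = np.array([estimators(x, k) for x in samples])
for name, col in zip(["RB", "LV", "BAYES"], results.T):
    print(f"{name:5s} mean ~ {col.mean():.4f}, variance ~ {col.var():.6f}")

All three empirical means should sit near $\theta$, with the empirical variances decreasing from $\hat{\theta}_{RB}$ to $\hat{\theta}_{LV}$ to $\hat{\theta}_{\text{BAYES}}$.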
