In statistics, a uniformly minimum-variance unbiased estimator or minimum-variance unbiased estimator (UMVUE or MVUE) is an unbiased estimator that has lower variance than any other unbiased estimator for all possible values of the parameter.
For practical statistics problems, it is important to determine the UMVUE if one exists, since less-than-optimal procedures would naturally be avoided, other things being equal. This has led to substantial development of statistical theory related to the problem of optimal estimation. While the particular specification of "optimal" here — requiring unbiasedness and measuring "goodness" using the variance — may not always be what is wanted for any given practical situation, it is one where useful and generally applicable results can be found.
Definition
Consider estimation of $g(\theta)$ based on data $X_1, X_2, \ldots, X_n$ independent and identically distributed from some member of a family of densities $p_\theta$, $\theta \in \Omega$, where $\Omega$ is the parameter space. An unbiased estimator $\delta(X_1, X_2, \ldots, X_n)$ of $g(\theta)$ is UMVUE if, for all $\theta \in \Omega$,

$$\operatorname{var}\bigl(\delta(X_1, X_2, \ldots, X_n)\bigr) \le \operatorname{var}\bigl(\tilde\delta(X_1, X_2, \ldots, X_n)\bigr)$$

for any other unbiased estimator $\tilde\delta$.
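As an informal illustration (a Python sketch added here, not part of the formal development), for i.i.d. $N(\mu, 1)$ data both the sample mean and the sample median are unbiased for $\mu$, yet the sample mean has smaller variance at every value of $\mu$; this is the sense in which one unbiased estimator can dominate another uniformly.

```python
# Illustrative sketch (not from the article): for i.i.d. N(mu, 1) data, both the
# sample mean and the sample median are unbiased for mu, but the sample mean has
# smaller variance for every value of mu.
import numpy as np

rng = np.random.default_rng(0)
n, reps = 25, 100_000
for mu in (-2.0, 0.0, 3.0):              # the comparison holds at each value of mu
    x = rng.normal(mu, 1.0, size=(reps, n))
    means = x.mean(axis=1)
    medians = np.median(x, axis=1)
    # var(mean) = 1/n = 0.04; var(median) is roughly pi/(2n) ≈ 0.063
    print(mu, means.var(), medians.var())
```

The sample mean here is a function of the complete sufficient statistic $\sum_i X_i$, which anticipates the Lehmann–Scheffé result discussed below.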
If an unbiased estimator of $g(\theta)$ exists, then one can prove there is an essentially unique MVUE. Using the Rao–Blackwell theorem one can also prove that determining the MVUE is simply a matter of finding a complete sufficient statistic for the family $p_\theta$, $\theta \in \Omega$, and conditioning any unbiased estimator on it.
Further, by the Lehmann–Scheffé theorem, an unbiased estimator that is a function of a complete, sufficient statistic is the UMVUE.
Put formally, suppose $\delta(X_1, X_2, \ldots, X_n)$ is unbiased for $g(\theta)$, and that $T$ is a complete sufficient statistic for the family of densities. Then

$$\eta(X_1, X_2, \ldots, X_n) = \operatorname{E}\bigl(\delta(X_1, X_2, \ldots, X_n) \mid T\bigr)$$

is the MVUE for $g(\theta)$.
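As a concrete illustration of the conditioning step (an example sketched here, not drawn from the text above), take $X_1, \ldots, X_n$ i.i.d. Bernoulli($p$): the single observation $\delta = X_1$ is unbiased for $p$, the sum $T = \sum_i X_i$ is complete sufficient, and $\operatorname{E}(\delta \mid T) = T/n$ is the sample mean, whose variance is smaller for every $p$.

```python
# Illustrative sketch: Rao–Blackwellizing delta = X_1 for Bernoulli(p) data.
# E(X_1 | sum(X_i) = t) = t/n, so the conditioned estimator is the sample mean.
import numpy as np

rng = np.random.default_rng(1)
n, reps, p = 20, 200_000, 0.3
x = rng.binomial(1, p, size=(reps, n))
delta = x[:, 0]                   # crude unbiased estimator: first observation only
eta = x.mean(axis=1)              # E(delta | T), with T = x.sum(axis=1)
print(delta.mean(), eta.mean())   # both close to p = 0.3 (unbiased)
print(delta.var(), eta.var())     # p(1-p) = 0.21 versus p(1-p)/n = 0.0105
```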
A Bayesian analog is a Bayes estimator, particularly with minimum mean square error (MMSE).
Estimator selection
An efficient estimator need not exist, but if it does and if it is unbiased, it is the MVUE. Since the mean squared error (MSE) of an estimator $\delta$ is

$$\operatorname{MSE}(\delta) = \operatorname{var}(\delta) + [\operatorname{bias}(\delta)]^2,$$
the MVUE minimizes MSE among unbiased estimators. In some cases biased estimators have lower MSE because they have a smaller variance than does any unbiased estimator; see estimator bias.
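This trade-off can be seen numerically. In the illustrative sketch below (an example chosen here, not part of the discussion above), the unbiased sample variance of normal data, which divides by $n-1$, is compared with the maximum-likelihood version dividing by $n$; the latter is biased downward but attains a smaller MSE.

```python
# Illustrative sketch: a biased estimator of a normal variance can beat the
# unbiased one in mean squared error.
import numpy as np

rng = np.random.default_rng(2)
n, reps, sigma2 = 10, 200_000, 4.0
x = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
s2_unbiased = x.var(axis=1, ddof=1)   # divides by n - 1, unbiased
s2_mle = x.var(axis=1, ddof=0)        # divides by n, biased downward
print(((s2_unbiased - sigma2) ** 2).mean())  # about 2*sigma2^2/(n-1) ≈ 3.56
print(((s2_mle - sigma2) ** 2).mean())       # about (2n-1)*sigma2^2/n^2 ≈ 3.04
```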
Example
Consider the data to be a single observation from an absolutely continuous distribution on $\mathbb{R}$ with density

$$p_\theta(x) = \frac{\theta e^{-x}}{(1 + e^{-x})^{\theta + 1}}, \qquad \theta > 0,$$

and we wish to find the UMVU estimator of

$$g(\theta) = \frac{1}{\theta^2}.$$

First we recognize that the density can be written as

$$\frac{e^{-x}}{1 + e^{-x}} \exp\!\bigl(-\theta \log(1 + e^{-x}) + \log\theta\bigr),$$

which is an exponential family with sufficient statistic $T = \log(1 + e^{-X})$. In fact this is a full-rank exponential family, and therefore $T$ is complete sufficient. Moreover, $T$ has an exponential distribution with rate $\theta$, so

$$\operatorname{E}(T) = \frac{1}{\theta}, \qquad \operatorname{E}(T^2) = \frac{2}{\theta^2},$$

and therefore

$$\operatorname{E}\!\left(\frac{T^2}{2}\right) = \frac{1}{\theta^2} = g(\theta).$$

Clearly $\delta(X) = \tfrac{T^2}{2}$ is unbiased for $g(\theta)$ and $T$ is complete sufficient, so the UMVU estimator is

$$\eta(X) = \frac{T^2}{2} = \frac{\bigl(\log(1 + e^{-X})\bigr)^2}{2}.$$
This example illustrates that an unbiased function of the complete sufficient statistic will be UMVU, as the Lehmann–Scheffé theorem states.
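A small Monte Carlo check (an informal sketch, not part of the original example) is consistent with this: since $T = \log(1 + e^{-X})$ is exponentially distributed with rate $\theta$, one can draw $T$, invert the transformation to obtain $X$, and verify that the average of $T^2/2$ settles near $1/\theta^2$.

```python
# Monte Carlo sanity check of the example: with T = log(1 + exp(-X)), the
# estimator T**2 / 2 should average to 1/theta**2.
import numpy as np

rng = np.random.default_rng(3)
theta, reps = 2.0, 500_000
t = rng.exponential(scale=1.0 / theta, size=reps)  # T is Exponential with rate theta
x = -np.log(np.expm1(t))                           # invert T = log(1 + exp(-X)) to get draws of X
t_back = np.log1p(np.exp(-x))                      # recompute the sufficient statistic from X
print((t_back ** 2 / 2).mean(), 1.0 / theta ** 2)  # both close to 0.25
```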