Suvarna Garge (Editor)

Point estimation


In statistics, point estimation involves the use of sample data to calculate a single value (known as a statistic) which is to serve as a "best guess" or "best estimate" of an unknown (fixed or random) population parameter.

More formally, it is the application of a point estimator to the data.

In general, point estimation should be contrasted with interval estimation: such interval estimates are typically either confidence intervals in the case of frequentist inference, or credible intervals in the case of Bayesian inference.
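As a concrete contrast, the sample mean is a point estimate of a population mean, while a confidence interval around it is the corresponding interval estimate. A minimal Python sketch, using hypothetical measurement data and a normal-approximation interval:

```python
import statistics

# Hypothetical sample: repeated measurements of an unknown quantity.
sample = [4.9, 5.1, 5.0, 4.8, 5.2, 5.0, 4.9, 5.1]

# Point estimate: a single "best guess" for the population mean.
point_estimate = statistics.mean(sample)

# Interval estimate: a range of plausible values (normal-approximation
# 95% confidence interval, based on the standard error of the mean).
se = statistics.stdev(sample) / len(sample) ** 0.5
interval = (point_estimate - 1.96 * se, point_estimate + 1.96 * se)

print(point_estimate)  # a single number
print(interval)        # a (lower, upper) pair
```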

Point estimators

There are a variety of point estimators, each with different properties.

  • minimum-variance mean-unbiased estimator (MVUE), which minimizes the risk (expected loss) of the squared-error loss function
  • best linear unbiased estimator (BLUE)
  • minimum mean squared error (MMSE)
  • median-unbiased estimator, which minimizes the risk of the absolute-error loss function
  • maximum likelihood (ML)
  • method of moments, generalized method of moments
  • Bayesian point-estimation
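For instance, for a normal model with unknown mean and variance, both maximum likelihood and the method of moments can be worked out by hand, and they yield the same estimates. A sketch with hypothetical data:

```python
# Hypothetical data assumed drawn from a normal distribution with
# unknown mean mu and variance sigma^2.
data = [2.1, 1.9, 2.4, 2.0, 1.8, 2.3, 2.2, 1.9]
n = len(data)

# Maximum likelihood: for the normal model, maximizing the likelihood
# gives the sample mean and the (1/n) variance.
mu_ml = sum(data) / n
var_ml = sum((x - mu_ml) ** 2 for x in data) / n

# Method of moments: equate population moments to sample moments.
# First moment:  E[X] = mu              ->  mu_hat = m1
# Second moment: E[X^2] = mu^2 + sigma^2 ->  var_hat = m2 - m1^2
m1 = sum(data) / n
m2 = sum(x ** 2 for x in data) / n
mu_mm, var_mm = m1, m2 - m1 ** 2

# For the normal model the two methods coincide.
print(mu_ml, var_ml)
print(mu_mm, var_mm)
```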

    Bayesian inference is typically based on the posterior distribution. Many Bayesian point-estimators are the posterior distribution's statistics of central tendency, e.g., its mean, median, or mode:

  • Posterior mean, which minimizes the (posterior) risk (expected loss) for a squared-error loss function; in Bayesian estimation, the risk is defined in terms of the posterior distribution, as observed by Gauss.
  • Posterior median, which minimizes the posterior risk for the absolute-value loss function, as observed by Laplace.
  • maximum a posteriori (MAP), which finds a maximum (mode) of the posterior distribution; for a uniform prior probability, the MAP estimator coincides with the maximum-likelihood estimator.

    The MAP estimator has good asymptotic properties, even for many difficult problems on which the maximum-likelihood estimator has difficulties. For regular problems, where the maximum-likelihood estimator is consistent, the maximum-likelihood estimator ultimately agrees with the MAP estimator. Bayesian estimators are admissible, by Wald's theorem.
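These three Bayesian point estimators can be illustrated with the conjugate Beta-Bernoulli model. A sketch with assumed prior hyperparameters and data (the median formula is a standard closed-form approximation):

```python
# Conjugate Beta-Bernoulli sketch (hypothetical numbers): with a
# Beta(a, b) prior on a success probability p and k successes in
# n trials, the posterior is Beta(a + k, b + n - k).
a, b = 2.0, 2.0   # assumed prior hyperparameters
k, n = 7, 10      # assumed observed data

alpha, beta = a + k, b + (n - k)   # posterior Beta(9, 5)

# Posterior mean: minimizes posterior expected squared-error loss.
post_mean = alpha / (alpha + beta)

# Posterior median: minimizes posterior expected absolute-error loss
# (closed-form approximation, valid for alpha, beta > 1).
post_median = (alpha - 1 / 3) / (alpha + beta - 2 / 3)

# MAP: mode of the posterior density (for alpha, beta > 1); with a
# uniform Beta(1, 1) prior this reduces to the ML estimate k / n.
post_map = (alpha - 1) / (alpha + beta - 2)

print(post_mean, post_median, post_map)
```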

    The Minimum Message Length (MML) point estimator is based in Bayesian information theory and is not so directly related to the posterior distribution.

    Special cases of Bayesian filters are important:

  • Kalman filter
  • Wiener filter
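A one-dimensional Kalman filter fits in a few lines: the filtered estimate is a Bayesian point estimate (posterior mean) updated recursively. All numbers below are hypothetical:

```python
# Minimal scalar Kalman filter sketch: track a roughly constant
# state from noisy measurements.
def kalman_1d(measurements, meas_var, process_var, x0=0.0, p0=1.0):
    x, p = x0, p0                  # state estimate and its variance
    estimates = []
    for z in measurements:
        p = p + process_var        # predict: uncertainty grows
        gain = p / (p + meas_var)  # Kalman gain
        x = x + gain * (z - x)     # update toward the measurement
        p = (1 - gain) * p         # uncertainty shrinks
        estimates.append(x)
    return estimates

# Noisy readings of a quantity whose true value is about 1.0.
est = kalman_1d([1.1, 0.9, 1.05, 0.95, 1.0],
                meas_var=0.1, process_var=0.01)
print(est[-1])  # final point estimate, close to 1.0
```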
    Several methods of computational statistics have close connections with Bayesian analysis:

  • particle filter
  • Markov chain Monte Carlo (MCMC)
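As an illustration, a random-walk Metropolis sampler (a simple MCMC method) can approximate a posterior-mean point estimate. The model here, a normal likelihood with known variance 1 and a flat prior, is an assumption made for the sketch:

```python
import math
import random

random.seed(0)
data = [1.2, 0.8, 1.1, 0.9, 1.0, 1.3, 0.7]  # hypothetical data

def log_post(mu):
    # Log-posterior up to a constant: flat prior, normal likelihood.
    return -0.5 * sum((x - mu) ** 2 for x in data)

samples, mu = [], 0.0
for _ in range(20000):
    prop = mu + random.gauss(0.0, 0.5)  # random-walk proposal
    # Accept with probability min(1, posterior ratio).
    if math.log(random.random()) < log_post(prop) - log_post(mu):
        mu = prop
    samples.append(mu)

# Discard burn-in; the mean of the remaining draws approximates the
# posterior mean (here the true posterior mean is the data average).
post_mean = sum(samples[2000:]) / len(samples[2000:])
print(post_mean)
```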

Properties of point estimates

  • bias of an estimator
  • Cramér–Rao bound
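Both properties can be illustrated by simulation (all parameters hypothetical): the 1/n variance estimator is biased downward, the 1/(n-1) sample variance is unbiased, and the sample mean's variance matches the Cramér–Rao bound sigma^2/n for a normal mean:

```python
import random

random.seed(1)
true_var, n, reps = 4.0, 5, 100000  # sigma^2 = 4, sample size 5

biased, unbiased, means = [], [], []
for _ in range(reps):
    x = [random.gauss(0.0, 2.0) for _ in range(n)]
    m = sum(x) / n
    ss = sum((v - m) ** 2 for v in x)
    biased.append(ss / n)          # ML estimator, E = (n-1)/n * sigma^2
    unbiased.append(ss / (n - 1))  # sample variance, E = sigma^2
    means.append(m)

avg_biased = sum(biased) / reps      # approx 3.2 = 4 * (5-1)/5
avg_unbiased = sum(unbiased) / reps  # approx 4.0
# The true mean is 0, so E[m^2] is the variance of the sample mean;
# it should approximate the Cramér–Rao bound sigma^2/n = 0.8.
var_of_mean = sum(m * m for m in means) / reps
```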