Shrinkage (statistics)


In statistics, shrinkage has two meanings:

  • In relation to the general observation that, in regression analysis, a fitted relationship appears to perform less well on a new data set than on the data set used for fitting. In particular, the value of the coefficient of determination 'shrinks'. This idea is complementary to overfitting and, separately, to the standard adjustment made to the coefficient of determination to compensate for the hypothetical effects of further sampling, such as controlling for the potential of new explanatory terms improving the model by chance: that is, the adjustment formula itself provides "shrinkage". But the adjustment formula yields an artificial shrinkage, in contrast to the genuine shrinkage observed when the fitted relationship is evaluated on new data (see the first sketch after this list).
  • To describe general types of estimators, or the effects of some types of estimation, whereby a naive or raw estimate is improved by combining it with other information (see shrinkage estimator). The term relates to the notion that the improved estimate lies closer to the value supplied by the 'other information' than the raw estimate does. In this sense, shrinkage is used to regularize ill-posed inference problems (a sketch of a simple shrinkage estimator follows this list).
A common idea underlying both of these meanings is the reduction in the effects of sampling variation.
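
As a simple illustration of the first meaning, the following sketch (using numpy; the data, sample size, and number of predictors are illustrative assumptions, not taken from the article) fits an ordinary least squares regression with many irrelevant predictors and compares the R-squared on the training data, the R-squared on a fresh data set, and the adjusted R-squared given by the standard correction formula:

import numpy as np

rng = np.random.default_rng(0)

def r_squared(y, y_hat):
    # Coefficient of determination: 1 - residual sum of squares / total sum of squares.
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - ss_res / ss_tot

n, p = 50, 10                                   # few observations, many predictors
X_train = rng.normal(size=(n, p))
X_new = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[0] = 1.0                              # only the first predictor actually matters
y_train = X_train @ beta_true + rng.normal(scale=1.0, size=n)
y_new = X_new @ beta_true + rng.normal(scale=1.0, size=n)

# Ordinary least squares fit (with intercept) on the training data only.
A_train = np.column_stack([np.ones(n), X_train])
A_new = np.column_stack([np.ones(n), X_new])
coef, *_ = np.linalg.lstsq(A_train, y_train, rcond=None)

r2_train = r_squared(y_train, A_train @ coef)   # optimistic, computed on the fitting data
r2_new = r_squared(y_new, A_new @ coef)         # typically noticeably lower: it 'shrinks'
r2_adj = 1 - (1 - r2_train) * (n - 1) / (n - p - 1)   # standard adjusted R^2 correction

print(f"training R^2: {r2_train:.3f}")
print(f"new-data R^2: {r2_new:.3f}")
print(f"adjusted R^2: {r2_adj:.3f}")

The training R-squared is optimistic; evaluating the same fitted coefficients on new data typically gives a noticeably lower value, while the adjusted R-squared applies an analytic correction in the same direction without needing new data.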
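
For the second meaning, the sketch below (again illustrative: the group sizes, noise levels, and the empirical-Bayes-style weight are assumptions, not a prescribed method) shrinks noisy per-group sample means toward the grand mean and compares the mean squared error of the raw and shrunken estimates:

import numpy as np

rng = np.random.default_rng(1)

true_means = rng.normal(loc=10.0, scale=2.0, size=8)            # unknown group means
samples = [rng.normal(mu, 5.0, size=5) for mu in true_means]    # 5 noisy observations per group

raw = np.array([s.mean() for s in samples])     # naive per-group estimates
grand_mean = raw.mean()                         # the 'other information' shared across groups

# Shrinkage weight: put more weight on the grand mean when within-group noise is
# large relative to the between-group spread (an empirical-Bayes-style choice).
within_var = np.mean([s.var(ddof=1) / len(s) for s in samples])
between_var = max(raw.var(ddof=1) - within_var, 1e-12)
w = between_var / (between_var + within_var)    # between 0 and 1

shrunk = grand_mean + w * (raw - grand_mean)    # each raw estimate pulled toward grand_mean

print("raw MSE:   ", np.mean((raw - true_means) ** 2))
print("shrunk MSE:", np.mean((shrunk - true_means) ** 2))       # usually smaller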


