Precision (statistics)

In statistics, precision is the reciprocal of the variance, and the precision matrix is the matrix inverse of the covariance matrix. Some particular statistical models define the term precision differently.
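For concreteness, the following sketch in Python with NumPy (the simulated data and variable names are illustrative, not part of the article) estimates a covariance matrix from data and inverts it to obtain the precision matrix, alongside the scalar case:

    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated data: 500 observations of a 3-dimensional random vector
    # (the mean and covariance below are purely illustrative).
    X = rng.multivariate_normal(
        mean=[0.0, 0.0, 0.0],
        cov=[[2.0, 0.5, 0.0],
             [0.5, 1.0, 0.3],
             [0.0, 0.3, 1.5]],
        size=500,
    )

    # Sample covariance matrix (columns are variables).
    cov = np.cov(X, rowvar=False)

    # Precision matrix: the matrix inverse of the covariance matrix.
    precision = np.linalg.inv(cov)

    # Univariate case: precision is the reciprocal of the variance.
    tau = 1.0 / np.var(X[:, 0], ddof=1)

    print(tau)        # scalar precision of the first variable
    print(precision)  # multivariate precision matrix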

One particular use of the precision matrix is in the context of Bayesian analysis of the multivariate normal distribution: for example, Bernardo & Smith prefer to parameterise the multivariate normal distribution in terms of the precision matrix, rather than the covariance matrix, because of certain simplifications that then arise.
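As an illustration (this form is standard and not quoted from Bernardo & Smith), writing Λ = Σ⁻¹ for the precision matrix, the density of a k-dimensional multivariate normal distribution can be parameterised as

p(x \mid \mu, \Lambda) = (2\pi)^{-k/2} \, |\Lambda|^{1/2} \exp\!\left( -\tfrac{1}{2} (x-\mu)^{\mathsf{T}} \Lambda (x-\mu) \right),

so both the normalising constant and the exponent involve Λ directly, with no matrix inversion. In conjugate Bayesian updating of a normal mean with known variance, for example, the posterior precision is simply the sum of the prior precision and the precision contributed by the data, which is one of the simplifications alluded to.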

In general, statisticians prefer to work with the dual notion of variability rather than precision; variability expresses the amount of imprecision.

History

The term precision in this sense (“mensura praecisionis observationum”) first appeared in the work of Gauss (1809), “Theoria motus corporum coelestium in sectionibus conicis solem ambientium” (page 212). Gauss’s definition differs from the modern one by a factor of 2. He writes, for the density function of a normal random variable with precision h,

\varphi(\Delta) = \frac{h}{\sqrt{\pi}} \, e^{-h^{2} \Delta^{2}}.
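Equating this density with the modern parameterisation in terms of the standard deviation σ makes the relationship between Gauss's h and the modern precision explicit:

\frac{h}{\sqrt{\pi}} \, e^{-h^{2}\Delta^{2}} = \frac{1}{\sigma\sqrt{2\pi}} \, e^{-\Delta^{2}/(2\sigma^{2})} \quad\Longrightarrow\quad h^{2} = \frac{1}{2\sigma^{2}}, \qquad h = \frac{1}{\sigma\sqrt{2}},

so h² is one half of the modern precision 1/σ², which is the factor of 2 mentioned above.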

Later, Whittaker & Robinson (1924), in “The Calculus of Observations”, called this quantity the modulus, but the term has since dropped out of use.
