
Hannan–Quinn information criterion

In statistics, the Hannan–Quinn information criterion (HQC) is a criterion for model selection. It is an alternative to the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). It is given as

HQC = −2 L_max + 2 k ln(ln(n)),

where L_max is the maximized value of the log-likelihood function, k is the number of parameters, and n is the number of observations.
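As an illustration of the formula, here is a minimal Python sketch that computes the criterion from a model's maximized log-likelihood; the helper name hqc and the numerical values are hypothetical, chosen only for demonstration.

```python
import math

def hqc(log_likelihood_max: float, k: int, n: int) -> float:
    """Hannan-Quinn information criterion: -2*L_max + 2*k*ln(ln(n))."""
    return -2.0 * log_likelihood_max + 2.0 * k * math.log(math.log(n))

# Illustrative values (not from any real data set): a model with
# maximized log-likelihood -120.5, 3 parameters, and 50 observations.
print(hqc(-120.5, k=3, n=50))  # as with AIC/BIC, a lower value is preferred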

Burnham & Anderson (2002, p. 287) say that HQC, "while often cited, seems to have seen little use in practice". They also note that HQC, like BIC, but unlike AIC, is not an estimator of Kullback–Leibler divergence. Claeskens & Hjort (2008, ch. 4) note that HQC, like BIC, but unlike AIC, is not asymptotically efficient, and further point out that whatever method is being used for fine-tuning the criterion will be more important in practice than the term ln ln n, since this latter number is small even for very large n.
