Observed information

In statistics, the observed information, or observed Fisher information, is the negative of the second derivative (the Hessian matrix) of the log-likelihood (the logarithm of the likelihood function). It is a sample-based version of the Fisher information.

Definition

Suppose we observe random variables $X_1, \ldots, X_n$, independent and identically distributed with density $f(X; \theta)$, where $\theta$ is a (possibly unknown) vector. Then the log-likelihood of the parameters $\theta$ given the data $X_1, \ldots, X_n$ is

$$\ell(\theta \mid X_1, \ldots, X_n) = \sum_{i=1}^{n} \log f(X_i \mid \theta).$$

We define the observed information matrix at $\theta^{*}$ as

$$J(\theta^{*}) = - \left. \frac{\partial^{2}}{\partial \theta \, \partial \theta^{\mathsf T}} \, \ell(\theta) \right|_{\theta = \theta^{*}}.$$

In many instances, the observed information is evaluated at the maximum-likelihood estimate $\hat{\theta}$, giving $J(\hat{\theta})$.
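
A minimal numerical sketch of this definition, assuming a one-parameter Poisson model with rate $\lambda$ (the model choice, function names, and the finite-difference step are illustrative assumptions, not part of the original article):

```python
import numpy as np

def log_likelihood(lam, x):
    # Poisson log-likelihood up to an additive constant; the log(x!) term
    # does not depend on lam and is dropped.
    return np.sum(x * np.log(lam) - lam)

def observed_information(lam, x, eps=1e-4):
    # Observed information J(lam): minus the second derivative of the
    # log-likelihood, approximated by a central finite difference.
    second_deriv = (log_likelihood(lam + eps, x)
                    - 2.0 * log_likelihood(lam, x)
                    + log_likelihood(lam - eps, x)) / eps**2
    return -second_deriv

rng = np.random.default_rng(0)
x = rng.poisson(lam=3.0, size=200)

lam_hat = x.mean()                        # maximum-likelihood estimate of the rate
J_hat = observed_information(lam_hat, x)  # observed information at the MLE
print(J_hat, len(x) / lam_hat)            # analytically, J(lam_hat) = n / lam_hat
```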

Fisher information

The Fisher information $\mathcal{I}(\theta)$ is the expected value of the observed information given a single observation $X$ distributed according to the hypothetical model with parameter $\theta$:

$$\mathcal{I}(\theta) = \mathrm{E}\left( J(\theta) \right).$$
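
As a minimal worked check of this relation, assume a single observation $X$ from a Poisson model with rate $\lambda$ (an example chosen here for illustration). Then

$$\ell(\lambda \mid X) = X \log \lambda - \lambda - \log X!, \qquad J(\lambda) = -\frac{\partial^{2} \ell}{\partial \lambda^{2}} = \frac{X}{\lambda^{2}},$$

and taking the expectation under the model, with $\mathrm{E}(X) = \lambda$, gives

$$\mathcal{I}(\lambda) = \mathrm{E}\left( J(\lambda) \right) = \frac{\mathrm{E}(X)}{\lambda^{2}} = \frac{1}{\lambda},$$

which is the usual Fisher information of the Poisson model.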

Applications

In a notable article, Bradley Efron and David V. Hinkley argued that the observed information should be used in preference to the expected information when employing normal approximations for the distribution of maximum-likelihood estimates.
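
A brief sketch of the kind of normal approximation involved, again with the illustrative Poisson example used above (in the Poisson case the observed and expected information happen to coincide at the MLE, so this shows only the mechanics, not the Efron and Hinkley comparison): the variance of $\hat{\theta}$ is approximated by $1 / J(\hat{\theta})$.

```python
import numpy as np

# Wald-type normal approximation for the MLE using the observed information:
# Var(lam_hat) is approximated by 1 / J(lam_hat).  The Poisson model and all
# variable names are illustrative assumptions.
x = np.random.default_rng(1).poisson(lam=3.0, size=200)
lam_hat = x.mean()                 # maximum-likelihood estimate of the rate
J_hat = np.sum(x) / lam_hat**2     # analytic observed information for the Poisson model
se = 1.0 / np.sqrt(J_hat)          # standard error from the observed information
ci = (lam_hat - 1.96 * se, lam_hat + 1.96 * se)
print(lam_hat, ci)                 # approximate 95% confidence interval
```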
