Berndt–Hall–Hall–Hausman algorithm

The Berndt–Hall–Hall–Hausman (BHHH) algorithm is a numerical optimization algorithm similar to the Gauss–Newton algorithm. It is named after its four originators: Ernst R. Berndt, Bronwyn Hall, Robert Hall, and Jerry Hausman.

Usage

If a nonlinear model is fitted to the data, one often needs to estimate its coefficients through optimization. A number of optimization algorithms share the following general structure. Suppose that the function to be optimized is Q(β). The algorithms are iterative, defining a sequence of approximations β_k given by

\beta_{k+1} = \beta_k - \lambda_k A_k \frac{\partial Q}{\partial \beta}(\beta_k),

where β_k is the parameter estimate at step k, and λ_k is a scalar step size which partly determines the particular algorithm. For the BHHH algorithm, λ_k is determined by calculations within a given iterative step, involving a line search until a point β_{k+1} is found satisfying certain acceptance criteria. In addition, for the BHHH algorithm, Q has the form

Q = \sum_{i=1}^{N} Q_i

and A_k is calculated using

A_k = \left[ \sum_{i=1}^{N} \frac{\partial \ln Q_i}{\partial \beta}(\beta_k) \, \frac{\partial \ln Q_i}{\partial \beta}(\beta_k)' \right]^{-1}.

In other cases, e.g. Newton–Raphson, A_k can have other forms. The BHHH algorithm has the advantage that, if certain conditions apply, convergence of the iterative procedure is guaranteed.
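As a rough illustration, the sketch below (Python with NumPy, not part of the original article) applies a BHHH-style update to the log-likelihood of a simple logit model: the per-observation score vectors play the role of ∂ln Q_i/∂β, A_k is the inverse of the summed outer products of those scores, and the step size λ_k is chosen by a crude backtracking line search. Because the log-likelihood is maximized here, the step is taken in the +A_k·g direction; minimizing −Q instead gives the minus-sign form stated above. The model, data, and helper names are illustrative assumptions.

import numpy as np

def loglik(beta, X, y):
    # total log-likelihood of a logit model: the quantity being maximized
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    eps = 1e-12
    return np.sum(y * np.log(p + eps) + (1.0 - y) * np.log(1.0 - p + eps))

def scores(beta, X, y):
    # per-observation score vectors, one row per observation; these play the
    # role of d(ln Q_i)/d(beta) in the formula for A_k above
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    return (y - p)[:, None] * X              # shape (N, K)

def bhhh(X, y, beta0, max_iter=100, tol=1e-8):
    beta = np.asarray(beta0, dtype=float)
    for _ in range(max_iter):
        S = scores(beta, X, y)               # N x K score matrix
        g = S.sum(axis=0)                    # total gradient of the log-likelihood
        A = np.linalg.inv(S.T @ S)           # BHHH matrix: inverse of summed outer products
        direction = A @ g                    # search direction A_k * (total score)
        lam, f0 = 1.0, loglik(beta, X, y)
        # crude backtracking line search for the step size lambda_k
        while loglik(beta + lam * direction, X, y) <= f0 and lam > 1e-6:
            lam *= 0.5
        beta = beta + lam * direction
        if np.linalg.norm(lam * direction) < tol:
            break
    return beta

# toy usage on simulated data (hypothetical coefficients, illustrative only)
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(500), rng.normal(size=500)])
y = (rng.random(500) < 1.0 / (1.0 + np.exp(-(X @ np.array([-0.5, 1.0]))))).astype(float)
print(bhhh(X, y, beta0=np.zeros(2)))         # estimates should land near [-0.5, 1.0]

One appeal of this construction is that only first derivatives of the per-observation objective are needed; the outer-product matrix stands in for the Hessian that Newton–Raphson would require.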
