LogitBoost

In machine learning and computational learning theory, LogitBoost is a boosting algorithm formulated by Jerome Friedman, Trevor Hastie, and Robert Tibshirani. The original paper casts the AdaBoost algorithm into a statistical framework. Specifically, if one considers AdaBoost as a generalized additive model and then applies the cost functional of logistic regression, one can derive the LogitBoost algorithm.
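
To make the change of cost functional concrete, the short NumPy sketch below (the margin values are arbitrary illustrative inputs, not from the paper) compares AdaBoost's exponential loss with the logistic loss minimized by LogitBoost: for large negative margins the exponential loss blows up, while the logistic loss grows only roughly linearly, which is commonly cited as making LogitBoost less sensitive to badly misclassified points.

import numpy as np

# Margins y_i * f(x_i): positive means correctly classified, negative means misclassified.
margins = np.array([-4.0, -1.0, 0.0, 1.0, 4.0])

# AdaBoost's exponential loss grows exponentially as the margin becomes more negative,
# while the logistic loss used by LogitBoost grows only about linearly there.
exponential_loss = np.exp(-margins)
logistic_loss = np.log1p(np.exp(-margins))

for m, e, l in zip(margins, exponential_loss, logistic_loss):
    print(f"margin {m:+.1f}: exponential {e:8.3f}, logistic {l:8.3f}")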

Minimizing the LogitBoost cost function

LogitBoost can be seen as a convex optimization problem. Specifically, given that we seek an additive model of the form

f = \sum_t \alpha_t h_t

the LogitBoost algorithm minimizes the logistic loss:

\sum_i \log\left(1 + e^{-y_i f(x_i)}\right)
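
A minimal sketch of the resulting procedure is given below, written in this parametrization (p(x) = 1 / (1 + e^{-f(x)}), labels y_i in {-1, +1}); the original paper uses a symmetric parametrization that only changes constant factors. Each boosting round takes a Newton step on the loss above by fitting a weighted least-squares regression of the working responses. The scikit-learn regression stumps used as the weak learners h_t, the clipping constants, and the synthetic data are illustrative assumptions, and the coefficients \alpha_t are absorbed into the fitted stumps.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_logitboost(X, y, n_rounds=50, z_max=4.0):
    """Greedily build f = sum_t h_t by Newton steps on the logistic loss."""
    y01 = (y + 1) / 2.0               # map labels {-1, +1} -> {0, 1}
    f = np.zeros(len(y))              # current additive model on the training set
    learners = []
    for _ in range(n_rounds):
        p = 1.0 / (1.0 + np.exp(-f))                 # current P(y = 1 | x)
        w = np.clip(p * (1.0 - p), 1e-6, None)       # Newton weights (second derivative)
        z = np.clip((y01 - p) / w, -z_max, z_max)    # working responses, clipped for stability
        stump = DecisionTreeRegressor(max_depth=1)   # weak learner: a regression stump
        stump.fit(X, z, sample_weight=w)             # weighted least-squares fit of z on X
        f += stump.predict(X)
        learners.append(stump)
    return learners

def predict_logitboost(learners, X):
    f = sum(stump.predict(X) for stump in learners)
    return np.where(f >= 0.0, 1, -1)

# Toy usage on a synthetic two-dimensional problem.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0.0, 1, -1)
model = fit_logitboost(X, y)
print("training accuracy:", np.mean(predict_logitboost(model, X) == y))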
