AdaBoost

What is AdaBoost?

Short for Adaptive Boosting, AdaBoost is an ensemble learning algorithm that combines multiple "weak classifiers" (typically simple decision stumps) into a single strong classifier. It works iteratively: after each round, it increases the weights of misclassified data points, forcing the next classifier to focus on the hard-to-classify examples. The final prediction is a weighted majority vote, i.e. the sign of the weighted sum of the weak classifiers' outputs, where each classifier's weight reflects its accuracy. While powerful, AdaBoost is more sensitive to noisy data and outliers than newer methods such as Gradient Boosting.
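The loop described above can be sketched from scratch in a few lines. The following is a minimal illustration, not a production implementation: it uses 1-D threshold stumps as the weak classifiers, searches them exhaustively, and applies the standard weight-update and classifier-weight formulas (the function names `adaboost_fit` and `adaboost_predict` are our own).

```python
import numpy as np

def adaboost_fit(X, y, n_rounds=10):
    """Fit AdaBoost with 1-D threshold stumps. Labels y must be in {-1, +1}."""
    n = len(X)
    w = np.full(n, 1.0 / n)            # start with uniform sample weights
    stumps = []                        # list of (threshold, polarity, alpha)
    for _ in range(n_rounds):
        # Find the stump (threshold, polarity) with the lowest weighted error.
        best = None
        for thr in np.unique(X):
            for pol in (1, -1):
                pred = np.where(X >= thr, pol, -pol)
                err = w[pred != y].sum()
                if best is None or err < best[2]:
                    best = (thr, pol, err)
        thr, pol, err = best
        err = max(err, 1e-10)          # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)   # this classifier's vote weight
        pred = np.where(X >= thr, pol, -pol)
        # Increase weights of misclassified points, decrease the rest.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        stumps.append((thr, pol, alpha))
    return stumps

def adaboost_predict(stumps, X):
    """Final prediction: sign of the alpha-weighted sum of stump votes."""
    score = np.zeros(len(X))
    for thr, pol, alpha in stumps:
        score += alpha * np.where(X >= thr, pol, -pol)
    return np.sign(score)

# A pattern no single stump can fit (+1, then -1, then +1 again):
X = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
y = np.array([1, 1, 1, -1, -1, -1, 1, 1])
stumps = adaboost_fit(X, y, n_rounds=10)
pred = adaboost_predict(stumps, X)
```

Each stump alone misclassifies at least two of these points, but the reweighting drives later rounds to correct them, so the weighted vote fits the full pattern.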

Where did the term "AdaBoost" come from?

The algorithm was introduced by Yoav Freund and Robert Schapire in 1996; their work on boosting earned them the Gödel Prize in 2003.

How is "AdaBoost" used today?

AdaBoost gained massive popularity through the Viola-Jones face detection framework (2001), which used boosted classifiers to detect faces in real time, proving the algorithm was practical well beyond academic benchmarks. It remains a standard baseline ensemble method and is available in mainstream libraries such as scikit-learn.

Related Terms