Bagging (Bootstrap Aggregating)

What is Bagging (Bootstrap Aggregating)?

Bagging is an ensemble meta-algorithm that improves stability and accuracy. It works by training multiple versions of the same model on different random subsets of the training data, drawn with replacement (bootstrapping), and then combining their predictions (aggregating) — averaging for regression, majority voting for classification — to reduce variance.
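The procedure above can be sketched in a few lines of NumPy. This is a minimal, hypothetical example (the data, model choice, and ensemble size are illustrative): each of 25 simple line fits is trained on a bootstrap resample of noisy data, and their predictions are averaged.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = 2x + noise (illustrative only)
X = rng.uniform(0, 1, 50)
y = 2 * X + rng.normal(0, 0.3, 50)

def bagged_predict(X, y, x_new, n_models=25):
    """Fit n_models line fits on bootstrap samples and average their predictions."""
    n = len(X)
    preds = []
    for _ in range(n_models):
        idx = rng.integers(0, n, n)              # bootstrap: sample n points with replacement
        slope, intercept = np.polyfit(X[idx], y[idx], 1)
        preds.append(slope * x_new + intercept)
    return np.mean(preds)                        # aggregate: average the ensemble's outputs

print(round(bagged_predict(X, y, 0.5), 2))       # close to the true value 2 * 0.5 = 1.0
```

Because each model sees a slightly different resample, their individual errors partially cancel when averaged, which is the variance-reduction effect bagging is designed for.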

Where did the term "Bagging (Bootstrap Aggregating)" come from?

The technique was introduced by Leo Breiman in his 1996 paper "Bagging Predictors," and it laid the groundwork for the Random Forest algorithm.

How is "Bagging (Bootstrap Aggregating)" used today?

It is ideally suited to high-variance models such as decision trees, preventing them from overfitting to noise in the training set.
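In practice this is often done with an off-the-shelf implementation such as scikit-learn's BaggingClassifier, whose default base estimator is a decision tree. The sketch below (dataset and parameters are illustrative, assuming scikit-learn is installed) compares a single deep tree against a bagged ensemble of 50 trees on noisy data:

```python
from sklearn.datasets import make_moons
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Noisy two-class data where a single deep tree tends to overfit
X, y = make_moons(n_samples=400, noise=0.3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# One fully-grown tree fits the training noise...
tree = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)

# ...while bagging 50 such trees averages out their variance
bag = BaggingClassifier(n_estimators=50, random_state=0).fit(X_tr, y_tr)

print(tree.score(X_te, y_te), bag.score(X_te, y_te))
```

On held-out data the bagged ensemble typically scores at least as well as the single tree, because the trees' individual overfitting errors are decorrelated by the bootstrap resampling and cancel under majority voting.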

Related Terms