Bagging (Bootstrap Aggregating)

What is Bagging (Bootstrap Aggregating)?

Bagging is an ensemble technique that trains multiple models in parallel, each on a bootstrap sample of the training data (drawn at random with replacement), and combines their predictions, averaging for regression or taking a majority vote for classification, to reduce variance.
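
To make the idea concrete, here is a minimal sketch of bagging for regression. It assumes NumPy and scikit-learn are available; the toy dataset, the number of models, and the choice of a decision tree as the base learner are illustrative assumptions, not part of the definition.

```python
# Minimal bagging sketch (assumes NumPy and scikit-learn; toy data is illustrative).
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Toy data: a noisy sine wave (illustrative assumption).
X = np.sort(rng.uniform(0, 6, size=(200, 1)), axis=0)
y = np.sin(X).ravel() + rng.normal(scale=0.3, size=200)

n_models = 25
models = []
for _ in range(n_models):
    # Bootstrap: sample n rows with replacement.
    idx = rng.integers(0, len(X), size=len(X))
    tree = DecisionTreeRegressor()  # high-variance base learner
    tree.fit(X[idx], y[idx])
    models.append(tree)

# Aggregate: average the predictions of all bootstrapped trees.
X_test = np.linspace(0, 6, 100).reshape(-1, 1)
y_pred = np.mean([m.predict(X_test) for m in models], axis=0)
```

Each tree sees a slightly different resampled dataset, so its individual errors differ; averaging the ensemble smooths those fluctuations, which is the variance-reduction effect bagging is designed for.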

Where did the term "Bagging (Bootstrap Aggregating)" come from?

The term was coined by Leo Breiman in his 1996 paper "Bagging Predictors" as a contraction of "bootstrap aggregating." The technique later became the foundation of Random Forests.

How is "Bagging (Bootstrap Aggregating)" used today?

Bagging is widely used to stabilize high-variance models such as Decision Trees, most prominently as the resampling core of Random Forests, and is available out of the box in standard machine-learning libraries, as sketched below.
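
Below is a hedged usage sketch with scikit-learn's BaggingClassifier wrapping a Decision Tree; the synthetic dataset and hyperparameters are assumptions for illustration, and the `estimator` keyword assumes scikit-learn 1.2 or later.

```python
# Usage sketch: bagging a high-variance decision tree (assumes scikit-learn >= 1.2).
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic classification data (illustrative assumption).
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

single_tree = DecisionTreeClassifier(random_state=0)
bagged_trees = BaggingClassifier(
    estimator=DecisionTreeClassifier(random_state=0),
    n_estimators=100,
    random_state=0,
)

# Averaging over bootstrapped trees typically raises the cross-validated
# accuracy relative to a single, fully grown tree.
print("single tree :", cross_val_score(single_tree, X, y, cv=5).mean())
print("bagged trees:", cross_val_score(bagged_trees, X, y, cv=5).mean())
```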

Related Terms