Bagging (bootstrap aggregating) is an ensemble technique that trains multiple models in parallel, each on a bootstrap sample of the training data (drawn with replacement), and combines their predictions, averaging for regression or majority-voting for classification, to reduce variance.
The foundation of Random Forests, which additionally restrict each split to a random subset of features.
Especially effective for stabilizing high-variance, low-bias models such as Decision Trees; a minimal sketch follows below.
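As an illustration, here is a minimal bagging sketch in Python, assuming scikit-learn's `DecisionTreeClassifier` as the base learner, NumPy arrays as inputs, and non-negative integer class labels; the function name `bagging_predict` and its parameters are illustrative, not taken from the text above.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def bagging_predict(X_train, y_train, X_test, n_estimators=25, seed=0):
    """Fit n_estimators trees on bootstrap samples and majority-vote their predictions.

    Assumes X_train/X_test are NumPy arrays and y_train holds non-negative
    integer class labels.
    """
    rng = np.random.default_rng(seed)
    n = len(X_train)
    per_tree_preds = []
    for _ in range(n_estimators):
        # Bootstrap sample: draw n row indices with replacement
        idx = rng.integers(0, n, size=n)
        tree = DecisionTreeClassifier(random_state=0)
        tree.fit(X_train[idx], y_train[idx])
        per_tree_preds.append(tree.predict(X_test))
    preds = np.stack(per_tree_preds)  # shape: (n_estimators, n_test)
    # Aggregate: majority vote per test point (use the mean instead for regression)
    return np.apply_along_axis(lambda votes: np.bincount(votes).argmax(), 0, preds)
```

With scikit-learn installed, the same idea is available out of the box as `sklearn.ensemble.BaggingClassifier` / `BaggingRegressor`, which default to a decision tree base estimator.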