Gradient Boosting

What is Gradient Boosting?

Gradient Boosting is a boosting technique where new models are trained to predict the residuals (errors) of prior models, effectively performing gradient descent in function space.
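The residual-fitting idea can be sketched in a few lines. The snippet below is a minimal illustration for squared-error loss (where the negative gradient is exactly the residual), using hand-rolled regression stumps as the weak learners; all function names and parameters here are illustrative, not from any particular library.

```python
import numpy as np

def fit_stump(x, residual):
    """Fit a one-split regression stump to the residuals."""
    best = None
    for t in np.unique(x):
        left, right = residual[x <= t], residual[x > t]
        if len(left) == 0 or len(right) == 0:
            continue  # skip splits that leave one side empty
        pred = np.where(x <= t, left.mean(), right.mean())
        err = ((residual - pred) ** 2).sum()
        if best is None or err < best[0]:
            best = (err, t, left.mean(), right.mean())
    _, t, left_val, right_val = best
    return lambda z: np.where(z <= t, left_val, right_val)

def gradient_boost(x, y, n_rounds=50, lr=0.1):
    """Sequentially fit stumps to the residuals of the current ensemble."""
    base = y.mean()                      # initial constant model
    pred = np.full(len(y), base)
    models = []
    for _ in range(n_rounds):
        residual = y - pred              # negative gradient of squared loss
        stump = fit_stump(x, residual)
        pred += lr * stump(x)            # shrunken step in function space
        models.append(stump)

    def predict(z):
        out = np.full(len(z), base)
        for m in models:
            out += lr * m(z)
        return out
    return predict
```

With squared loss the training error shrinks geometrically: each round removes roughly a `lr` fraction of the remaining residual that the stump can explain, which is the gradient-descent-in-function-space view in code.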

Where did the term "Gradient Boosting" come from?

The term comes from Jerome Friedman's paper "Greedy Function Approximation: A Gradient Boosting Machine" (published in the Annals of Statistics, 2001), which reframed boosting as gradient descent in function space: each new model is fit to the negative gradient of the loss function, hence "gradient" boosting.

How is "Gradient Boosting" used today?

Gradient boosting is the engine behind popular libraries such as XGBoost and LightGBM, and it remains the state of the art for structured/tabular data.

Related Terms