Gradient Boosting is a boosting technique where new models are trained to predict the residuals (errors) of prior models, effectively performing gradient descent in function space.
It is the engine behind XGBoost and LightGBM, and is widely regarded as state of the art for structured/tabular data.
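To make the residual-fitting loop concrete, here is a minimal sketch of gradient boosting with regression trees, assuming squared-error loss (so the negative gradient is exactly the residual y - F(x)). The function names `gradient_boost_fit` and `gradient_boost_predict` are illustrative, not from any library; scikit-learn's `DecisionTreeRegressor` is used only as a convenient base learner.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gradient_boost_fit(X, y, n_estimators=100, learning_rate=0.1, max_depth=3):
    """Fit a gradient-boosted tree ensemble for squared-error loss.

    With squared-error loss the negative gradient equals the residual
    y - F(x), so each new tree is trained to predict the current residuals.
    """
    # Initial model: the constant that minimizes squared error (the mean of y).
    f0 = float(np.mean(y))
    predictions = np.full(len(y), f0)
    trees = []
    for _ in range(n_estimators):
        residuals = y - predictions                      # negative gradient of squared error
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, residuals)                           # new model predicts prior models' errors
        predictions += learning_rate * tree.predict(X)   # gradient step in function space
        trees.append(tree)
    return f0, trees

def gradient_boost_predict(f0, trees, X, learning_rate=0.1):
    """Sum the initial constant and the scaled contribution of every tree."""
    pred = np.full(X.shape[0], f0)
    for tree in trees:
        pred += learning_rate * tree.predict(X)
    return pred

if __name__ == "__main__":
    # Toy usage: fit a noisy sine curve and report training error.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(500, 1))
    y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=500)
    f0, trees = gradient_boost_fit(X, y)
    mse = np.mean((gradient_boost_predict(f0, trees, X) - y) ** 2)
    print("train MSE:", mse)
```

The learning rate shrinks each tree's contribution, which is what makes the additive updates behave like small gradient-descent steps rather than one greedy fit; libraries such as XGBoost and LightGBM add further refinements (second-order gradients, regularization, histogram-based splitting) on top of this same loop.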