Regularization

What is Regularization?

Regularization adds a penalty term to a model's loss function to reduce overfitting by discouraging overly complex models. The most common forms are L1 regularization (lasso), which penalizes the sum of the absolute values of the weights, and L2 regularization (ridge), which penalizes the sum of their squares.
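
Concretely, the penalty is just an extra term added to the training loss. Below is a minimal NumPy sketch (the synthetic data and the penalty weight lam are assumed purely for illustration) showing how L1 and L2 penalties augment a mean-squared-error loss:

    import numpy as np

    # Toy linear model: predictions are X @ w
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))                  # synthetic inputs (illustrative)
    y = X @ np.array([2.0, 0.0, -1.0, 0.0, 0.5])   # synthetic targets
    w = rng.normal(size=5)                         # current model weights
    lam = 0.1                                      # regularization strength (hyperparameter)

    mse = np.mean((X @ w - y) ** 2)        # data-fit term (unregularized loss)
    l2_penalty = lam * np.sum(w ** 2)      # L2 (ridge): discourages large weights
    l1_penalty = lam * np.sum(np.abs(w))   # L1 (lasso): pushes weights toward zero

    loss_with_l2 = mse + l2_penalty        # total loss minimized under L2 regularization
    loss_with_l1 = mse + l1_penalty        # total loss minimized under L1 regularization

Larger values of lam penalize complexity more heavily, trading some training accuracy for better generalization.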

Where did the term "Regularization" come from?

The term originates in the mathematics of ill-posed inverse problems, where Tikhonov regularization (introduced by Andrey Tikhonov in the 1960s) adds a penalty term to stabilize otherwise unstable solutions. Machine learning adopted both the term and the technique as a key way to improve model generalization.

How is "Regularization" used today?

Regularization appears in nearly every modern machine learning workflow: L2 weight decay when training neural networks, L1 penalties for sparse models such as the lasso, and related techniques like dropout and early stopping that also constrain model complexity.
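
As one illustration, the sketch below (assuming scikit-learn is installed; the alpha values are arbitrary) compares unregularized least squares with its L2- and L1-regularized counterparts, Ridge and Lasso, on a synthetic problem with many irrelevant features:

    from sklearn.datasets import make_regression
    from sklearn.linear_model import LinearRegression, Ridge, Lasso

    # Synthetic problem with far more features than informative signals,
    # a setting in which plain least squares tends to overfit.
    X, y = make_regression(n_samples=80, n_features=40, n_informative=5,
                           noise=5.0, random_state=0)

    ols = LinearRegression().fit(X, y)    # no penalty
    ridge = Ridge(alpha=1.0).fit(X, y)    # L2 penalty of strength alpha
    lasso = Lasso(alpha=1.0).fit(X, y)    # L1 penalty; drives some coefficients to exactly 0

    print("non-zero lasso coefficients:", (lasso.coef_ != 0).sum())

In deep learning the same idea usually shows up as the weight_decay argument of an optimizer (for example in PyTorch optimizers), which applies an L2-style penalty during training.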

Related Terms

Overfitting, L1 Regularization (Lasso), L2 Regularization (Ridge), Loss Function, Generalization