Empirical Risk Minimization (ERM)

What is Empirical Risk Minimization (ERM)?

Empirical Risk Minimization is a principle that defines a family of learning algorithms: choose the model that minimizes the average loss on the training set (the empirical risk), using that training error as a proxy for the true risk, i.e. the expected error on unseen data drawn from the same distribution.
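A minimal sketch of the idea for a 1-D linear model, assuming squared loss and gradient descent as the minimizer (the names empirical_risk and train_erm, and the toy data, are illustrative, not from the source):

```python
# Toy training sample: roughly y = 2x plus noise.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.1, 2.1, 3.9, 6.2, 7.9]

def empirical_risk(w, xs, ys):
    # R_hat(w) = (1/n) * sum_i (w * x_i - y_i)^2 -- the average loss
    # over the training sample, standing in for the unknown true risk.
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def train_erm(xs, ys, lr=0.01, steps=500):
    # ERM picks the w that minimizes the empirical risk; here we
    # approximate that minimizer with plain gradient descent.
    w = 0.0
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        w -= lr * grad
    return w

w_hat = train_erm(xs, ys)
```

The learned w_hat lands near 2, the slope that makes the training error smallest; nothing in the procedure looks at unseen data, which is exactly why the training error is only a proxy for the true error.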

Where did the term "Empirical Risk Minimization (ERM)" come from?

The term comes from statistical learning theory, where it was formalized by Vladimir Vapnik as part of the analysis of when minimizing training error generalizes to unseen data.

How is "Empirical Risk Minimization (ERM)" used today?

ERM is the theoretical basis for how most machine learning models are trained: fitting a model by minimizing a loss function over a training set, for example with stochastic gradient descent, is an instance of (or an approximation to) empirical risk minimization.
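As a sketch of how everyday training loops instantiate ERM, the following fits the same kind of 1-D linear model with mini-batch stochastic gradient descent: each step descends the risk estimated on a random mini-batch rather than the full training set (all names and the synthetic data are illustrative assumptions, not from the source):

```python
import random

random.seed(0)
# Synthetic training set: roughly y = 3x plus noise.
data = [(x, 3.0 * x + random.gauss(0, 0.1)) for x in [i / 10 for i in range(50)]]

def batch_risk(w, batch):
    # Empirical risk estimated on a mini-batch (squared loss).
    return sum((w * x - y) ** 2 for x, y in batch) / len(batch)

w = 0.0
lr = 0.05
for step in range(2000):
    batch = random.sample(data, 8)
    # Gradient of the mini-batch risk w.r.t. w: an unbiased estimate
    # of the gradient of the full empirical risk.
    grad = sum(2 * (w * x - y) * x for x, y in batch) / len(batch)
    w -= lr * grad
```

Because each mini-batch gradient is an unbiased estimate of the full-training-set gradient, the loop converges to (a neighborhood of) the empirical risk minimizer, here a slope near 3.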

Related Terms