Empirical Risk Minimization (ERM) is a principle in statistical learning theory that defines a family of learning algorithms: choose the model that minimizes the average loss on the training set (the empirical risk), used as a proxy for the true risk over the underlying data distribution, which cannot be computed directly. ERM provides the theoretical basis for how most machine learning models are trained.
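The principle can be illustrated with a minimal sketch: fit a one-dimensional linear model by gradient descent on the mean squared training loss. The data, learning rate, and iteration count below are illustrative assumptions, not part of the original text.

```python
import numpy as np

# Illustrative synthetic data: y ≈ 3.0 * x + 0.5 plus small noise.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=100)
y = 3.0 * X + 0.5 + 0.1 * rng.standard_normal(100)

def empirical_risk(w, b, X, y):
    """Average squared loss on the training set (the empirical risk)."""
    return np.mean((w * X + b - y) ** 2)

# ERM: search for the parameters (w, b) that minimize the training error,
# here via plain gradient descent on the empirical risk.
w, b = 0.0, 0.0
lr = 0.1
for _ in range(500):
    residual = w * X + b - y
    w -= lr * np.mean(2 * residual * X)
    b -= lr * np.mean(2 * residual)

# The minimizer of the empirical risk should recover parameters close
# to the true values that generated the (noisy) training data.
print(w, b)
```

Because the empirical risk is only a proxy, the parameters found this way are guaranteed to fit the training sample well, not necessarily unseen data; that gap between empirical and true risk is what generalization theory studies.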