Variance (Error)

What is Variance (Error)?

In machine learning, variance refers to the model's sensitivity to small fluctuations or noise in the training set. A model with high variance pays too much attention to the specific training data (overfitting), failing to generalize well to new, unseen data.
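Variance in this sense can be measured empirically: train the same kind of model on many noisy resamples of the data and look at how much its prediction at a fixed point jumps around. The sketch below (illustrative code, not from the original) compares a low-variance model (a least-squares line) against a high-variance one (1-nearest-neighbor, which memorizes the training points) on data drawn from y = 2x + noise.

```python
import random
import statistics

random.seed(0)

def make_train(n=20):
    """Draw a fresh noisy training set from y = 2x + Gaussian noise."""
    xs = [random.uniform(0, 1) for _ in range(n)]
    ys = [2 * x + random.gauss(0, 0.3) for x in xs]
    return xs, ys

def fit_linear(xs, ys):
    """Ordinary least-squares line: smooth, low-variance model."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    return lambda x: intercept + slope * x

def fit_1nn(xs, ys):
    """1-nearest-neighbor: memorizes training points, high-variance model."""
    return lambda x: min(zip(xs, ys), key=lambda p: abs(p[0] - x))[1]

# Refit each model on 200 independent training sets and record
# its prediction at the same test point x0.
x0 = 0.5
lin_preds, nn_preds = [], []
for _ in range(200):
    xs, ys = make_train()
    lin_preds.append(fit_linear(xs, ys)(x0))
    nn_preds.append(fit_1nn(xs, ys)(x0))

var_lin = statistics.pvariance(lin_preds)
var_nn = statistics.pvariance(nn_preds)
# 1-NN tracks the noise of whichever point lands nearest x0,
# so its predictions spread far more across training sets.
print(var_nn > var_lin)
```

The spread of the 1-NN predictions across resamples is roughly the noise variance itself, while the least-squares line averages the noise away over all training points; that gap is exactly what "high variance" means.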

Where did the term "Variance (Error)" come from?

The term comes from the statistical bias-variance decomposition, in which a model's expected prediction error splits into three components: squared bias, variance, and irreducible error.
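Concretely, for a model \(\hat{f}\) trained on a random draw of the training set \(D\), with true function \(f\) and noise variance \(\sigma^2\), the expected squared error at a point \(x\) decomposes as:

```latex
\mathbb{E}_D\!\left[(y - \hat{f}(x))^2\right]
  = \underbrace{\left(\mathbb{E}_D[\hat{f}(x)] - f(x)\right)^2}_{\text{Bias}^2}
  + \underbrace{\mathbb{E}_D\!\left[\left(\hat{f}(x) - \mathbb{E}_D[\hat{f}(x)]\right)^2\right]}_{\text{Variance}}
  + \underbrace{\sigma^2}_{\text{Irreducible error}}
```

The variance term is exactly the spread of the model's predictions across different training sets; the irreducible term is the noise floor no model can remove.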

How is "Variance (Error)" used today?

Managing variance is a primary goal of regularization techniques such as Dropout and weight decay, which constrain a model's effective capacity so it cannot fit the noise in any one training set.
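As a minimal sketch of how weight decay tames variance, the toy gradient-descent loop below (illustrative code, not from the original) fits a one-parameter linear model twice: once plainly and once with an L2 penalty added to the gradient. The decayed fit ends up with a smaller weight, i.e. a more conservative model that reacts less to noise in the data.

```python
import random

random.seed(1)

# Toy data: y = 3x + Gaussian noise.
xs = [random.uniform(-1, 1) for _ in range(50)]
ys = [3 * x + random.gauss(0, 0.5) for x in xs]

def fit(weight_decay, lr=0.1, steps=500):
    """Gradient descent on MSE; weight_decay adds the L2 penalty term."""
    w = 0.0
    for _ in range(steps):
        # Gradient of mean squared error with respect to w.
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        # Weight decay: shrink w toward zero on every step.
        w -= lr * (grad + weight_decay * w)
    return w

w_plain = fit(weight_decay=0.0)   # converges near the least-squares slope
w_decay = fit(weight_decay=1.0)   # penalty pulls the weight toward zero
print(abs(w_decay) < abs(w_plain))
```

The same mechanism appears in deep-learning optimizers as a `weight_decay` hyperparameter; Dropout achieves a related effect stochastically, by zeroing random units during training so no single weight can specialize to noise.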

Related Terms