Variance refers to a model's sensitivity to small fluctuations in the training set. High variance can cause an algorithm to model the random noise in the training data rather than the intended outputs (overfitting).
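A minimal sketch of this effect, assuming NumPy and a synthetic sine-wave dataset (the function names `true_fn` and `fit_poly` are illustrative, not from any particular library): fitting models of different flexibility to many independently resampled training sets and measuring how much their predictions scatter at a fixed test point.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_fn(x):
    # The "intended output": a smooth underlying signal.
    return np.sin(2 * np.pi * x)

def fit_poly(x, y, degree):
    # Least-squares polynomial fit; returns coefficients.
    return np.polyfit(x, y, degree)

# Draw many independent noisy training sets from the same process
# and record each fitted model's prediction at one test point.
x_test = 0.5
preds = {1: [], 9: []}  # low-flexibility vs. high-flexibility model
for _ in range(200):
    x_train = rng.uniform(0, 1, size=15)
    y_train = true_fn(x_train) + rng.normal(0, 0.3, size=15)
    for degree in preds:
        coeffs = fit_poly(x_train, y_train, degree)
        preds[degree].append(np.polyval(coeffs, x_test))

for degree, p in preds.items():
    # Spread of predictions across training sets = the model's variance.
    print(f"degree {degree}: variance of predictions = {np.var(p):.4f}")
```

The degree-9 fit typically shows a much larger spread than the degree-1 fit, because it chases the noise in each resampled training set.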
It is a core statistical concept in error decomposition and a key component of the bias-variance tradeoff in machine learning model selection.
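For squared-error loss, the decomposition referenced above takes the standard form (with $f$ the true function, $\hat{f}$ the learned predictor, and $\sigma^2$ the irreducible noise):

$$
\mathbb{E}\big[(y - \hat{f}(x))^2\big] = \underbrace{\big(\mathbb{E}[\hat{f}(x)] - f(x)\big)^2}_{\text{bias}^2} + \underbrace{\mathbb{E}\Big[\big(\hat{f}(x) - \mathbb{E}[\hat{f}(x)]\big)^2\Big]}_{\text{variance}} + \underbrace{\sigma^2}_{\text{irreducible error}}
$$

The tradeoff arises because increasing model flexibility typically lowers the bias term while raising the variance term, so model selection seeks the flexibility that minimizes their sum.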