In machine learning, variance refers to a model's sensitivity to small fluctuations or noise in the training set. A model with high variance fits the specific training data too closely, memorizing noise rather than the underlying pattern (overfitting), and therefore fails to generalize to new, unseen data.
Variance is one of the three components of a model's expected prediction error in the bias-variance decomposition: Error = Bias² + Variance + Irreducible Error.
Managing variance is a primary goal of regularization techniques like Dropout and Weight Decay.
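Variance can be estimated empirically by training the same model class on many noisy resamples of the data and measuring how much its predictions fluctuate. The sketch below illustrates this with NumPy polynomial fits; the function name `prediction_variance`, the noise level, and the chosen degrees are illustrative assumptions, not a fixed recipe:

```python
import numpy as np

rng = np.random.default_rng(0)

def prediction_variance(degree, n_trials=200, n_points=30, noise_std=0.3):
    """Fit a polynomial of the given degree to many noisy samples of
    sin(2*pi*x) and return the variance of its prediction at x = 0.5.
    Higher variance means the model is more sensitive to training noise."""
    x = np.linspace(0.0, 1.0, n_points)
    preds = []
    for _ in range(n_trials):
        # Each trial draws a fresh noisy training set from the same truth.
        y = np.sin(2 * np.pi * x) + rng.normal(0.0, noise_std, n_points)
        coeffs = np.polyfit(x, y, degree)
        preds.append(np.polyval(coeffs, 0.5))
    return float(np.var(preds))

low = prediction_variance(degree=1)    # rigid model: predictions barely move
high = prediction_variance(degree=10)  # flexible model: predictions swing with the noise
```

The flexible high-degree fit chases the noise in each resampled training set, so its prediction at the fixed test point varies far more across trials than the rigid linear fit's does; regularization pulls that variance back down at the cost of some bias.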