GENAIWIKI

Model Evaluation

Variance

Variance measures a model's sensitivity to fluctuations in the training data; when variance is high, the model tends to overfit.

Expanded definition

Variance refers to the variability of a model's predictions for a given data point when the model is trained on different samples of the training data. A model with high variance fits the idiosyncrasies and noise of its particular training set, leading to overfitting and poor generalization to new, unseen data. Managing the trade-off between bias and variance is crucial for building robust machine learning models.
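One way to see variance concretely is to retrain a model many times on freshly drawn training sets and measure how much its prediction at a fixed point fluctuates. The sketch below (a minimal, self-contained illustration; the functions, sample sizes, and noise level are assumptions, not from the source) compares a flexible 1-nearest-neighbour regressor against a smoother 15-nearest-neighbour regressor: the flexible model's prediction at the same input varies far more across training sets, i.e. it has higher variance.

```python
import random
import statistics

random.seed(0)

def true_fn(x):
    # hypothetical ground-truth relationship
    return 2.0 * x

def sample_training_set(n=20):
    # draw a fresh noisy training set from the true function
    pts = []
    for _ in range(n):
        x = random.uniform(0.0, 1.0)
        pts.append((x, true_fn(x) + random.gauss(0.0, 0.5)))
    return pts

def knn_predict(train, x0, k):
    # k-nearest-neighbour regression: average the targets
    # of the k training points closest to x0
    nearest = sorted(train, key=lambda p: abs(p[0] - x0))[:k]
    return sum(y for _, y in nearest) / k

x0 = 0.5
preds_k1 = []   # flexible model (k=1): tracks individual noisy points
preds_k15 = []  # smoother model (k=15): averages away the noise
for _ in range(200):
    train = sample_training_set()
    preds_k1.append(knn_predict(train, x0, k=1))
    preds_k15.append(knn_predict(train, x0, k=15))

# variance of the prediction at x0 across the 200 training sets
var_k1 = statistics.pvariance(preds_k1)
var_k15 = statistics.pvariance(preds_k15)
print(f"k=1 variance:  {var_k1:.4f}")
print(f"k=15 variance: {var_k15:.4f}")
```

The k=1 model chases whichever noisy point happens to land nearest x0, so its prediction changes sharply from one training set to the next; the k=15 model pays that flexibility back as bias, illustrating the trade-off described above.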
