Out-of-distribution generalization
The ability of a model to perform well on unseen data drawn from a distribution that differs from the training distribution.
Expanded definition
Out-of-distribution (OOD) generalization addresses the challenge of making machine learning models perform accurately when the test distribution differs from the training distribution, for example due to covariate shift, spurious correlations that hold only in the training data, or genuinely novel inputs. This capability is crucial in real-world deployments, where models routinely encounter conditions not represented in their training set. A common misconception is that high performance on held-out in-distribution data guarantees OOD performance; in practice, a model can score near-perfectly in distribution and still fail badly once the distribution shifts.
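The gap between in-distribution and OOD accuracy can be illustrated with a minimal synthetic sketch (an illustrative example, not a method from this entry): a linear model is trained on data where a spurious feature happens to predict the label, then evaluated on a shifted distribution where that correlation is reversed.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n, spurious_corr):
    """Synthetic binary task with one causal and one spurious feature."""
    y = rng.integers(0, 2, n)
    # Causal feature: genuinely determined by the label.
    x_causal = y + rng.normal(0, 0.5, n)
    # Spurious feature: tracks the label only as strongly as
    # `spurious_corr` dictates; the sign flips out of distribution.
    x_spur = spurious_corr * (2 * y - 1) + rng.normal(0, 0.5, n)
    X = np.column_stack([x_causal, x_spur, np.ones(n)])  # bias column
    return X, y

# Train where the spurious feature strongly predicts the label.
X_tr, y_tr = make_data(5000, spurious_corr=1.0)
w = np.linalg.lstsq(X_tr, y_tr, rcond=None)[0]  # least-squares linear probe

def accuracy(X, y):
    return float(np.mean((X @ w > 0.5) == y))

# In-distribution test set: same spurious correlation as training.
X_id, y_id = make_data(5000, spurious_corr=1.0)
# Out-of-distribution test set: spurious correlation reversed.
X_ood, y_ood = make_data(5000, spurious_corr=-1.0)

print(f"in-distribution accuracy:     {accuracy(X_id, y_id):.2f}")
print(f"out-of-distribution accuracy: {accuracy(X_ood, y_ood):.2f}")
```

Because the spurious feature separates the classes more cleanly than the causal one during training, the least-squares fit weights it heavily; in-distribution accuracy is high, while accuracy collapses on the shifted distribution even though the true causal relationship is unchanged.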