GENAIWIKI

Regularization

A family of techniques used to prevent overfitting, typically by adding a penalty term to the loss function or otherwise constraining model complexity.

Expanded definition

Regularization adds constraints or penalties that discourage the model from fitting the training data too closely, which improves its ability to generalize to unseen data. Common techniques include L1 regularization (which penalizes the sum of absolute weights and encourages sparsity), L2 regularization (which penalizes the sum of squared weights and shrinks them toward zero), dropout, and early stopping. By controlling model complexity, regularization promotes robust learning and better performance on held-out data.
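As a minimal sketch of the penalty idea, the snippet below fits a linear model by gradient descent with and without an L2 (ridge) term added to the squared-error loss; the synthetic data, learning rate, and penalty strength `lam` are illustrative assumptions, not a definitive recipe.

```python
import numpy as np

# Illustrative synthetic data (assumed for this sketch).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
true_w = np.array([2.0, 0.0, -1.0, 0.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=100)

def fit(X, y, lam, lr=0.1, steps=500):
    """Gradient descent on mean squared error + lam * ||w||^2."""
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(steps):
        # Gradient of the data term plus the L2 penalty term (2 * lam * w).
        grad = (2 / n) * X.T @ (X @ w - y) + 2 * lam * w
        w -= lr * grad
    return w

w_plain = fit(X, y, lam=0.0)   # no regularization
w_ridge = fit(X, y, lam=1.0)   # with L2 penalty

# The penalty shrinks the learned weights toward zero.
print(np.linalg.norm(w_ridge) < np.linalg.norm(w_plain))
```

Increasing `lam` trades training fit for smaller weights: at `lam=0` the solver recovers the unregularized least-squares fit, while larger values pull every coefficient toward zero, which is the complexity control described above.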
