Regularization
Dropout
A regularization technique used to prevent overfitting in neural networks by randomly deactivating a fraction of neurons during training.
Expanded definition
Dropout works by randomly setting a fraction of the neurons in a layer (commonly 20-50%) to zero on each training iteration, which forces the network to learn redundant, robust features that remain useful under many different subsets of active neurons. At inference time all neurons are kept, and activations are rescaled so their expected magnitude matches training (in the common "inverted dropout" formulation, the rescaling is done during training instead, so inference needs no change). The technique, introduced by Srivastava et al. (2014), is a widely used regularizer for reducing overfitting in deep networks.
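The mechanism can be sketched in a few lines of NumPy. This is an illustrative implementation of inverted dropout (the function name, signature, and keep-probability handling are choices made for this sketch, not a library API): each activation is zeroed with probability `p` during training and the survivors are scaled by `1/(1-p)`, so the expected activation is unchanged and inference needs no rescaling.

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout sketch: zero each element with probability p
    during training, rescale survivors by 1/(1-p); identity at inference."""
    if not training or p == 0.0:
        return x
    rng = rng if rng is not None else np.random.default_rng()
    mask = rng.random(x.shape) >= p      # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)

# During training, surviving units are scaled up so the expected value
# of each activation matches its value without dropout.
activations = np.ones((4, 5))
train_out = dropout(activations, p=0.5, rng=np.random.default_rng(0))

# At inference, the input passes through unchanged.
eval_out = dropout(activations, training=False)
```

In practice, frameworks such as PyTorch (`torch.nn.Dropout`) and Keras (`keras.layers.Dropout`) implement this same inverted-dropout scheme and toggle it automatically between training and evaluation modes.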