Deep Learning
Batch Normalization
A technique to improve training speed and stability in deep neural networks by normalizing the inputs of each layer.
Expanded definition
Batch normalization standardizes each layer's inputs over the current mini-batch (subtracting the batch mean and dividing by the batch standard deviation), then applies a learnable scale and shift so the network can still represent the identity transform. It was originally motivated as a way to reduce internal covariate shift; in practice it allows higher learning rates, reduces sensitivity to weight initialization, and often improves convergence speed and final performance. At inference time, running estimates of the mean and variance collected during training are used in place of batch statistics.
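The normalization step described above can be sketched in a few lines of NumPy. This is a minimal training-time forward pass, not a full implementation (it omits the running statistics used at inference); the function name and parameters are illustrative.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # x: mini-batch of shape (batch_size, num_features).
    # Normalize each feature using the mini-batch mean and variance,
    # then apply the learnable scale (gamma) and shift (beta).
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)  # eps avoids division by zero
    return gamma * x_hat + beta

# Example: a mini-batch of 4 samples with 3 features each.
x = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [3.0, 6.0, 9.0],
              [4.0, 8.0, 12.0]])
out = batch_norm(x, gamma=np.ones(3), beta=np.zeros(3))
# Each column of `out` now has approximately zero mean and unit variance.
```

With gamma fixed at 1 and beta at 0 the output is simply the standardized input; during training these two vectors are learned alongside the network's weights.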