Training Techniques
Backpropagation
An algorithm for training neural networks by computing the gradients of the loss function with respect to the network's weights.
Expanded definition
Backpropagation involves a forward pass, in which the network produces predictions, followed by a backward pass, in which gradients of the loss are computed layer by layer via the chain rule. These gradients are then used to update the network's weights so as to minimize the error, allowing the model to learn from its mistakes. It is a cornerstone of modern machine learning.
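The forward pass, backward pass, and weight update described above can be sketched as follows. This is a minimal illustration using NumPy on a toy regression task; the network shape, activation, and learning rate are illustrative assumptions, not prescribed by the source.

```python
import numpy as np

# Toy data: 8 samples, 3 features; target is the feature sum (an assumption).
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))
y = X.sum(axis=1, keepdims=True)

# One hidden layer with tanh activation (illustrative architecture).
W1 = rng.normal(scale=0.1, size=(3, 4))
W2 = rng.normal(scale=0.1, size=(4, 1))
lr = 0.1
losses = []

for step in range(200):
    # Forward pass: compute predictions and the loss.
    h = np.tanh(X @ W1)
    pred = h @ W2
    losses.append(np.mean((pred - y) ** 2))

    # Backward pass: chain rule gives gradients of the loss
    # with respect to each weight matrix.
    d_pred = 2 * (pred - y) / len(X)
    dW2 = h.T @ d_pred
    d_h = (d_pred @ W2.T) * (1 - h ** 2)  # derivative of tanh
    dW1 = X.T @ d_h

    # Update step: move weights against the gradient to reduce the error.
    W1 -= lr * dW1
    W2 -= lr * dW2
```

Each iteration performs one full forward/backward cycle; over many iterations the loss decreases as the weights converge toward values that fit the data.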