Optimization
Gradient Descent
An optimization algorithm used to minimize the loss function in machine learning.
Expanded definition
Gradient descent is an iterative optimization algorithm that minimizes a function by repeatedly updating parameters in the direction opposite the gradient, θ ← θ − η∇L(θ), where η is the learning rate. It is the workhorse of machine learning training, where the goal is to minimize a loss function over model parameters. Variants include stochastic gradient descent (SGD), which updates on one example at a time, and mini-batch gradient descent, which updates on small subsets of the training data. Choosing an appropriate learning rate and variant is crucial for convergence and final model performance.
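The update rule described above can be sketched in a few lines. This is a minimal illustration (the function names, learning rate, and step count are illustrative, not a standard API), minimizing a one-dimensional quadratic loss whose minimum is known:

```python
def gradient_descent(grad, theta0, lr=0.1, steps=100):
    """Repeatedly move theta opposite the gradient of the loss."""
    theta = theta0
    for _ in range(steps):
        theta -= lr * grad(theta)  # update rule: theta <- theta - lr * grad(theta)
    return theta

# Example: minimize L(theta) = (theta - 3)^2, whose gradient is 2 * (theta - 3).
minimum = gradient_descent(lambda t: 2 * (t - 3), theta0=0.0)
# minimum converges toward 3.0, the true minimizer of L.
```

SGD and mini-batch variants follow the same loop but compute `grad` from one example or a small batch rather than the full dataset, trading gradient accuracy for cheaper updates.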