Gradient Boosting
A machine learning technique that builds models in a sequential manner.
Expanded definition
Gradient boosting is an ensemble technique that builds models sequentially, with each new model trained to correct the errors of the ones before it. At every step it fits a new predictor to the negative gradient of a loss function with respect to the current ensemble's predictions, so the procedure amounts to gradient descent in function space. This makes it highly effective on complex, structured datasets, and it has become a cornerstone of predictive modeling for both regression and classification tasks.
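The sequential error-correcting loop can be sketched in a few lines. This is a minimal illustration, not a production implementation: it assumes squared-error loss (where the negative gradient is simply the residual), uses scikit-learn's DecisionTreeRegressor as the base learner, and the function names and hyperparameter values are illustrative.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gradient_boost_fit(X, y, n_estimators=50, learning_rate=0.1, max_depth=2):
    """Fit a gradient-boosted ensemble for squared-error regression.

    With squared-error loss, the negative gradient at each step equals
    the residual y - F(x), so each new tree is fit to the residuals.
    """
    f0 = y.mean()                               # initial constant prediction
    pred = np.full_like(y, f0, dtype=float)
    trees = []
    for _ in range(n_estimators):
        residuals = y - pred                    # negative gradient of squared error
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, residuals)
        pred += learning_rate * tree.predict(X) # shrink each correction
        trees.append(tree)
    return f0, trees

def gradient_boost_predict(X, f0, trees, learning_rate=0.1):
    """Sum the initial prediction and all shrunken tree corrections."""
    pred = np.full(X.shape[0], f0)
    for tree in trees:
        pred += learning_rate * tree.predict(X)
    return pred

# Toy data: a noisy quadratic that a single shallow tree cannot fit well.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = X[:, 0] ** 2 + rng.normal(scale=0.1, size=200)

f0, trees = gradient_boost_fit(X, y)
mse = np.mean((y - gradient_boost_predict(X, f0, trees)) ** 2)
```

The learning rate shrinks each tree's contribution, which slows training but usually improves generalization; other losses (absolute error, log loss) substitute their own gradient in place of the residual.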