Machine Learning
Bagging
An ensemble method that improves the stability and accuracy of machine learning algorithms.
Expanded definition
Bagging, or Bootstrap Aggregating, is an ensemble technique that reduces variance by training multiple models on different bootstrap samples of the training dataset (samples drawn with replacement) and then aggregating their predictions, by averaging for regression or majority vote for classification. This helps mitigate overfitting: because each model sees a slightly different dataset, the individual errors of unstable, high-variance learners tend to cancel out when their predictions are combined. Bagging is most commonly used with decision trees and forms the basis of more robust models such as Random Forests, which add random feature subsampling on top of bagging.
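The procedure above can be sketched in plain Python. This is a minimal illustration, not a production implementation: the base learner is a hypothetical one-dimensional decision stump, and the helper names (`fit_stump`, `bagging_fit`, `bagging_predict`) are invented for this example.

```python
import random
from collections import Counter

def fit_stump(xs, ys):
    # Base learner: a decision stump that picks the threshold t and the
    # left/right labels minimizing training errors on 1-D data.
    best = None
    for t in sorted(set(xs)):
        for left, right in ((0, 1), (1, 0)):
            errs = sum(1 for x, y in zip(xs, ys)
                       if (left if x <= t else right) != y)
            if best is None or errs < best[0]:
                best = (errs, t, left, right)
    _, t, left, right = best
    return lambda x: left if x <= t else right

def bagging_fit(xs, ys, n_estimators=25, seed=0):
    # Train each stump on a bootstrap sample: n points drawn with replacement.
    rng = random.Random(seed)
    n = len(xs)
    models = []
    for _ in range(n_estimators):
        idx = [rng.randrange(n) for _ in range(n)]
        models.append(fit_stump([xs[i] for i in idx], [ys[i] for i in idx]))
    return models

def bagging_predict(models, x):
    # Aggregate by majority vote across the ensemble.
    votes = Counter(m(x) for m in models)
    return votes.most_common(1)[0][0]

# Toy data: class 0 on the left, class 1 on the right.
xs = list(range(20))
ys = [0] * 10 + [1] * 10
models = bagging_fit(xs, ys, n_estimators=15, seed=1)
```

Each stump here is trained on a different resampled view of the data, so their thresholds vary; the vote in `bagging_predict` smooths out those differences, which is exactly the variance reduction bagging provides.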