Similarities between Bagging and Boosting
- Both are ensemble methods that build N learners from a single base learner.
- Both generate several training data sets by random sampling (Bagging samples with replacement; Boosting re-weights or re-samples the data based on earlier errors).
- Both produce the final result by combining the predictions of the N learners, e.g. averaging for regression or majority voting for classification, as in the sketch after this list.
- Both reduce variance compared with a single model and provide higher stability.
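
Taken together, these similarities amount to one recipe: resample the data, fit N copies of the base learner, combine their predictions. Below is a minimal sketch of that recipe in its Bagging form, assuming scikit-learn decision trees as the base learner; the dataset, tree depth, and ensemble size are illustrative choices, not fixed parts of either algorithm.

```python
# A minimal sketch of the shared ensemble recipe, in its Bagging form.
# Assumes scikit-learn is installed; all parameter values are illustrative.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=500, random_state=0)

N = 25                                # N learners from one base learner
learners = []
for _ in range(N):
    # Random sampling with replacement (a bootstrap sample).
    idx = rng.integers(0, len(X), size=len(X))
    learners.append(DecisionTreeClassifier(max_depth=3).fit(X[idx], y[idx]))

# Combine the N learners: average their 0/1 votes, i.e. a majority vote.
votes = np.mean([tree.predict(X) for tree in learners], axis=0)
y_pred = (votes >= 0.5).astype(int)
print("training accuracy:", np.mean(y_pred == y))
```
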
Differences between Bagging and Boosting
- In Bagging, the models are built independently of one another; in Boosting, each new model is trained to perform well where the previous models fail.
- Only Boosting determines weights for the training data, tipping the scales in favor of the most challenging cases (see the sketch after this list).
- Only Boosting tries to reduce bias. Bagging, in contrast, may solve the problem of over-fitting, while Boosting can aggravate it.
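
To make these differences concrete, here is a minimal from-scratch sketch of an AdaBoost-style Boosting loop, a hedged illustration rather than a definitive implementation: models are fitted sequentially, the data weights are updated to favor the misclassified cases, and the final vote is weighted rather than a plain average. All parameter values are illustrative; a real application would use a library implementation such as scikit-learn's AdaBoostClassifier.

```python
# A minimal AdaBoost-style sketch of the Boosting side of the contrast,
# for binary labels relabeled to {-1, +1}. Parameter values are illustrative.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y01 = make_classification(n_samples=500, random_state=0)
y = 2 * y01 - 1                       # relabel {0, 1} to {-1, +1}

n, N = len(X), 50
w = np.full(n, 1.0 / n)               # start with uniform data weights
stumps, alphas = [], []
for _ in range(N):
    stump = DecisionTreeClassifier(max_depth=1)
    stump.fit(X, y, sample_weight=w)  # only Boosting passes data weights here
    pred = stump.predict(X)
    err = np.sum(w * (pred != y)) / np.sum(w)
    alpha = 0.5 * np.log((1 - err) / max(err, 1e-10))
    # Tip the scales toward the cases this model got wrong.
    w *= np.exp(-alpha * y * pred)
    w /= w.sum()
    stumps.append(stump)
    alphas.append(alpha)

# Final prediction is a *weighted* vote, not a plain average.
F = sum(a * s.predict(X) for a, s in zip(alphas, stumps))
print("training accuracy:", np.mean(np.sign(F) == y))
```

Replacing the sample-weight update with an independent bootstrap sample per model, and the weighted vote with a uniform one, turns this loop back into Bagging; that is exactly the sequential-versus-independent difference in the first bullet above.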