Gradient Boosting: Advantages and Disadvantages

Advantages of gradient boosting trees

There are several reasons why you might consider using gradient boosting tree algorithms:

  • they are generally more accurate than other models,
  • they train faster, especially on larger datasets,
  • most of them provide support for handling categorical features,
  • some of them handle missing values natively.

Disadvantages of gradient boosting trees

Let’s now address some of the challenges faced when using gradient boosted trees:

  • prone to overfitting: this can be mitigated by applying L1 and L2 regularization penalties, or by lowering the learning rate;
  • models can be computationally expensive and can take a long time to train, especially on CPUs;
  • the final models are hard to interpret.