LightGBM Algorithm

LightGBM differs from most other gradient boosting frameworks in that it grows trees leaf-wise rather than depth-wise (level-wise): at each step it splits the leaf that yields the largest loss reduction. Leaf-wise growth tends to converge faster than depth-wise growth, but it is also more prone to overfitting, so constraining tree size matters.

The algorithm is also histogram-based: it buckets continuous feature values into discrete bins before searching for splits, which speeds up training and reduces memory usage.
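To make the binning idea concrete, here is a simplified sketch in plain NumPy. It is not LightGBM's actual implementation (LightGBM builds its histograms in C++), just an illustration of mapping continuous values to a small number of bins:

```python
import numpy as np

def bin_feature(values, max_bin=16):
    """Bucket continuous values into discrete bins using quantile edges.
    A rough illustration of histogram binning, not LightGBM's real code."""
    # Quantile-based edges so each bin holds a similar number of samples.
    edges = np.quantile(values, np.linspace(0, 1, max_bin + 1)[1:-1])
    return np.searchsorted(edges, values, side="right")

rng = np.random.default_rng(0)
x = rng.normal(size=1_000)
bins = bin_feature(x, max_bin=16)
print(bins.min(), bins.max())  # bin indices fall in [0, 15]
```

After this step, split finding only has to scan at most `max_bin` candidate thresholds per feature instead of every distinct value, which is where the speed and memory savings come from.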

Other notable features of this algorithm include:

  • support for GPU training,
  • native support for categorical features,
  • the ability to handle large-scale data,
  • default handling of missing values.

Let’s take a look at some of the main parameters of this algorithm:

  • max_depth: the maximum depth of each tree;
  • objective: the learning objective, which defaults to regression;
  • learning_rate: the boosting learning rate;
  • n_estimators: the number of decision trees (boosting iterations) to fit;
  • device_type: whether training runs on a CPU or GPU.