Hyperparameter tuning is the process of selecting the model settings that give the best prediction accuracy. For example, when we train a Random Forest Classifier, the question arises: which hyperparameters should we choose so that the model's accuracy is high? The main hyperparameters of a Random Forest Classifier are:
- criterion = the function used to evaluate the quality of a split.
- max_depth = maximum number of levels allowed in each tree.
- max_features = maximum number of features considered when splitting a node.
- min_samples_leaf = minimum number of samples required at a leaf node.
- min_samples_split = minimum number of samples required in a node for it to be split.
- n_estimators = number of trees in the ensemble.
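As a minimal sketch, the hyperparameters listed above can be set directly when constructing the classifier in scikit-learn; the specific values below are illustrative assumptions, not recommended defaults:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Small synthetic dataset purely for demonstration.
X, y = make_classification(n_samples=200, n_features=10, random_state=42)

clf = RandomForestClassifier(
    criterion="gini",      # function used to evaluate split quality
    max_depth=8,           # maximum number of levels in each tree
    max_features="sqrt",   # features considered when splitting a node
    min_samples_leaf=2,    # minimum samples required at a leaf node
    min_samples_split=4,   # minimum samples required to split a node
    n_estimators=100,      # number of trees in the ensemble
    random_state=42,
)
clf.fit(X, y)
print(clf.score(X, y))
```

Changing any of these values changes how the trees grow, which is exactly why tuning them matters.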
To search for good values of these hyperparameters, several methods are used:
- Manual Search.
- Random Search.
- Grid Search.
- Automated Hyperparameter Tuning (Bayesian Optimization, Genetic Algorithms )
- Artificial Neural Networks (ANNs) Tuning.
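To illustrate two of the methods above, here is a minimal sketch of Grid Search and Random Search using scikit-learn's `GridSearchCV` and `RandomizedSearchCV`; the parameter grid is an illustrative assumption, not a tuned recipe:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

param_grid = {
    "n_estimators": [50, 100],
    "max_depth": [4, 8],
    "min_samples_leaf": [1, 2],
}

# Grid Search: exhaustively tries every combination (2 * 2 * 2 = 8 candidates).
grid = GridSearchCV(RandomForestClassifier(random_state=0), param_grid, cv=3)
grid.fit(X, y)
print("grid search best params:", grid.best_params_)

# Random Search: samples a fixed number of combinations from the same space,
# which scales much better when the grid is large.
rand = RandomizedSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid,
    n_iter=4,
    cv=3,
    random_state=0,
)
rand.fit(X, y)
print("random search best params:", rand.best_params_)
```

Grid Search guarantees the best combination within the grid but its cost grows multiplicatively with each added hyperparameter; Random Search trades that guarantee for a fixed, controllable budget (`n_iter`).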