What is hyper-parameter tuning?

When creating a machine learning model, we have to define our model architecture. Often we don't know in advance what the optimal architecture for a given problem should be, so we'd like to be able to explore a range of possibilities. In true machine learning fashion, we'll ideally ask the machine to perform this exploration and select the optimal model architecture automatically. Parameters that define the model architecture are referred to as hyperparameters, and the process of searching for the ideal model architecture is therefore referred to as hyperparameter tuning.
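
To make this concrete, here is a minimal sketch of automated hyperparameter search, assuming scikit-learn is available; the estimator, grid values, and dataset are illustrative choices, not something prescribed by the text.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = load_iris(return_X_y=True)

# Each grid entry is a hyperparameter: it shapes the model's architecture
# and is fixed before training rather than learned from the data.
param_grid = {
    "n_estimators": [50, 100, 200],
    "max_depth": [2, 4, 8],
}

# GridSearchCV trains one model per combination and keeps the configuration
# with the best cross-validated score.
search = GridSearchCV(RandomForestClassifier(random_state=0), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_)  # e.g. {'max_depth': 4, 'n_estimators': 100}
print(search.best_score_)   # mean cross-validated accuracy of that setting
```

Grid search is only the simplest strategy; the same idea of "let the machine try configurations and keep the best" underlies random search and more sophisticated tuners as well.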

Hyperparameter optimization is the task of finding an optimal, or at least locally near-optimal, set of hyperparameters: free parameters that are set manually or externally, as opposed to the internal parameters that a learning algorithm adjusts on its own. In other words, we are searching for a good configuration of the various "knobs" one must set to achieve good generalization, i.e., out-of-sample performance.

In short, hyperparameters are the various knobs that need to be set to the right levels for a classifier to achieve a high rate of accuracy.
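
The sketch below, again assuming scikit-learn with an illustrative dataset and candidate values, shows the "knob" view directly: the regularization strength C is set externally before each fit, the model's internal weights are adjusted by the learning algorithm itself, and the knob setting is judged by out-of-sample performance on a held-out validation set.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

best_C, best_acc = None, 0.0
for C in [0.01, 0.1, 1.0, 10.0]:        # the knob we set by hand
    model = LogisticRegression(C=C, max_iter=5000)
    model.fit(X_train, y_train)          # internal parameters learned here
    acc = model.score(X_val, y_val)      # out-of-sample performance
    if acc > best_acc:
        best_C, best_acc = C, acc

print(f"best C = {best_C}, validation accuracy = {best_acc:.3f}")
```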