What are different algorithms used for hyperparameter optimization?

The following are three widely used hyperparameter optimization algorithms:

Grid Search:
Grid search evaluates a model over a predefined grid of hyperparameter values: from the values supplied for each hyperparameter, it trains and evaluates the model on every possible combination. It is exhaustive but expensive, since the number of combinations multiplies with every hyperparameter added.
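A minimal sketch of the idea, using the standard library only. The `validation_loss` function here is a hypothetical stand-in for training a model and returning its validation error; in practice you would substitute a real train-and-evaluate step.

```python
from itertools import product

def validation_loss(lr, reg):
    # Hypothetical stand-in for "train the model, return validation error".
    return (lr - 0.1) ** 2 + (reg - 0.01) ** 2

# The grid of values supplied for each hyperparameter.
grid = {
    "lr": [0.001, 0.01, 0.1, 1.0],
    "reg": [0.0, 0.01, 0.1],
}

def grid_search(grid, loss_fn):
    names = list(grid)
    best_params, best_loss = None, float("inf")
    # One training run per combination: 4 * 3 = 12 runs for this grid.
    for values in product(*(grid[n] for n in names)):
        params = dict(zip(names, values))
        loss = loss_fn(**params)
        if loss < best_loss:
            best_params, best_loss = params, loss
    return best_params, best_loss

best, loss = grid_search(grid, validation_loss)
```

Note how the run count is the product of the list lengths: adding a third hyperparameter with five values would already mean 60 training runs.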

Random Search:
Random search samples hyperparameter sets at random from user-specified probability distributions and evaluates each sample. The model is trained only a fixed number of times, so the computational budget is controlled directly, and in high-dimensional search spaces it often finds good settings faster than grid search.
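A sketch of the same search with a fixed evaluation budget. Again, `validation_loss` is a hypothetical placeholder for a real train-and-evaluate step; the distributions (log-uniform for the learning rate, uniform for regularization) are illustrative choices, not prescriptions.

```python
import random

def validation_loss(lr, reg):
    # Hypothetical stand-in for "train the model, return validation error".
    return (lr - 0.1) ** 2 + (reg - 0.01) ** 2

def random_search(loss_fn, n_iter=20, seed=0):
    rng = random.Random(seed)
    best_params, best_loss = None, float("inf")
    for _ in range(n_iter):
        # Sample each hyperparameter from its own distribution:
        # log-uniform over [1e-4, 1] for lr, uniform over [0, 0.1] for reg.
        params = {
            "lr": 10 ** rng.uniform(-4, 0),
            "reg": rng.uniform(0.0, 0.1),
        }
        loss = loss_fn(**params)
        if loss < best_loss:
            best_params, best_loss = params, loss
    return best_params, best_loss

best, loss = random_search(validation_loss)
```

The budget (`n_iter`) stays fixed no matter how many hyperparameters are added, which is exactly what grid search cannot offer.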

Bayesian Optimization:
Bayesian optimization uses Bayes’ theorem to guide the search for the minimum or maximum of an objective function: it builds a probabilistic surrogate model of the objective from the evaluations made so far, then uses that surrogate to pick the most promising hyperparameters to try next. It is best suited to objective functions that are difficult to evaluate directly because they are complex, noisy, or expensive.
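A toy sketch of the loop, assuming a 1-D search space, a Gaussian-process surrogate with an RBF kernel, and the expected-improvement acquisition function. The `objective` function is a hypothetical cheap stand-in for an expensive black box; the kernel length scale and noise jitter are illustrative values.

```python
import math
import numpy as np

def objective(x):
    # Hypothetical stand-in for an expensive black-box objective to minimize.
    return (x - 0.3) ** 2 + 0.05 * math.sin(20 * x)

def rbf(a, b, length=0.1):
    # RBF kernel matrix between two 1-D arrays of points.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(X, y, Xq, noise=1e-4):
    # Gaussian-process posterior mean and std at query points Xq.
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xq)
    Kinv = np.linalg.inv(K)
    mu = Ks.T @ Kinv @ y
    var = 1.0 - np.sum(Ks * (Kinv @ Ks), axis=0)
    return mu, np.sqrt(np.clip(var, 1e-12, None))

def expected_improvement(mu, sigma, best):
    # EI for minimization: expected amount by which we beat the current best.
    z = (best - mu) / sigma
    cdf = 0.5 * (1 + np.vectorize(math.erf)(z / math.sqrt(2)))
    pdf = np.exp(-0.5 * z ** 2) / math.sqrt(2 * math.pi)
    return (best - mu) * cdf + sigma * pdf

rng = np.random.default_rng(0)
X = list(rng.uniform(0, 1, 3))           # a few random initial evaluations
y = [objective(x) for x in X]
Xq = np.linspace(0, 1, 200)              # candidate grid over the search space
for _ in range(10):                      # ten more "expensive" evaluations
    mu, sigma = gp_posterior(np.array(X), np.array(y), Xq)
    x_next = Xq[np.argmax(expected_improvement(mu, sigma, min(y)))]
    X.append(float(x_next))
    y.append(objective(float(x_next)))

best_x = X[int(np.argmin(y))]
```

The surrogate is cheap to query, so each iteration spends its effort deciding where the single next expensive evaluation should go; this is why the method pays off precisely when the objective is costly or noisy.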