The Differential Evolution global optimization algorithm is available in Python via the differential_evolution() SciPy function.
The function takes the objective function and the bounds of each input variable as the minimum arguments for the search.
...
# perform the differential evolution search
result = differential_evolution(objective, bounds)
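Tying this together, a complete minimal example might look as follows. The two-dimensional sphere objective (the sum of the squared inputs) and the bounds of -5.0 to 5.0 on each variable are illustrative choices, not requirements of the function:

# minimal example: differential evolution on a 2D sphere objective
from scipy.optimize import differential_evolution

# objective function with a minimum of 0.0 at [0, 0]
def objective(x):
    return x[0] ** 2.0 + x[1] ** 2.0

# bounds on each input variable as (min, max) pairs
bounds = [(-5.0, 5.0), (-5.0, 5.0)]
# perform the differential evolution search
result = differential_evolution(objective, bounds)
# report the best solution found and its objective value
print(result.x, result.fun)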
There are a number of additional hyperparameters for the search that have default values, although you can configure them to customize the search.
A key hyperparameter is the “strategy” argument that controls the type of differential evolution search that is performed. By default, this is set to “best1bin” (DE/best/1/bin), which is a good configuration for most problems. It creates new candidate solutions by selecting random solutions from the population, subtracting one from the other, and adding a scaled version of the difference to the best candidate solution in the population.
- new = best + (mutation * (rand1 - rand2))
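As an illustrative sketch of this equation (not SciPy's internal implementation), the mutation step for a single new candidate might look like the following in NumPy; the population, the sphere objective, and the variable names here are assumptions carried over from the example above:

import numpy as np

# illustrative sketch of the DE/best/1 mutation step, not SciPy's internals
rng = np.random.default_rng(1)
# random initial population of 30 candidate solutions in 2D
pop = rng.uniform(-5.0, 5.0, size=(30, 2))
# best member of the population under the sphere objective from above
best = pop[np.argmin((pop ** 2).sum(axis=1))]
# scale factor applied to the difference vector
mutation = 0.5
# select two distinct random members and build the new candidate
r1, r2 = rng.choice(len(pop), size=2, replace=False)
new = best + mutation * (pop[r1] - pop[r2])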
The “popsize” argument controls the size of the population, that is, the number of candidate solutions that are maintained. It is a multiple of the number of dimensions in a candidate solution and, by default, it is set to 15. That means for a 2D objective function, a population of (2 * 15) or 30 candidate solutions will be maintained.
The total number of iterations of the algorithm is controlled by the “maxiter” argument and defaults to 1,000.
The “mutation” argument controls the scale factor applied to the difference vector when creating new candidate solutions (the mutation term in the equation above). By default, it is set to the range (0.5, 1.0), and the actual value is dithered (sampled at random from this range) each generation. The amount of recombination is controlled via the “recombination” argument, which is set to 0.7 (roughly 70 percent of a given candidate solution is recombined) by default.
Finally, a local search (the L-BFGS-B method) is applied to the best candidate solution found at the end of the search. This is controlled via the “polish” argument, which by default is set to True.
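All of these hyperparameters are passed as keyword arguments. For example, a search with each of them spelled out explicitly, using the default values described above and assuming the objective and bounds from the earlier example, might look as follows:

# the same search with the key hyperparameters made explicit (default values)
result = differential_evolution(objective, bounds, strategy='best1bin',
    popsize=15, maxiter=1000, mutation=(0.5, 1.0), recombination=0.7,
    polish=True)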
The result of the search is an OptimizeResult object whose properties can be accessed like a dictionary. The success (or not) of the search can be accessed via the “success” or “message” keys.
The total number of function evaluations can be accessed via “nfev”, and the optimal input found by the search is accessible via the “x” key.
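For example, a completed search might be summarized as follows, continuing from the example above; the “fun” key holds the objective value at the best solution found:

# summarize the outcome of the search
print('Status: %s' % result['message'])
print('Success: %s' % result['success'])
print('Total Evaluations: %d' % result['nfev'])
print('Solution: %s' % result['x'])
print('Objective: %.5f' % result['fun'])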