Global Search With SciPy

Global search, or global function optimization, refers to algorithms that seek the input to a function that results in the minimum or maximum output, where the function or constrained region being searched is assumed to contain multiple local optima, i.e., to be multimodal.

The function that is being optimized is typically nonlinear, nonconvex, and may have one or more input variables.

Global search algorithms are typically stochastic, meaning that they make use of randomness in the search process and may or may not manage a population of candidate solutions as part of the search.

The SciPy library provides a number of stochastic global optimization algorithms, each via different functions. They are:

Basin Hopping Optimization via the basinhopping() function.
Differential Evolution Optimization via the differential_evolution() function.
Simulated Annealing via the dual_annealing() function.
The library also provides the shgo() function for simplicial homology global optimization and the brute() function for grid search optimization.
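
As a minimal sketch of how the first two functions are called: basinhopping() takes a starting point, whereas differential_evolution() takes bounds on the search space. The objective used here is the same two-dimensional multimodal function (Himmelblau's function) as in the worked example later in this section.

# sketch: calling basin hopping and differential evolution on a multimodal objective
from scipy.optimize import basinhopping, differential_evolution

# objective function (Himmelblau's function)
def objective(v):
    x, y = v
    return (x**2 + y - 11)**2 + (x + y**2 - 7)**2

# basin hopping starts from an initial point
result_bh = basinhopping(objective, x0=[0.0, 0.0])
print('basinhopping: f(%s) = %.5f' % (result_bh.x, result_bh.fun))

# differential evolution searches within per-variable bounds
result_de = differential_evolution(objective, bounds=[(-5.0, 5.0), (-5.0, 5.0)])
print('differential_evolution: f(%s) = %.5f' % (result_de.x, result_de.fun))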

Each algorithm returns an OptimizeResult object that summarizes the success or failure of the search and the details of the solution if found.
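
For example, a minimal sketch of inspecting the OptimizeResult returned by one of these functions, here using a simple one-dimensional objective for illustration:

# sketch: inspecting the OptimizeResult returned by a SciPy optimizer
from scipy.optimize import dual_annealing

# a simple objective used only to illustrate the result object
def objective(v):
    return v[0] ** 2.0

result = dual_annealing(objective, [(-5.0, 5.0)])
print(result.success)  # whether the search is considered successful
print(result.message)  # human-readable status of the search
print(result.x)        # input that achieved the best objective value
print(result.fun)      # objective value at result.x
print(result.nfev)     # number of objective function evaluations performed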

The example below demonstrates how to minimize a two-dimensional multimodal function (Himmelblau's function) using simulated annealing.

# simulated annealing global optimization for a multimodal objective function
from scipy.optimize import dual_annealing

# objective function
def objective(v):
    x, y = v
    return (x**2 + y - 11)**2 + (x + y**2 - 7)**2

# define range for input
r_min, r_max = -5.0, 5.0
# define the bounds on the search
bounds = [[r_min, r_max], [r_min, r_max]]
# perform the simulated annealing search
result = dual_annealing(objective, bounds)
# summarize the result
print('Status : %s' % result['message'])
print('Total Evaluations: %d' % result['nfev'])
# evaluate solution
solution = result['x']
evaluation = objective(solution)
print('Solution: f(%s) = %.5f' % (solution, evaluation))

Running the example performs the optimization and reports the success or failure of the search, the number of function evaluations performed, and the input that resulted in the optimum of the function. Note that results may vary from run to run given the stochastic nature of the algorithm.

Status : ['Maximum number of iteration reached']
Total Evaluations: 4028
Solution: f([-3.77931027 -3.283186 ]) = 0.00000

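
In this case, the search located one of the objective's global minima. Himmelblau's function has four global minima, each with an objective value of zero; a minimal sketch verifying this follows (the coordinates below are the well-known minima of the function, not outputs of the search above):

# sketch: evaluate the objective at the four known global minima of Himmelblau's function
def objective(v):
    x, y = v
    return (x**2 + y - 11)**2 + (x + y**2 - 7)**2

minima = [(3.0, 2.0), (-2.805118, 3.131312), (-3.779310, -3.283186), (3.584428, -1.848126)]
for point in minima:
    print('f(%s) = %.5f' % (str(point), objective(point)))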