Multimodal Optimization With Multiple Global Optima
The Himmelblau function is an example of an objective function that has multiple global optima.
Specifically, it has four optima, each with the same objective function value. It is a two-dimensional objective function with global optima at [3.0, 2.0], [-2.805118, 3.131312], [-3.779310, -3.283186], and [3.584428, -1.848126].
This means each run of a global optimization algorithm may find a different global optimum.
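We can confirm that the four points are equivalent by evaluating the function at each one; the sketch below, using the function definition given later in this section, shows that each evaluates to approximately zero.

```python
# evaluate the himmelblau function at each of its four global optima
def objective(x, y):
    return (x**2 + y - 11)**2 + (x + y**2 - 7)**2

# the four reported global optima
optima = [(3.0, 2.0), (-2.805118, 3.131312), (-3.779310, -3.283186), (3.584428, -1.848126)]
for x, y in optima:
    print('f(%.6f, %.6f) = %.6f' % (x, y, objective(x, y)))
```

Each point evaluates to approximately 0.0, confirming that all four are global optima with the same objective value.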
The example below implements the Himmelblau and creates a three-dimensional surface plot to give an intuition for the objective function.
# himmelblau multimodal test function
from numpy import arange
from numpy import meshgrid
from matplotlib import pyplot

# objective function
def objective(x, y):
    return (x**2 + y - 11)**2 + (x + y**2 - 7)**2

# define range for input
r_min, r_max = -5.0, 5.0
# sample input range uniformly at 0.1 increments
xaxis = arange(r_min, r_max, 0.1)
yaxis = arange(r_min, r_max, 0.1)
# create a mesh from the axis
x, y = meshgrid(xaxis, yaxis)
# compute targets
results = objective(x, y)
# create a surface plot with the jet color scheme
figure = pyplot.figure()
axis = figure.add_subplot(projection='3d')
axis.plot_surface(x, y, results, cmap='jet')
# show the plot
pyplot.show()
Running the example creates the surface plot of the Himmelblau function showing the four global optima as dark blue basins.
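A two-dimensional contour plot can make the four basins easier to pick out than the surface plot. The sketch below is an optional alternative visualization (not part of the original example) that marks the four optima listed earlier as white stars on a filled contour.

```python
# contour plot of the himmelblau function with the four optima marked
from numpy import arange
from numpy import meshgrid
from matplotlib import pyplot

# objective function
def objective(x, y):
    return (x**2 + y - 11)**2 + (x + y**2 - 7)**2

# sample the input range uniformly at 0.1 increments
r_min, r_max = -5.0, 5.0
xaxis = arange(r_min, r_max, 0.1)
yaxis = arange(r_min, r_max, 0.1)
x, y = meshgrid(xaxis, yaxis)
results = objective(x, y)
# filled contour plot with 50 levels and the jet color scheme
pyplot.contourf(x, y, results, levels=50, cmap='jet')
# mark the four global optima as white stars
optima = [(3.0, 2.0), (-2.805118, 3.131312), (-3.779310, -3.283186), (3.584428, -1.848126)]
for px, py in optima:
    pyplot.plot(px, py, 'w*')
# show the plot
pyplot.show()
```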
3D Surface Plot of the Himmelblau Multimodal Function
We can apply the basin hopping algorithm to the Himmelblau objective function.
As in the previous example, we will start the search using a random point drawn from the input domain between -5 and 5.
We will use a step size of 0.5, 200 iterations, and the default local search algorithm. At the end of the search, we will report the input for the best located optimum.
# basin hopping global optimization for the himmelblau multimodal objective function
from scipy.optimize import basinhopping
from numpy.random import rand

# objective function
def objective(v):
    x, y = v
    return (x**2 + y - 11)**2 + (x + y**2 - 7)**2

# define range for input
r_min, r_max = -5.0, 5.0
# define the starting point as a random sample from the domain
pt = r_min + rand(2) * (r_max - r_min)
# perform the basin hopping search
result = basinhopping(objective, pt, stepsize=0.5, niter=200)
# summarize the result
print('Status : %s' % result['message'])
print('Total Evaluations: %d' % result['nfev'])
# evaluate solution
solution = result['x']
evaluation = objective(solution)
print('Solution: f(%s) = %.5f' % (solution, evaluation))
Running the example executes the optimization, then reports the results.
In this case, we can see that the algorithm located an optimum at about [3.0, 2.0].
We can see that 200 iterations of the algorithm resulted in 7,660 function evaluations.
Status : ['requested number of basinhopping iterations completed successfully']
Total Evaluations: 7660
Solution: f([3. 1.99999999]) = 0.00000
If we run the search again, we may expect a different global optimum to be located.
For example, below, we can see an optimum located at about [-2.805118, 3.131312], different from the previous run.
Status : ['requested number of basinhopping iterations completed successfully']
Total Evaluations: 7636
Solution: f([-2.80511809 3.13131252]) = 0.00000
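We can make this behavior explicit by repeating the search from different random starting points and collecting the distinct optima that are located. The sketch below (an extension of the example above, with a seeded random number generator and a smaller iteration budget per run chosen for illustration) deduplicates solutions that land in the same basin.

```python
# repeat basin hopping from random starting points to collect distinct optima
from scipy.optimize import basinhopping
from numpy.random import default_rng

# objective function
def objective(v):
    x, y = v
    return (x**2 + y - 11)**2 + (x + y**2 - 7)**2

# seeded generator so the runs are repeatable
rng = default_rng(1)
r_min, r_max = -5.0, 5.0
found = []
# repeat the search from different random starting points
for _ in range(20):
    pt = r_min + rng.random(2) * (r_max - r_min)
    result = basinhopping(objective, pt, stepsize=0.5, niter=50)
    sol = result['x']
    # record the solution only if it is not close to one already found
    if not any(abs(sol - f).max() < 1e-3 for f in found):
        found.append(sol)
# report the distinct optima located across all runs
print('Distinct optima found: %d' % len(found))
for sol in found:
    print('f([%.6f, %.6f]) = %.6f' % (sol[0], sol[1], objective(sol)))
```

Across enough restarts, we would expect all four global optima to be reported, each with an objective value of approximately zero.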