Nelder-Mead on Challenging Functions-2

Multimodal Optimization Problem
Many nonlinear objective functions have multiple optima; such problems are referred to as multimodal.

The problem may be structured such that it has multiple global optima with equivalent function evaluations, or a single global optimum and many local optima in which algorithms like Nelder-Mead can get stuck while searching for the global optimum.

The Ackley function is an example of the latter. It is a two-dimensional objective function that has a single global optimum at [0, 0] but many local optima.
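
For reference, the two-dimensional Ackley function implemented below is defined as:

$$f(x, y) = -20 \exp\left(-0.2 \sqrt{0.5\,(x^2 + y^2)}\right) - \exp\left(0.5\,(\cos 2\pi x + \cos 2\pi y)\right) + e + 20$$

which attains its global minimum f(0, 0) = 0.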

The example below implements the Ackley function and creates a three-dimensional surface plot showing the global optimum and the many local optima.

# ackley multimodal function
from numpy import arange
from numpy import exp
from numpy import sqrt
from numpy import cos
from numpy import e
from numpy import pi
from numpy import meshgrid
from matplotlib import pyplot
from mpl_toolkits.mplot3d import Axes3D

# objective function
def objective(x, y):
    return -20.0 * exp(-0.2 * sqrt(0.5 * (x**2 + y**2))) - exp(0.5 * (cos(2 * pi * x) + cos(2 * pi * y))) + e + 20

# define range for input
r_min, r_max = -5.0, 5.0
# sample input range uniformly at 0.1 increments
xaxis = arange(r_min, r_max, 0.1)
yaxis = arange(r_min, r_max, 0.1)
# create a mesh from the axis
x, y = meshgrid(xaxis, yaxis)
# compute targets
results = objective(x, y)
# create a surface plot with the jet color scheme
figure = pyplot.figure()
axis = figure.add_subplot(projection='3d')
axis.plot_surface(x, y, results, cmap='jet')
# show the plot
pyplot.show()

Running the example creates the surface plot of the Ackley function showing the vast number of local optima.

3D Surface Plot of the Ackley Multimodal Function

We would expect the Nelder-Mead algorithm to get stuck in one of the local optima while searching for the global optimum.

Initially, when the simplex is large, the algorithm may jump over many local optima, but as it contracts, it will get stuck.
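
SciPy's Nelder-Mead implementation exposes an initial_simplex option that can be used to explore this effect. The sketch below, with arbitrarily chosen vertex coordinates, starts the search from a deliberately wide simplex that spans most of the domain:

# sketch: supplying a wide initial simplex to SciPy's Nelder-Mead
# (the vertex coordinates are arbitrary illustrative values)
from numpy import asarray, exp, sqrt, cos, e, pi
from scipy.optimize import minimize

def objective(v):
    x, y = v
    return -20.0 * exp(-0.2 * sqrt(0.5 * (x**2 + y**2))) - exp(0.5 * (cos(2 * pi * x) + cos(2 * pi * y))) + e + 20

# one vertex per dimension plus one, spread across the domain
init_simplex = asarray([[-4.0, -4.0], [4.0, -4.0], [0.0, 4.0]])
# x0 is required by minimize() but is overridden by initial_simplex
result = minimize(objective, init_simplex[0], method='nelder-mead',
                  options={'initial_simplex': init_simplex})
print('f(%s) = %.5f' % (result['x'], result['fun']))

Even with a wide starting simplex, the search typically still converges into whichever basin the contracting simplex settles into.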

We can explore this with the example below that demonstrates the Nelder-Mead algorithm on the Ackley function.

# nelder-mead for multimodal function optimization
from scipy.optimize import minimize
from numpy.random import rand
from numpy import exp
from numpy import sqrt
from numpy import cos
from numpy import e
from numpy import pi

# objective function
def objective(v):
    x, y = v
    return -20.0 * exp(-0.2 * sqrt(0.5 * (x**2 + y**2))) - exp(0.5 * (cos(2 * pi * x) + cos(2 * pi * y))) + e + 20

# define range for input
r_min, r_max = -5.0, 5.0
# define the starting point as a random sample from the domain
pt = r_min + rand(2) * (r_max - r_min)
# perform the search
result = minimize(objective, pt, method='nelder-mead')
# summarize the result
print('Status: %s' % result['message'])
print('Total Evaluations: %d' % result['nfev'])
# evaluate solution
solution = result['x']
evaluation = objective(solution)
print('Solution: f(%s) = %.5f' % (solution, evaluation))

Running the example executes the optimization, then reports the results.

Note: Your results may vary given the stochastic nature of the algorithm or evaluation procedure, or differences in numerical precision. Consider running the example a few times and comparing the average outcome.

In this case, we can see that the search completed successfully but did not locate the global optimum; instead, it got stuck in a local optimum.

Each time we run the example, we will find a different local optimum, given the different random starting point for the search.
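
A simple workaround suggested by this behavior is a random-restart loop: run Nelder-Mead from several random starting points and keep the best result. The sketch below assumes we can afford repeated searches; the restart count of 30 is an arbitrary illustrative choice.

# sketch: random restarts of nelder-mead on the ackley function
# (the number of restarts, 30, is an arbitrary illustrative choice)
from scipy.optimize import minimize
from numpy.random import rand
from numpy import exp, sqrt, cos, e, pi

def objective(v):
    x, y = v
    return -20.0 * exp(-0.2 * sqrt(0.5 * (x**2 + y**2))) - exp(0.5 * (cos(2 * pi * x) + cos(2 * pi * y))) + e + 20

r_min, r_max = -5.0, 5.0
best = None
for _ in range(30):
    # draw a random starting point from the domain
    pt = r_min + rand(2) * (r_max - r_min)
    result = minimize(objective, pt, method='nelder-mead')
    # keep the lowest function value seen so far
    if best is None or result['fun'] < best['fun']:
        best = result
print('Best: f(%s) = %.5f' % (best['x'], best['fun']))

With enough restarts, at least one run is likely to start near the global basin at [0, 0].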

Status: Optimization terminated successfully.
Total Evaluations: 62
Solution: f([-4.9831427 -3.98656015]) = 11.90126