This example shows you how to run a local optimisation with the Nelder-Mead downhill simplex method.
The Nelder-Mead method is a classical (deterministic) derivative-free optimisation method. It can be very fast if started near the true solution, but can easily get stuck on difficult problems. Nelder-Mead is essentially sequential in nature and cannot easily be parallelised.
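To make the method concrete, here is a minimal pure-NumPy sketch of the Nelder-Mead simplex update (reflection, expansion, contraction, shrink) applied to the Rosenbrock function. This is an illustration of the algorithm, not the pints implementation; the coefficients (1, 2, 0.5) are the textbook defaults.

```python
import numpy as np

def rosenbrock(p):
    # Rosenbrock function, minimum f(1, 1) = 0
    x, y = p
    return (1 - x)**2 + 100 * (y - x**2)**2

def nelder_mead(f, x0, iters=500, step=0.5):
    # Initial simplex: x0 plus one point perturbed along each dimension
    n = len(x0)
    simplex = [np.array(x0, dtype=float)]
    for i in range(n):
        p = np.array(x0, dtype=float)
        p[i] += step
        simplex.append(p)
    for _ in range(iters):
        # Order vertices from best (lowest f) to worst
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        centroid = np.mean(simplex[:-1], axis=0)
        # Reflect the worst point through the centroid
        reflected = centroid + (centroid - worst)
        if f(reflected) < f(best):
            # Reflection was very good: try expanding further
            expanded = centroid + 2 * (centroid - worst)
            simplex[-1] = expanded if f(expanded) < f(reflected) else reflected
        elif f(reflected) < f(simplex[-2]):
            # Better than the second-worst: accept the reflection
            simplex[-1] = reflected
        else:
            # Contract toward the centroid
            contracted = centroid + 0.5 * (worst - centroid)
            if f(contracted) < f(worst):
                simplex[-1] = contracted
            else:
                # All else failed: shrink every vertex toward the best
                simplex = [best + 0.5 * (p - best) for p in simplex]
    simplex.sort(key=f)
    return simplex[0]

x = nelder_mead(rosenbrock, [-0.75, 3.5])
```

Note that each iteration updates (usually) a single vertex, and each candidate must be evaluated before the next can be proposed: this is why the method is hard to parallelise.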
In [1]:
import pints
import pints.toy
# Create Rosenbrock error (optimum is at 1,1)
f = pints.toy.RosenbrockError()
# Pick starting point
x0 = [-0.75, 3.5]
x, fx = pints.optimise(f, x0, method=pints.NelderMead)
print('Found: ' + str(x))
We can use the ask-and-tell interface to visualise the method's progress through the search space:
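The ask-and-tell pattern inverts control: the optimiser proposes points, and the caller decides how to evaluate them. As a pattern illustration only (a toy random-search optimiser, not pints code), it looks like this:

```python
import random

class AskTellRandomSearch:
    """Toy optimiser exposing an ask-and-tell interface.

    Illustrates the pattern only; this is not the pints implementation."""

    def __init__(self, x0, sigma=0.5):
        self._best = list(x0)
        self._fbest = float('inf')
        self._sigma = sigma
        self._proposed = None

    def ask(self):
        # Propose candidate points; the caller controls their evaluation
        self._proposed = [
            [x + random.gauss(0, self._sigma) for x in self._best]
            for _ in range(4)]
        return self._proposed

    def tell(self, fs):
        # Receive the scores for the proposed points and update state
        for x, fx in zip(self._proposed, fs):
            if fx < self._fbest:
                self._best, self._fbest = x, fx

    def x_best(self):
        return self._best

def f(p):
    # Rosenbrock function, minimum at (1, 1)
    return (1 - p[0])**2 + 100 * (p[1] - p[0]**2)**2

random.seed(1)  # fixed seed so the run is reproducible
opt = AskTellRandomSearch([-0.75, 3.5])
for _ in range(2000):
    xs = opt.ask()
    fs = [f(x) for x in xs]  # evaluation happens outside the optimiser
    opt.tell(fs)
```

Because evaluation happens outside the optimiser, the caller is free to log every proposed point, evaluate in parallel (for methods that support it), or, as below, record the path for plotting.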
In [3]:
import numpy as np
import matplotlib.pyplot as plt
# Create figure
fig = plt.figure(figsize=(12, 12))
ax = fig.add_subplot(1, 1, 1)
ax.set_xlabel('x')
ax.set_ylabel('y')
# Show function
x = np.linspace(-1.5, 1.5, 150)
y = np.linspace(-0.5, 4, 225)
X, Y = np.meshgrid(x, y)
Z = [[np.log(f([i, j])) for i in x] for j in y]
levels = np.linspace(np.min(Z), np.max(Z), 20)
ax.contour(X, Y, Z, levels=levels)
# Show initial position
ax.plot(x0[0], x0[1], 'ks', markersize=10)
ax.text(x0[0] + 0.05, x0[1] - 0.03, 'Initial position', fontsize=18)
# Run 400 iterations with ask-and-tell, storing all positions
e = pints.SequentialEvaluator(f)
nm = pints.NelderMead(x0)
path = [x0]
for i in range(400):
    xs = nm.ask()
    fs = e.evaluate(xs)
    nm.tell(fs)
    path.append(nm.xbest())
# Plot path
path = np.array(path).T
ax.plot(path[0], path[1], 'x-', color='tab:orange', markersize=15)
plt.show()