Optimization Exercise 1

Imports


In [16]:
%matplotlib inline
import matplotlib.pyplot as plt
import numpy as np
import scipy.optimize as opt
from scipy.optimize import minimize, rosen, rosen_der

Hat potential

The following potential is often used in physics and other fields to describe symmetry breaking; it is known as the "hat potential":

$$ V(x) = -a x^2 + b x^4 $$

Write a function hat(x,a,b) that returns the value of this function:


In [35]:
def hat(x,a,b):
    return -a*(x**2) + b*(x**4)

In [36]:
assert hat(0.0, 1.0, 1.0)==0.0
assert hat(1.0, 10.0, 1.0)==-9.0

Plot this function over the range $x\in\left[-3,3\right]$ with $b=1.0$ and $a=5.0$:


In [37]:
a = 5.0
b = 1.0

In [38]:
x = np.linspace(-3, 3, 100)  # covers the full range [-3, 3], endpoint included
y = hat(x,a,b)
plt.plot(x,y)


Out[38]:
[<matplotlib.lines.Line2D at 0x7f7404f8ddd8>]

In [39]:
assert True # leave this to grade the plot

Write code that finds the two local minima of this function for $b=1.0$ and $a=5.0$.

  • Use scipy.optimize.minimize to find the minima. You will have to think carefully about how to get this function to find both minima.
  • Print the x values of the minima.
  • Plot the function as a blue line.
  • On the same axes, show the minima as red circles.
  • Customize your visualization to make it beautiful and effective.

In [41]:
# minimize expects a scalar starting guess here; passing the whole x array
# as x0 raised a ValueError. Running the optimizer once from each side of
# x = 0 converges to both local minima.
minima = [minimize(hat, x0, args=(a, b), method='BFGS').x[0] for x0 in (-2.0, 2.0)]
print(minima)

plt.plot(x, hat(x, a, b), 'b-', label='$V(x) = -ax^2 + bx^4$')
plt.plot(minima, [hat(m, a, b) for m in minima], 'ro', label='minima')
plt.xlabel('$x$')
plt.ylabel('$V(x)$')
plt.legend()

In [ ]:
assert True # leave this for grading the plot

To check your numerical results, find the locations of the minima analytically. Show and describe the steps in your derivation using LaTeX equations. Evaluate the location of the minima using the above parameters.

Setting the derivative of the potential to zero locates the critical points:

$$ V'(x) = -2 a x + 4 b x^3 = 2 x \left( 2 b x^2 - a \right) = 0 $$

so the critical points are $x = 0$ and $x = \pm\sqrt{a/(2b)}$. The second derivative is $V''(x) = -2a + 12 b x^2$: at $x = 0$ it equals $-2a < 0$ (a local maximum), while at $x = \pm\sqrt{a/(2b)}$ it equals $4a > 0$, so those two points are the minima. With $a = 5.0$ and $b = 1.0$:

$$ x = \pm\sqrt{\frac{a}{2b}} = \pm\sqrt{2.5} \approx \pm 1.5811 $$

which agrees with the numerical result from scipy.optimize.minimize.
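A quick numerical cross-check of the analytic location is sketched below. It is self-contained (it redefines hat and the notebook's parameters a = 5.0, b = 1.0) so it can run outside the notebook:

```python
import numpy as np

# The hat potential, as defined earlier in the notebook.
def hat(x, a, b):
    return -a * x**2 + b * x**4

a, b = 5.0, 1.0

# Analytic minimum location: x = +/- sqrt(a / (2b)).
x_min = np.sqrt(a / (2 * b))
print(x_min, hat(x_min, a, b))  # ~1.5811, V(x_min) = -6.25
```

The value of the potential at the minima, $V = -a^2/(4b) = -6.25$, can also be read off the plot above.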