EXERCISE: Solve the constrained optimization problem below by any of the means above.
Minimize: f = -x[0] + 4x[1]
Subject to:
-3x[0] + x[1] <= 6
x[0] + 2x[1] <= 4
x[1] >= -3
where: -inf <= x[0] <= inf
In [1]:
    
import cvxopt as cvx
from cvxopt import solvers as cvx_solvers
Q = cvx.matrix([[0.,0.],[0.,0.]])              # zero quadratic term: this QP is really an LP
p = cvx.matrix([-1., 4.])                      # linear objective coefficients
G = cvx.matrix([[-3., 1., 0.],[1., 2., -1.]])  # note: cvxopt matrices are built column-major
h = cvx.matrix([6., 4., 3.])                   # right-hand sides; x[1] >= -3 is encoded as -x[1] <= 3
sol = cvx_solvers.qp(Q, p, G, h)
print(sol['x'])
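
# A hedged cross-check (not required by the exercise): since the quadratic term is zero,
# the same problem is a plain LP and can also be posed to scipy.optimize.linprog.
# linprog defaults to x >= 0 bounds, so they are relaxed explicitly below.
from scipy.optimize import linprog
c = [-1., 4.]
A_ub = [[-3., 1.], [1., 2.], [0., -1.]]  # last row encodes x[1] >= -3 as -x[1] <= 3
b_ub = [6., 4., 3.]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)]*2)
print(res.x)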
    
    
EXERCISE: Use any of the solvers we've seen thus far to find the minimum of the Zimmermann function (i.e. use mystic.models.zimmermann as the objective).  Use the bounds suggested by the documentation, if your choice of solver allows it.
In [2]:
    
import scipy.optimize as opt
import mystic.models
result = opt.minimize(mystic.models.zimmermann, [10., 1.], method='powell')
print(result.x)
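
# A sketch of the bounded alternative: scipy's differential_evolution accepts bounds
# directly. The bounds below are illustrative placeholders only -- substitute the
# bounds suggested in the documentation.
illustrative_bounds = [(0., 10.), (0., 10.)]  # assumption, for illustration only
de_result = opt.differential_evolution(mystic.models.zimmermann, illustrative_bounds)
print(de_result.x)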
    
    
EXERCISE: Do the same for the fosc3d function found at mystic.models.fosc3d, using the bounds suggested by the documentation, if your chosen solver accepts bounds or constraints.
In [3]:
    
import scipy.optimize as opt
import mystic.models
result = opt.minimize(mystic.models.fosc3d, [-5., 0.5], method='powell')
print(result.x)
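
# A sketch using mystic's own Powell solver, which accepts a `bounds` keyword, from
# the same start point. The bounds below are illustrative placeholders -- use the
# ones given in the fosc3d documentation.
from mystic.solvers import fmin_powell
illustrative_bounds = [(-5., 5.), (-5., 5.)]  # assumption, for illustration only
solution = fmin_powell(mystic.models.fosc3d, [-5., 0.5], bounds=illustrative_bounds)
print(solution)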
    
    
EXERCISE: Use mystic to find the minimum for the peaks test function, with the bounds specified by the mystic.models.peaks documentation.
In [4]:
    
import mystic.solvers
import mystic.models
result = mystic.solvers.fmin_powell(mystic.models.peaks, [0., -2.], bounds=[(-5.,5.)]*2)
print(result)
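
# A hedged cross-check: peaks has several local minima, so a global solver run over
# the same bounds is a reasonable comparison (diffev2 accepts bounds for x0 as well).
from mystic.solvers import diffev2
global_result = diffev2(mystic.models.peaks, x0=[(-5., 5.)]*2, bounds=[(-5., 5.)]*2,
                        npop=40, disp=False)
print(global_result)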
    
    
EXERCISE: Use mystic to do a fit to the noisy data in the scipy.optimize.curve_fit example (the least squares fit).
In [5]:
    
import numpy as np
import scipy.stats as stats
from mystic.solvers import fmin_powell
from mystic import reduced
# Define the function to fit.
def function(coeffs, x):
    a,b,f,phi = coeffs
    return a * np.exp(-b * np.sin(f * x + phi))
# Create a noisy data set around the actual parameters
true_params = [3, 2, 1, np.pi/4]
print("target parameters: {}".format(true_params))
x = np.linspace(0, 2*np.pi, 25)
exact = function(true_params, x)
noisy = exact + 0.3*stats.norm.rvs(size=len(x))
# Define an objective that fits against the noisy data
@reduced(lambda x,y: abs(x)+abs(y))
def objective(coeffs, x, y):
    return function(coeffs, x) - y
# Use mystic's fmin_powell to estimate the function parameters from the noisy data.
initial_guess = [1,1,1,1]
args = (x, noisy)
estimated_params = fmin_powell(objective, initial_guess, args=args)
print("solved parameters: {}".format(estimated_params))
    
    
EXERCISE: Solve the chebyshev8.cost example exactly, by applying the knowledge that the last term in the chebyshev polynomial will always be one. Use numpy.round or mystic.constraints.integers to constrain solutions to the set of integers.  Does using mystic.suppressed to suppress small numbers accelerate the solution?
In [6]:
    
# Differential Evolution solver
from mystic.solvers import DifferentialEvolutionSolver2
# Chebyshev polynomial and cost function
from mystic.models.poly import chebyshev8, chebyshev8cost
from mystic.models.poly import chebyshev8coeffs
# tools
from mystic.termination import VTR, CollapseAt, Or
from mystic.strategy import Best1Exp
from mystic.monitors import VerboseMonitor
from mystic.tools import random_seed
from mystic.math import poly1d
import numpy as np
if __name__ == '__main__':
    print("Differential Evolution")
    print("======================")
    ndim = 9
    random_seed(123)
    # configure monitor
    stepmon = VerboseMonitor(50,50)
    # build a constraints function
    def constraints(x):
        x[-1] = 1.           # the trailing Chebyshev coefficient is always 1
        return np.round(x)   # restrict all coefficients to integers
    stop = Or(VTR(0.0001), CollapseAt(0.0, generations=2))
    # use DE to solve 8th-order Chebyshev coefficients
    npop = 10*ndim
    solver = DifferentialEvolutionSolver2(ndim,npop)
    solver.SetRandomInitialPoints(min=[-100]*ndim, max=[100]*ndim)
    solver.SetGenerationMonitor(stepmon)
    solver.SetConstraints(constraints)
    solver.enable_signal_handler()
    solver.Solve(chebyshev8cost, termination=stop, strategy=Best1Exp, \
                 CrossProbability=1.0, ScalingFactor=0.9)
    solution = solver.Solution()
    # use monitor to retrieve results information
    iterations = len(stepmon)
    cost = stepmon.y[-1]
    print("Generation %d has best Chi-Squared: %f" % (iterations, cost))
    # use pretty print for polynomials
    print(poly1d(solution))
    # compare solution with actual 8th-order Chebyshev coefficients
    print("\nActual Coefficients:\n %s\n" % poly1d(chebyshev8coeffs))
    
    
EXERCISE: Replace the symbolic constraints in the following "Pressure Vessel Design" code with explicit penalty functions (i.e. use a compound penalty built with mystic.penalty.quadratic_inequality).
In [7]:
    
"Pressure Vessel Design"
def objective(x):
    x0,x1,x2,x3 = x
    return 0.6224*x0*x2*x3 + 1.7781*x1*x2**2 + 3.1661*x0**2*x3 + 19.84*x0**2*x2
bounds = [(0,1e6)]*4
# with penalty='penalty' applied, solution is:
xs = [0.72759093, 0.35964857, 37.69901188, 240.0]
ys = 5804.3762083
from mystic.constraints import as_constraint
from mystic.penalty import quadratic_inequality
def penalty1(x): # <= 0.0
    return -x[0] + 0.0193*x[2]
def penalty2(x): # <= 0.0
    return -x[1] + 0.00954*x[2]
def penalty3(x): # <= 0.0
    from math import pi
    return -pi*x[2]**2*x[3] - (4/3.)*pi*x[2]**3 + 1296000.0
def penalty4(x): # <= 0.0
    return x[3] - 240.0
@quadratic_inequality(penalty1, k=1e12)
@quadratic_inequality(penalty2, k=1e12)
@quadratic_inequality(penalty3, k=1e12)
@quadratic_inequality(penalty4, k=1e12)
def penalty(x):
    return 0.0
if __name__ == '__main__':
    from mystic.solvers import diffev2
    from mystic.math import almostEqual
    result = diffev2(objective, x0=bounds, bounds=bounds, penalty=penalty,
                     npop=40, gtol=500, disp=True, full_output=True)
    print(result[0])
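
# For comparison, a sketch of the symbolic form that the explicit penalties above
# replace: the same four conditions written as strings and converted to a compound
# penalty with mystic.symbolic. The strings paraphrase penalty1..penalty4 and are
# illustrative; pi is written out numerically.
from mystic.symbolic import generate_conditions, generate_penalty
equations = """
-x0 + 0.0193*x2 <= 0.0
-x1 + 0.00954*x2 <= 0.0
-3.14159265358979*x2**2*x3 - (4./3.)*3.14159265358979*x2**3 + 1296000.0 <= 0.0
x3 - 240.0 <= 0.0
"""
pf = generate_penalty(generate_conditions(equations), k=1e12)
# pf could be passed to diffev2 via penalty=pf, just like 'penalty' above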
    
    
EXERCISE: Solve the cvxopt "qp" example with mystic.  Use symbolic constraints, penalty functions, or constraints operators.  If you get it quickly, do all three methods.
In [8]:
    
def objective(x):
    x0,x1 = x
    return 2*x0**2 + x1**2 + x0*x1 + x0 + x1
bounds = [(0.0, None),(0.0, None)]
# the known solution (from the cvxopt qp example) is:
xs = [0.25, 0.75]
ys = 1.875
from mystic.math.measures import normalize
def constraint(x): # impose x0 + x1 = 1 exactly
    return normalize(x, 1.0)
if __name__ == '__main__':
    from mystic.solvers import diffev2, fmin_powell
    result = diffev2(objective, x0=bounds, bounds=bounds, npop=40,
                     constraints=constraint, disp=False, full_output=True)
    print(result[0])
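
# A sketch of a second method the exercise mentions: the equality x0 + x1 = 1 imposed
# as a penalty (via mystic.penalty.quadratic_equality) instead of a constraints
# operator; the k value here is illustrative.
from mystic.penalty import quadratic_equality
from mystic.solvers import diffev2

def equality(x):  # == 0.0
    return x[0] + x[1] - 1.0

@quadratic_equality(equality, k=1e4)
def penalty(x):
    return 0.0

penalized = diffev2(objective, x0=bounds, bounds=bounds, penalty=penalty, npop=40,
                    disp=False, full_output=True)
print(penalized[0])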
    
    
EXERCISE: Convert one of our previous mystic examples to use parallel computing.  Note that if the solver has a SetMapper method, it can take a parallel map.
In [9]:
    
from mystic.termination import VTR, ChangeOverGeneration, And, Or
stop = Or(And(VTR(), ChangeOverGeneration()), VTR(1e-8))
from mystic.models import rosen
from mystic.monitors import VerboseMonitor
from mystic.solvers import DifferentialEvolutionSolver2
from pathos.pools import ThreadPool
if __name__ == '__main__':
    solver = DifferentialEvolutionSolver2(3,40)
    solver.SetRandomInitialPoints([-10,-10,-10],[10,10,10])
    solver.SetGenerationMonitor(VerboseMonitor(10))
    solver.SetMapper(ThreadPool().map) #NOTE: evaluation of objective in parallel
    solver.SetTermination(stop)
    solver.SetObjective(rosen)
    solver.SetStrictRanges([-10,-10,-10],[10,10,10])
    solver.SetEvaluationLimits(generations=600)
    solver.Solve()
    print(solver.bestSolution)
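
# A sketch of the same run with a process-based pool, which can parallelize better
# than threads when the objective is CPU-bound; the SetMapper interface is identical
# and pathos provides both pool types.
from pathos.pools import ProcessPool
if __name__ == '__main__':
    psolver = DifferentialEvolutionSolver2(3, 40)
    psolver.SetRandomInitialPoints([-10]*3, [10]*3)
    psolver.SetGenerationMonitor(VerboseMonitor(10))
    psolver.SetMapper(ProcessPool().map)  # process-based parallel map
    psolver.SetTermination(stop)
    psolver.SetObjective(rosen)
    psolver.SetStrictRanges([-10]*3, [10]*3)
    psolver.SetEvaluationLimits(generations=600)
    psolver.Solve()
    print(psolver.bestSolution)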
    
    