In [1]:
%matplotlib inline
import matplotlib.pyplot as plt
import numpy as np
import scipy.optimize as opt
For this problem we are going to work with the following model:
$$ y_{model}(x) = a x^2 + b x + c $$
The true values of the model parameters are as follows:
In [2]:
a_true = 0.5
b_true = 2.0
c_true = -4.0
First, generate a dataset from this model using these parameters and the following characteristics:

- For the x data, use 30 uniformly spaced points between [-5, 5].
- Add Gaussian noise to the y values with a standard deviation of 2.0 (use the `size` argument of `np.random.normal`).

After you generate the data, make a plot of the raw data (use points).
In [8]:
# 30 uniformly spaced x points on [-5, 5]
X = np.linspace(-5, 5, 30)
# Evaluate the quadratic model and add Gaussian noise with sigma = 2.0
Y = a_true * X**2 + b_true * X + c_true + np.random.normal(0, 2.0, size=30)
f = plt.figure(figsize=(15, 10))
plt.plot(X, Y, "b*");
In [9]:
assert True # leave this cell for grading the raw data generation and plot
Now fit the model to the dataset to recover estimates for the model's parameters:
In [14]:
opt.curve_fit?
In [31]:
best_guess = [0, 0, 0]
popt, pcov = opt.curve_fit(lambda x, a, b, c: a*x**2 + b*x + c, X, Y, best_guess)
# Parameter errors are the square roots of the diagonal of the covariance matrix
perr = np.sqrt(np.diag(pcov))
print(list(zip(popt, perr)))
f = plt.figure(figsize=(15, 10))
plt.plot(X, Y, "b*");
plt.plot(X, popt[0]*X**2 + popt[1]*X + popt[2], "r-");
In [ ]:
assert True # leave this cell for grading the fit; should include a plot and printout of the parameters+errors