In [ ]:
%matplotlib inline
import matplotlib.pyplot as plt
import numpy as np
import scipy.optimize as opt
For this problem we are going to work with the following model:

$$ y_{model}(x) = a x^2 + b x + c $$

The true values of the model parameters are as follows:
In [ ]:
a_true = 0.5
b_true = 2.0
c_true = -4.0
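As a quick sanity check, the model can be expressed as a plain Python function (the name `y_model` here is just for illustration) and evaluated at a point by hand:

```python
def y_model(x, a, b, c):
    """Quadratic model y(x) = a*x**2 + b*x + c."""
    return a * x**2 + b * x + c

a_true, b_true, c_true = 0.5, 2.0, -4.0

# y(2) = 0.5*4 + 2*2 - 4 = 2.0
print(y_model(2.0, a_true, b_true, c_true))  # prints 2.0
```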
First, generate a dataset from this model using these parameters and the following characteristics (use the size argument of np.random.normal to draw the noise). After you generate the data, make a plot of the raw data (use points).
In [ ]:
# YOUR CODE HERE
raise NotImplementedError()
In [ ]:
assert True # leave this cell for grading the raw data generation and plot
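One possible sketch of the data-generation step is below. The exact dataset characteristics are not fully specified above, so the number of points, the x-range, and the noise level here are assumptions chosen for illustration:

```python
import numpy as np
import matplotlib.pyplot as plt

a_true, b_true, c_true = 0.5, 2.0, -4.0

# Assumed characteristics: 30 points on [-5, 5] with Gaussian y-noise
# of sigma = 2.0, drawn via the size argument of np.random.normal.
np.random.seed(0)
N, sigma = 30, 2.0
xdata = np.linspace(-5.0, 5.0, N)
ydata = (a_true * xdata**2 + b_true * xdata + c_true
         + np.random.normal(0.0, sigma, size=N))

# Plot the raw data as points
plt.plot(xdata, ydata, 'k.')
plt.xlabel('x')
plt.ylabel('y')
```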
Now fit the model to the dataset to recover estimates for the model's parameters:
In [ ]:
# YOUR CODE HERE
raise NotImplementedError()
In [ ]:
assert True # leave this cell for grading the fit; should include a plot and printout of the parameters+errors
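A minimal sketch of the fit using `scipy.optimize.curve_fit` follows. The dataset is regenerated inline here (with the same assumed characteristics as above) so the cell is self-contained; parameter uncertainties are taken as the square roots of the diagonal of the returned covariance matrix:

```python
import numpy as np
import scipy.optimize as opt

def model(x, a, b, c):
    """Quadratic model y(x) = a*x**2 + b*x + c."""
    return a * x**2 + b * x + c

# Hypothetical data, standing in for the dataset generated earlier
np.random.seed(0)
sigma = 2.0
xdata = np.linspace(-5.0, 5.0, 30)
ydata = model(xdata, 0.5, 2.0, -4.0) + np.random.normal(0.0, sigma, size=30)

# Fit, weighting each point by its (constant) measurement error
theta_best, theta_cov = opt.curve_fit(
    model, xdata, ydata,
    sigma=sigma * np.ones_like(xdata), absolute_sigma=True,
)
theta_err = np.sqrt(np.diag(theta_cov))

for name, val, err in zip('abc', theta_best, theta_err):
    print(f'{name} = {val:.3f} +/- {err:.3f}')
```

With well-behaved noise the recovered `a`, `b`, `c` should land within a few error bars of the true values 0.5, 2.0, -4.0.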