In [1]:
import pints
Next, we need a model: any class that implements the pints.ForwardModel interface.
Usually, you'd write a class for this purpose that wraps around whatever simulation package you want to use to generate your time series data, but you could also write a model in pure Python.
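For illustration, a minimal wrapper could look like the sketch below. This is not part of the tutorial's example: the class name and the exponential-decay stand-in are made up, but the two methods it defines, n_parameters and simulate, are the ones the pints.ForwardModel interface requires.
In [ ]:
import numpy as np
import pints

class MyModel(pints.ForwardModel):
    """Minimal sketch of a wrapper around your own simulation code."""

    def n_parameters(self):
        # Tell Pints how many parameters the model takes
        return 2

    def simulate(self, parameters, times):
        # In a real wrapper you would call your simulation package here;
        # this stand-in just returns an exponential decay so the sketch runs.
        rate, scale = parameters
        return scale * np.exp(-rate * np.asarray(times))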
In this example, we use a logistic model provided by Pints's toy module.
In [3]:
import pints.toy as toy
model = toy.LogisticModel()
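For reference, the toy logistic model solves the standard logistic growth equation. Assuming the usual parameterisation (with an initial population size $p_0$ fixed when the model is constructed), the simulated values follow

$$\frac{\mathrm{d}p}{\mathrm{d}t} = r\,p\left(1 - \frac{p}{K}\right), \qquad p(t) = \frac{K p_0}{p_0 + (K - p_0)\,e^{-r t}},$$

where $r$ is the growth rate and $K$ is the carrying capacity.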
This model has two parameters: a growth rate (which determines the steepness of the curve) and a carrying capacity (which determines the value the curve converges to). For this example, we simply pick some nice values:
In [4]:
real_parameters = [0.015, 500]
Finally, we create a list of times (in a real experiment, these would be the times at which the time series was sampled):
In [5]:
import numpy as np
times = np.linspace(0, 1000, 1000)
We now have everything we need to run a simulation and generate some toy data:
In [6]:
values = model.simulate(real_parameters, times)
We can use Matplotlib (or any other plotting package) to have a look at the generated data:
In [7]:
import matplotlib.pyplot as plt
In [8]:
plt.figure(figsize=(14, 6))
plt.xlabel('Time')
plt.ylabel('Values')
plt.plot(times, values)
plt.show()
If you like, you can make the data more realistic at this point by adding some noise:
In [9]:
# Add independent Gaussian noise (standard deviation 10) to each sample
values += np.random.normal(size=values.shape) * 10
plt.figure(figsize=(14, 6))
plt.xlabel('Time')
plt.ylabel('Values')
plt.plot(times, values)
plt.show()
We now set up an optimisation, to see if we can recover our original parameters from this data.
First, we define a problem (in this case a single-valued time series fitting problem):
In [11]:
problem = pints.SingleOutputProblem(model, times, values)
We then define a score function on this problem:
In [12]:
score = pints.SumOfSquaresError(problem)
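As a quick, optional sanity check, we can compare this score against a hand-rolled sum of squared residuals; the sketch below assumes that this is exactly what SumOfSquaresError computes, which is what its name suggests:
In [ ]:
# Compare the Pints error measure against a manual sum of squares
residuals = model.simulate(real_parameters, times) - values
print(score(real_parameters))
print(np.sum(residuals ** 2))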
A lot of real problems have physical constraints on the values the parameters can take, so in this example we add them in the form of boundaries:
In [13]:
boundaries = pints.RectangularBoundaries([0, 200], [1, 1000])
Finally, we define an initial position to start the search from:
In [14]:
x0 = np.array([0.5, 500])
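Before running anything, it is worth checking that this starting point actually lies inside the boundaries; pints.Boundaries objects provide a check method for this.
In [ ]:
# x0 must lie inside the boundaries we defined above
print(boundaries.check(x0))    # Should print True
We can now run the optimisation: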
In [15]:
found_parameters, found_value = pints.optimise(
    score,
    x0,
    boundaries=boundaries,
    method=pints.XNES,
)
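pints.optimise is a convenience wrapper; if you need more control (over stopping criteria, logging, and so on), Pints also offers an OptimisationController. The snippet below is a sketch of that route and should give the same kind of result as the call above:
In [ ]:
# Equivalent optimisation via the controller interface (sketch)
opt = pints.OptimisationController(
    score, x0, boundaries=boundaries, method=pints.XNES)
opt.set_max_iterations(1000)
x_best, f_best = opt.run()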
In our toy example, we can compare the found parameters with the known true parameters:
In [16]:
print('Score at true solution:')
print(score(real_parameters))
print('Found solution:           True parameters:')
for k, x in enumerate(found_parameters):
    print(pints.strfloat(x) + '    ' + pints.strfloat(real_parameters[k]))
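The optimiser also returned the score at the found solution (found_value), which we can set next to the score at the true parameters printed above:
In [ ]:
print('Score at found solution:')
print(found_value)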
We can also plot the score function along a line between the found solution and the true parameters:
In [17]:
import pints.plot
fig, axes = pints.plot.function_between_points(score, point_1=found_parameters, point_2=real_parameters)
axes.set_ylabel('score')
plt.show()
In real life, we might assess the fit by running a simulation with the found parameters and comparing it with the data:
In [18]:
values2 = model.simulate(found_parameters, times)
plt.figure(figsize=(14, 6))
plt.xlabel('Time')
plt.ylabel('Values')
plt.plot(times, values, label='Experimental data')
plt.plot(times, values2, label='Simulated data')
plt.legend()
plt.show()