FitzHugh-Nagumo simplified action-potential model

This example shows how the FitzHugh-Nagumo simplified action potential (AP) model can be used.

The model is a simplified, state-reduced version of the original squid-axon model by Hodgkin and Huxley. It has two state variables: a voltage-like variable and a recovery variable.
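As a sketch of the governing equations (following the form given in the PINTS documentation for pints.toy.FitzhughNagumoModel; check the class docstring for the exact parameterisation), the two states evolve as

$$\frac{dV}{dt} = \left(V - \frac{V^3}{3} + R\right) c, \qquad \frac{dR}{dt} = -\frac{1}{c}\left(V - a + bR\right),$$

where $a$, $b$, and $c$ are the three model parameters varied in this example.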


In [1]:
import pints
import pints.toy
import numpy as np
import matplotlib.pyplot as plt

# Create a model
model = pints.toy.FitzhughNagumoModel()

# Run a simulation
parameters = [0.1, 0.5, 3]
times = np.linspace(0, 20, 200)
values = model.simulate(parameters, times)

# Plot the results
plt.figure()
plt.xlabel('Time')
plt.ylabel('Value')
plt.plot(times, values)
plt.legend(['Voltage', 'Recovery'])
plt.show()


With these parameters, the model produces wide AP waveforms that are more reminiscent of muscle cells than of neurons.

We now set up a simple optimisation problem with the model.


In [2]:
# First add some noise
sigma = 0.5
noisy = values + np.random.normal(0, sigma, values.shape)

# Plot the results
plt.figure()
plt.xlabel('Time')
plt.ylabel('Noisy values')
plt.plot(times, noisy)
plt.show()


Next, we set up a problem. Because this model has two outputs, we use a MultiOutputProblem.


In [3]:
problem = pints.MultiOutputProblem(model, times, noisy)
score = pints.SumOfSquaresError(problem)
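As a quick sanity check (a minimal sketch using the arrays defined above), each model output occupies one column of the data passed to the problem:

In [ ]:
# Both the clean and noisy data have shape (n_times, n_outputs) = (200, 2);
# the sum-of-squares error is computed over all outputs together.
print(values.shape, noisy.shape)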

Finally, we choose a wide set of boundaries and run!


In [4]:
# Select boundaries
boundaries = pints.RectangularBoundaries([0., 0., 0.], [10., 10., 10.])

# Select a starting point
x0 = [1, 1, 1]

# Perform an optimisation
found_parameters, found_value = pints.optimise(score, x0, boundaries=boundaries)

print('Score at true solution:')
print(score(parameters))

print('Found solution:          True parameters:')
for k, x in enumerate(found_parameters):
    print(pints.strfloat(x) + '    ' + pints.strfloat(parameters[k]))


Minimising error measure
Using Covariance Matrix Adaptation Evolution Strategy (CMA-ES)
Running in sequential mode.
Population size: 7
Iter. Eval. Best      Time m:s
0     7      943.7289   0:00.2
1     14     908.4755   0:00.2
2     21     908.4755   0:00.3
3     28     190.9373   0:00.3
20    147    152.225    0:01.7
40    287    100.9938   0:03.9
60    427    99.87823   0:06.2
80    567    99.87628   0:08.5
100   707    99.87628   0:10.8
120   847    99.87628   0:13.0
140   987    99.87628   0:15.2
160   1127   99.87628   0:17.4
180   1267   99.87628   0:19.6
200   1407   99.87628   0:21.9
220   1547   99.87628   0:24.1
240   1687   99.87628   0:26.3
260   1827   99.87628   0:28.6
280   1967   99.87628   0:30.9
300   2107   99.87628   0:33.1
320   2247   99.87628   0:35.4
340   2387   99.87628   0:37.6
360   2527   99.87628   0:39.8
380   2667   99.87628   0:42.1
400   2807   99.87628   0:44.4
420   2947   99.87628   0:46.6
440   3087   99.87628   0:48.9
451   3157   99.87628   0:50.0
Halting: No significant change for 200 iterations.
Score at true solution:
99.9625517283
Found solution:          True parameters:
 9.17776261399958215e-02     1.00000000000000006e-01
 5.26806733867043353e-01     5.00000000000000000e-01
 2.98412261791647238e+00     3.00000000000000000e+00

In [5]:
# Plot the results
plt.figure()
plt.xlabel('Time')
plt.ylabel('Values')
plt.plot(times, noisy, '-', alpha=0.25, label='noisy signal')
plt.plot(times, values, alpha=0.4, lw=5, label='original signal')
plt.plot(times, problem.evaluate(found_parameters), 'k--', label='recovered signal')
plt.legend()
plt.show()


This shows that the parameters are not recovered exactly, but the simulated traces still overlap strongly.
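To quantify this (a small sketch using the variables defined above; not part of the original run), we can print the relative error of each recovered parameter:

In [ ]:
# Relative error of each recovered parameter versus the true value
for x, x_true in zip(found_parameters, parameters):
    print('Relative error: {:.1%}'.format(abs(x - x_true) / x_true))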

Sampling with Monomial-Gamma HMC

The FitzHugh-Nagumo model calculates sensitivities via the forward sensitivity approach, so we can use gradient-based samplers such as Monomial-Gamma HMC (slower per iteration, although perhaps not per effective sample per second!).
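As a quick illustration (a sketch assuming the evaluateS1 interface described in the PINTS documentation; not executed in the original notebook), the sensitivities can be queried directly from the problem:

In [ ]:
# evaluateS1 returns the simulated values plus their sensitivities; for a
# multi-output problem the sensitivities are reported per time point,
# output, and parameter.
y, dy = problem.evaluateS1(parameters)
print(y.shape, dy.shape)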


In [16]:
import time

problem = pints.MultiOutputProblem(model, times, noisy)

# Create a log-likelihood function (adds two noise parameters, one per output!)
log_likelihood = pints.GaussianLogLikelihood(problem)

# Create a uniform prior over both the parameters and the new noise variables
log_prior = pints.UniformLogPrior(
    [0, 0, 0, 0, 0],
    [10, 10, 10, 20, 20]
)

# Create a posterior log-likelihood (log(likelihood * prior))
log_posterior = pints.LogPosterior(log_likelihood, log_prior)

# Choose starting points for 4 MCMC chains
real_parameters1 = np.array(parameters + [sigma, sigma])
xs = [
    real_parameters1 * 1.1,
    real_parameters1 * 0.9,
    real_parameters1 * 1.15,
    real_parameters1 * 1.5,
]

# Create mcmc routine
mcmc = pints.MCMCController(log_posterior, 4, xs, method=pints.MonomialGammaHamiltonianMCMC)

# Add stopping criterion
mcmc.set_max_iterations(200)
mcmc.set_log_interval(1)

# Run in parallel
mcmc.set_parallel(True)

for sampler in mcmc.samplers():
    sampler.set_leapfrog_step_size([0.05, 0.2, 0.2, 0.1, 0.1])
    sampler.set_leapfrog_steps(10)

# time start
start = time.time()

# Run!
print('Running...')
chains = mcmc.run()
print('Done!')

# End timing (use a new name, so we don't shadow the `time` module)
end = time.time()
elapsed_time = end - start


Running...
Using Monomial-Gamma Hamiltonian Monte Carlo
Generating 4 chains.
Running in parallel with 4 worker processes.
Iter. Eval. Accept.   Accept.   Accept.   Accept.   Time m:s
0     4      0         0         0         0          0:14.6
1     44     0.333     0.333     0.333     0.333      0:15.5
2     84     0.5       0.5       0.5       0.5        0:16.3
3     124    0.6       0.6       0.6       0.6        0:17.1
4     164    0.667     0.667     0.667     0.667      0:18.0
5     204    0.714     0.714     0.714     0.714      0:18.8
6     244    0.75      0.75      0.75      0.625      0:19.7
7     284    0.667     0.778     0.778     0.556      0:20.5
8     324    0.7       0.8       0.8       0.6        0:21.3
9     364    0.727     0.818     0.818     0.636      0:22.2
10    404    0.75      0.833     0.833     0.667      0:23.0
11    444    0.769     0.846     0.846     0.692      0:23.8
12    484    0.786     0.857     0.857     0.714      0:24.7
13    524    0.8       0.867     0.867     0.733      0:25.5
14    564    0.8125    0.875     0.875     0.75       0:26.3
15    604    0.824     0.882     0.882     0.765      0:27.2
16    644    0.833     0.889     0.889     0.778      0:28.0
17    684    0.842     0.895     0.895     0.789      0:28.8
18    724    0.85      0.9       0.9       0.8        0:29.7
19    764    0.857     0.905     0.905     0.81       0:30.5
20    804    0.864     0.909     0.909     0.818      0:31.3
21    844    0.87      0.913     0.913     0.826087   0:32.1
22    884    0.875     0.917     0.917     0.833      0:32.9
23    924    0.88      0.92      0.92      0.84       0:33.7
24    964    0.885     0.923     0.923     0.846      0:34.6
25    1004   0.889     0.926     0.926     0.852      0:35.4
26    1044   0.893     0.929     0.929     0.857      0:36.2
27    1084   0.897     0.931     0.931     0.862069   0:37.0
28    1124   0.9       0.933     0.933     0.867      0:37.8
29    1164   0.871     0.935     0.935     0.871      0:38.6
30    1204   0.875     0.9375    0.9375    0.875      0:39.4
31    1244   0.879     0.939     0.939     0.879      0:40.3
32    1284   0.882     0.941     0.941     0.882      0:41.1
33    1324   0.886     0.943     0.943     0.886      0:41.9
34    1364   0.889     0.944     0.944     0.889      0:42.7
35    1404   0.892     0.946     0.946     0.892      0:43.6
36    1444   0.895     0.947     0.947     0.895      0:44.4
37    1484   0.897     0.949     0.949     0.872      0:45.2
38    1524   0.9       0.95      0.95      0.875      0:46.0
39    1564   0.902439  0.951     0.951     0.878      0:46.8
40    1604   0.905     0.952381  0.952381  0.881      0:47.6
41    1644   0.907     0.953     0.953     0.884      0:48.4
42    1684   0.909     0.955     0.955     0.886      0:49.3
43    1724   0.911     0.956     0.956     0.889      0:50.1
44    1764   0.913     0.957     0.957     0.891      0:50.9
45    1804   0.915     0.957     0.957     0.893617   0:51.7
46    1844   0.917     0.958     0.958     0.896      0:52.5
47    1884   0.918     0.959     0.959     0.898      0:53.3
48    1924   0.92      0.96      0.96      0.9        0:54.1
49    1964   0.922     0.961     0.961     0.902      0:54.9
50    2004   0.923     0.962     0.962     0.904      0:56.0
51    2044   0.925     0.962     0.962     0.906      0:56.8
52    2084   0.926     0.962963  0.962963  0.907      0:57.6
53    2124   0.927     0.964     0.964     0.909      0:58.3
54    2164   0.929     0.964     0.964     0.911      0:59.1
55    2204   0.93      0.965     0.965     0.912      0:59.9
56    2244   0.931     0.966     0.966     0.914      1:00.7
57    2284   0.932     0.966     0.966     0.915      1:01.5
58    2324   0.933     0.967     0.967     0.917      1:02.3
59    2364   0.934     0.967     0.967     0.918      1:03.1
60    2404   0.935     0.968     0.968     0.919      1:03.9
61    2444   0.937     0.968254  0.968254  0.921      1:04.7
62    2484   0.9375    0.96875   0.96875   0.921875   1:05.5
63    2524   0.938     0.969     0.969     0.923      1:06.3
64    2564   0.939     0.969697  0.969697  0.924      1:07.2
65    2604   0.94      0.97      0.97      0.925      1:08.0
66    2644   0.941     0.971     0.971     0.926      1:08.8
67    2684   0.942029  0.971     0.971     0.928      1:09.5
68    2724   0.943     0.971     0.957     0.929      1:10.3
69    2764   0.943662  0.971831  0.958     0.93       1:11.1
70    2804   0.944     0.972     0.958     0.931      1:11.9
71    2844   0.945     0.973     0.959     0.932      1:12.8
72    2884   0.946     0.972973  0.959     0.932      1:13.6
73    2924   0.947     0.973     0.96      0.933      1:14.3
74    2964   0.947     0.974     0.961     0.934      1:15.2
75    3004   0.948     0.974026  0.961039  0.935      1:16.0
76    3044   0.949     0.974359  0.962     0.936      1:16.8
77    3084   0.949     0.975     0.962     0.937      1:17.6
78    3124   0.95      0.975     0.9625    0.9375     1:18.4
79    3164   0.951     0.975     0.962963  0.938      1:19.2
80    3204   0.951     0.976     0.963     0.939      1:20.0
81    3244   0.952     0.976     0.964     0.939759   1:20.8
82    3284   0.952381  0.976     0.964     0.94       1:21.6
83    3324   0.953     0.976     0.965     0.941      1:22.4
84    3364   0.953     0.977     0.965     0.942      1:23.2
85    3404   0.954023  0.977     0.966     0.943      1:24.0
86    3444   0.955     0.977     0.966     0.943      1:24.8
87    3484   0.955     0.978     0.966     0.944      1:25.5
88    3524   0.956     0.978     0.967     0.944      1:26.3
89    3564   0.956044  0.978022  0.967033  0.945      1:27.1
90    3604   0.957     0.978     0.967     0.946      1:27.9
91    3644   0.957     0.978     0.968     0.946      1:28.7
92    3684   0.957     0.979     0.968     0.947      1:29.5
93    3724   0.958     0.979     0.958     0.947      1:30.3
94    3764   0.958     0.979     0.958     0.948      1:31.1
95    3804   0.959     0.979     0.959     0.948      1:31.9
96    3844   0.959     0.98      0.959     0.949      1:32.7
97    3884   0.959596  0.979798  0.959596  0.949      1:33.5
98    3924   0.96      0.98      0.96      0.95       1:34.3
99    3964   0.960396  0.980198  0.960396  0.950495   1:35.1
100   4004   0.961     0.98      0.961     0.951      1:36.0
101   4044   0.961165  0.981     0.961165  0.951      1:36.8
102   4084   0.962     0.981     0.962     0.952      1:37.7
103   4124   0.962     0.981     0.962     0.952381   1:38.5
104   4164   0.962     0.981     0.962     0.953      1:39.3
105   4204   0.963     0.981     0.963     0.953271   1:40.1
106   4244   0.962963  0.981     0.962963  0.954      1:40.9
107   4284   0.963     0.982     0.963     0.954      1:41.7
108   4324   0.964     0.982     0.964     0.955      1:42.5
109   4364   0.963964  0.981982  0.963964  0.954955   1:43.3
110   4404   0.964     0.982     0.964     0.955      1:44.2
111   4444   0.965     0.982     0.965     0.956      1:45.0
112   4484   0.965     0.982     0.965     0.956      1:45.8
113   4524   0.965     0.983     0.965     0.957      1:46.6
114   4564   0.966     0.983     0.966     0.957      1:47.4
115   4604   0.965812  0.982906  0.965812  0.957265   1:48.2
116   4644   0.966     0.983     0.966     0.958      1:49.0
117   4684   0.966     0.983     0.966     0.958      1:49.8
118   4724   0.967     0.983     0.967     0.958      1:50.6
119   4764   0.967     0.983     0.967     0.959      1:51.4
120   4804   0.967     0.984     0.967     0.959      1:52.2
121   4844   0.967     0.984     0.967     0.959      1:53.0
122   4884   0.968     0.983871  0.968     0.96       1:53.8
123   4924   0.968     0.984     0.968     0.96       1:54.6
124   4964   0.968254  0.984127  0.968254  0.96       1:55.4
125   5004   0.969     0.984252  0.969     0.961      1:56.2
126   5044   0.96875   0.984375  0.96875   0.961      1:57.0
127   5084   0.969     0.984     0.969     0.961      1:57.8
128   5124   0.969     0.985     0.969     0.962      1:58.6
129   5164   0.969     0.985     0.969     0.962      1:59.4
130   5204   0.969697  0.985     0.969697  0.962      2:00.2
131   5244   0.97      0.985     0.97      0.962406   2:01.0
132   5284   0.97      0.985     0.97      0.963      2:01.8
133   5324   0.97      0.985     0.97      0.962963   2:02.5
134   5364   0.971     0.985     0.963     0.963      2:03.3
135   5404   0.971     0.985     0.964     0.964      2:04.2
136   5444   0.971     0.986     0.964     0.964      2:05.0
137   5484   0.971223  0.986     0.964     0.964      2:05.8
138   5524   0.971     0.986     0.964     0.964      2:06.6
139   5564   0.972     0.986     0.964539  0.964539   2:07.4
140   5604   0.971831  0.986     0.965     0.965      2:08.1
141   5644   0.972028  0.986014  0.965035  0.965035   2:08.9
142   5684   0.972     0.986     0.965     0.965      2:09.7
143   5724   0.972     0.986     0.966     0.959      2:10.5
144   5764   0.973     0.986     0.966     0.959      2:11.3
145   5804   0.973     0.986     0.966     0.959      2:12.1
146   5844   0.972973  0.986     0.966     0.959      2:12.9
147   5884   0.973     0.987     0.966443  0.96       2:13.7
148   5924   0.973     0.987     0.967     0.96       2:14.5
149   5964   0.974     0.986755  0.967     0.96       2:15.3
150   6004   0.974     0.987     0.967     0.961      2:16.2
151   6044   0.967     0.987     0.967     0.961      2:17.0
152   6084   0.968     0.987013  0.968     0.961039   2:17.8
153   6124   0.968     0.987     0.968     0.961      2:18.5
154   6164   0.968     0.987     0.968     0.962      2:19.3
155   6204   0.968     0.981     0.968     0.962      2:20.1
156   6244   0.968     0.981     0.968     0.962      2:20.9
157   6284   0.969     0.981     0.969     0.962      2:21.7
158   6324   0.96875   0.98125   0.96875   0.9625     2:22.5
159   6364   0.969     0.981     0.969     0.963      2:23.3
160   6404   0.969     0.981     0.969     0.962963   2:24.1
161   6444   0.969     0.982     0.969     0.963      2:24.9
162   6484   0.97      0.982     0.97      0.963      2:25.7
163   6524   0.969697  0.982     0.969697  0.964      2:26.5
164   6564   0.97      0.982     0.97      0.964      2:27.3
165   6604   0.97      0.982     0.97      0.964      2:28.1
166   6644   0.97      0.982     0.97      0.964      2:28.9
167   6684   0.97      0.982     0.97      0.964497   2:29.8
168   6724   0.971     0.982     0.971     0.965      2:30.6
169   6764   0.971     0.982     0.971     0.965      2:31.4
170   6804   0.971     0.983     0.971     0.965      2:32.2
171   6844   0.971     0.982659  0.971     0.965      2:33.0
172   6884   0.971     0.983     0.971     0.966      2:33.8
173   6924   0.971     0.983     0.971     0.966      2:34.6
174   6964   0.972     0.983     0.972     0.966      2:35.5
175   7004   0.972     0.983     0.972     0.966      2:36.3
176   7044   0.972     0.983     0.972     0.966      2:37.1
177   7084   0.972067  0.983     0.966     0.966      2:37.9
178   7124   0.972     0.983     0.967     0.967      2:38.7
179   7164   0.972     0.983     0.967     0.961326   2:39.6
180   7204   0.973     0.984     0.967033  0.956044   2:40.4
181   7244   0.973     0.984     0.967     0.956      2:41.2
182   7284   0.973     0.984     0.967     0.957      2:42.0
183   7324   0.972973  0.984     0.968     0.957      2:42.8
184   7364   0.973     0.983871  0.968     0.957      2:43.6
185   7404   0.973262  0.984     0.968     0.957      2:44.5
186   7444   0.973     0.984     0.968     0.957      2:45.3
187   7484   0.973545  0.984127  0.968254  0.957672   2:46.1
188   7524   0.974     0.984     0.968     0.958      2:46.9
189   7564   0.973822  0.984     0.969     0.958      2:47.7
190   7604   0.974     0.984375  0.96875   0.958      2:48.5
191   7644   0.974     0.984456  0.969     0.959      2:49.4
192   7684   0.974     0.985     0.969     0.959      2:50.2
193   7724   0.974359  0.985     0.969     0.959      2:51.0
194   7764   0.974     0.985     0.969     0.959      2:51.8
195   7804   0.975     0.98      0.97      0.959      2:52.6
196   7844   0.975     0.979798  0.969697  0.959596   2:53.4
197   7884   0.975     0.98      0.97      0.959799   2:54.2
198   7924   0.975     0.98      0.97      0.96       2:55.1
199   7964   0.975     0.98      0.97      0.960199   2:55.9
200   7964   0.975     0.98      0.97      0.960199   2:55.9
Halting: Maximum number of iterations (200) reached.
Done!

In [18]:
import pints.plot
pints.plot.trace(chains)
plt.show()


Print results.


In [17]:
results = pints.MCMCSummary(chains=chains, time=elapsed_time, parameter_names=["a", "b", "c", "sigma_V", "sigma_R"])
print(results)


param    mean    std.    2.5%    25%    50%    75%    97.5%    rhat    ess     ess per sec.
-------  ------  ------  ------  -----  -----  -----  -------  ------  ------  --------------
a        0.08    0.02    0.05    0.07   0.09   0.10   0.12     1.03    109.80  0.62
b        0.56    0.08    0.37    0.52   0.57   0.61   0.67     1.00    41.35   0.23
c        2.95    0.14    2.82    2.89   2.94   2.98   3.13     1.01    43.38   0.25
sigma_V  0.56    0.07    0.50    0.53   0.55   0.57   0.69     1.03    43.73   0.25
sigma_R  0.50    0.04    0.45    0.48   0.49   0.51   0.57     1.00    84.21   0.48
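All rhat values are close to 1, suggesting the four chains have mixed reasonably well, although the modest effective sample sizes (ess) indicate relatively few independent samples per parameter.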

Plot a few posterior predictive simulations against the data.


In [19]:
import pints.plot
pints.plot.series(np.vstack(chains), problem)
plt.show()