Noise Model Diagnostics: Autocorrelation of the Residuals

This example shows how to use autocorrelation plots of the residuals to check the assumptions of the noise model.

Three cases are shown. In the first two, optimisation is used to obtain a best-fit parameter vector for a single-output problem: in the first the noise is correctly specified, while in the second it is misspecified. The third case demonstrates the same method on a multiple-output problem with Bayesian inference.
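
Throughout, the residuals are the differences between the data and the model output at the fitted (or sampled) parameters, and their lag-k autocorrelation is estimated with the usual sample estimator:

$$
r_t = y_t - f(t;\hat\theta), \qquad
\hat\rho_k = \frac{\sum_{t=1}^{n-k} (r_t - \bar r)(r_{t+k} - \bar r)}{\sum_{t=1}^{n} (r_t - \bar r)^2} .
$$

If the noise really is independent, these autocorrelations should be close to zero at every lag k ≥ 1, up to sampling variability; sizeable values at one or more lags indicate that the noise model is suspect.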

Case 1: Correctly specified noise

For the first example, we will use optimisation to obtain the best-fit parameter vector. See Optimisation First Example for more details. We begin with a problem in which the noise is correctly specified: the data are generated with independent Gaussian noise, and the sum-of-squares error used for fitting implicitly assumes the same.


In [5]:
from __future__ import print_function
import pints
import pints.toy as toy
import pints.plot
import numpy as np
import matplotlib.pyplot as plt

# Use the toy logistic model
model = toy.LogisticModel()

real_parameters = [0.015, 500]
times = np.linspace(0, 1000, 100)
org_values = model.simulate(real_parameters, times)

# Add independent Gaussian noise
noise = 50
values = org_values + np.random.normal(0, noise, org_values.shape)

# Set up the problem and run the optimisation 
problem = pints.SingleOutputProblem(model, times, values)

score = pints.SumOfSquaresError(problem)
boundaries = pints.RectangularBoundaries([0, 200], [1, 1000])
x0 = np.array([0.5, 500])

found_parameters, found_value = pints.optimise(
    score,
    x0,
    boundaries=boundaries,
    method=pints.XNES,
    )

print('Score at true solution: ')
print(score(real_parameters))

print('Found solution:          True parameters:' )
for k, x in enumerate(found_parameters):
    print(pints.strfloat(x) + '    ' + pints.strfloat(real_parameters[k]))


Minimising error measure
Using Exponential Natural Evolution Strategy (xNES)
Running in sequential mode.
Population size: 6
Iter. Eval. Best      Time m:s
0     6      7064890    0:00.0
1     12     6184924    0:00.0
2     18     4864105    0:00.0
3     24     4864105    0:00.0
20    126    1653682    0:00.0
40    246    285069.4   0:00.1
60    366    250586.6   0:00.1
80    486    250569     0:00.1
100   606    250569     0:00.1
120   726    250569     0:00.2
140   846    250569     0:00.2
160   966    250569     0:00.2
180   1086   250569     0:00.2
200   1206   250569     0:00.3
220   1326   250569     0:00.3
240   1446   250569     0:00.3
260   1566   250569     0:00.3
280   1686   250569     0:00.4
300   1806   250569     0:00.4
320   1926   250569     0:00.4
340   2046   250569     0:00.4
360   2166   250569     0:00.5
380   2286   250569     0:00.5
400   2406   250569     0:00.5
406   2436   250569     0:00.5
Halting: No significant change for 200 iterations.
Score at true solution: 
267817.710741
Found solution:          True parameters:
 1.49241301904894375e-02     1.49999999999999994e-02
 5.19119837056626011e+02     5.00000000000000000e+02

Visualisation of the data

After obtaining these parameters, it is useful to visualise the data and the fit.


In [6]:
fig, ax = pints.plot.series(np.array([found_parameters]), problem, ref_parameters=real_parameters)
fig.set_size_inches(15, 7.5)
plt.show()


Plotting autocorrelation of the residuals

Next we look at the autocorrelation plot of the residuals to evaluate the noise model (using pints.residuals_diagnostics.plot_residuals_autocorrelation).


In [7]:
from pints.residuals_diagnostics import plot_residuals_autocorrelation

# Plot the autocorrelation
fig = plot_residuals_autocorrelation(np.array([found_parameters]),
                                     problem)

plt.show()


The figure shows no significant autocorrelation in the residuals. Therefore, the assumption of independent noise may be valid.
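
The plot gives a visual impression; a rough numerical version of the same check is sketched below. It relies on the standard large-sample result that, for white noise, the sample autocorrelations are approximately normal with standard deviation 1/sqrt(n), so values outside roughly ±1.96/sqrt(n) stand out. This is only a sketch of the idea, and the band it uses is not necessarily the one drawn by plot_residuals_autocorrelation.

# Rough check, assuming the usual white-noise approximation that sample
# autocorrelations are approximately N(0, 1/n)
residuals = values - problem.evaluate(found_parameters)
n = len(residuals)
r = residuals - np.mean(residuals)

# Sample autocorrelation at lags 1..10
acf = np.array([np.sum(r[:-k] * r[k:]) for k in range(1, 11)]) / np.sum(r ** 2)

bound = 1.96 / np.sqrt(n)  # approximate 95% band under independence
print('Lags outside the approximate 95% band:',
      list(np.where(np.abs(acf) > bound)[0] + 1))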

Case 2: Incorrectly specified noise

For the next case, we generate data with an AR(1) (first order autoregressive) noise model. However, we deliberately misspecify the model and assume independent Gaussian noise (as before) when fitting the parameters.
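
For reference, an AR(1) noise process takes the form

$$
e_t = \rho\, e_{t-1} + \nu_t ,
$$

where the innovations ν_t are independent Gaussians whose standard deviation is chosen here as σ√(1 − ρ²), so that the stationary standard deviation of e_t is σ. We believe this matches the parameterisation used by pints.noise.ar1, but check its documentation if the exact scaling matters. With ρ = 0.75, successive errors are strongly positively correlated, so runs of residuals will tend to lie on the same side of the fitted curve.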


In [8]:
import pints.noise

# Use the toy logistic model
model = toy.LogisticModel()

real_parameters = [0.015, 500]
times = np.linspace(0, 1000, 100)
org_values = model.simulate(real_parameters, times)

# Add AR(1) noise
rho = 0.75
sigma = 50
values = org_values + pints.noise.ar1(rho, sigma, len(org_values))

# Set up the problem and run the optimisation 
problem = pints.SingleOutputProblem(model, times, values)

score = pints.SumOfSquaresError(problem)
boundaries = pints.RectangularBoundaries([0, 200], [1, 1000])
x0 = np.array([0.5, 500])

found_parameters, found_value = pints.optimise(
    score,
    x0,
    boundaries=boundaries,
    method=pints.XNES,
    )

print('Score at true solution: ')
print(score(real_parameters))

print('Found solution:          True parameters:' )
for k, x in enumerate(found_parameters):
    print(pints.strfloat(x) + '    ' + pints.strfloat(real_parameters[k]))


Minimising error measure
Using Exponential Natural Evolution Strategy (xNES)
Running in sequential mode.
Population size: 6
Iter. Eval. Best      Time m:s
0     6      3584749    0:00.0
1     12     3584749    0:00.0
2     18     3455978    0:00.0
3     24     3319796    0:00.0
20    126    1336661    0:00.0
40    246    751489.9   0:00.1
60    366    339807.7   0:00.1
80    486    240680.3   0:00.1
100   606    240576.2   0:00.1
120   726    240576.2   0:00.2
140   846    240576.2   0:00.2
160   966    240576.2   0:00.2
180   1086   240576.2   0:00.2
200   1206   240576.2   0:00.3
220   1326   240576.2   0:00.3
240   1446   240576.2   0:00.3
260   1566   240576.2   0:00.3
280   1686   240576.2   0:00.4
300   1806   240576.2   0:00.4
320   1926   240576.2   0:00.4
340   2046   240576.2   0:00.4
360   2166   240576.2   0:00.5
380   2286   240576.2   0:00.5
393   2358   240576.2   0:00.5
Halting: No significant change for 200 iterations.
Score at true solution: 
297737.553666
Found solution:          True parameters:
 1.38395297101625570e-02     1.49999999999999994e-02
 4.97898768582010973e+02     5.00000000000000000e+02

Visualisation of the data

As before, we plot the data and the inferred trajectory.


In [9]:
fig, ax = pints.plot.series(np.array([found_parameters]), problem, ref_parameters=real_parameters)
fig.set_size_inches(15, 7.5)
plt.show()



In [10]:
# Plot the autocorrelation
fig = plot_residuals_autocorrelation(np.array([found_parameters]),
                                     problem)

plt.show()


Now the autocorrelation plot of the residuals shows high autocorrelation at small lags, which is typical of AR(1) noise. This visualisation therefore suggests that the assumption of independent Gaussian noise made during inference is invalid.
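
When this happens, one remedy is to refit using a noise model that allows for correlated errors. The sketch below uses PINTS's AR(1) log-likelihood (pints.AR1LogLikelihood); we assume it appends the autocorrelation parameter rho and the noise level sigma to the model parameters, so check its documentation for the exact parameterisation before relying on this ordering.

# Sketch only: refit Case 2 under an AR(1) noise assumption
log_likelihood = pints.AR1LogLikelihood(problem)

# Model parameters plus initial guesses for rho and sigma (assumed ordering)
x0 = np.array([0.5, 500, 0.5, 50])
boundaries = pints.RectangularBoundaries([0, 200, 0, 0], [1, 1000, 1, 200])

opt = pints.OptimisationController(
    log_likelihood, x0, boundaries=boundaries, method=pints.XNES)
opt.set_log_to_screen(False)
found_parameters_ar1, found_value_ar1 = opt.run()

# Re-examine the residual autocorrelation using only the model parameters
fig = plot_residuals_autocorrelation(
    np.array([found_parameters_ar1[:problem.n_parameters()]]), problem)
plt.show()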

Case 3: Multiple output Bayesian inference problem

The plot_residuals_autocorrelation function also works with Bayesian inference and multiple-output problems. For the final example, we demonstrate the same strategy in this setting.

For this example, the Lotka-Volterra model is used. See the Lotka-Volterra example for more details. As in Case 1, the data are generated with independent Gaussian noise.


In [11]:
import numpy as np
import matplotlib.pyplot as plt
import pints
import pints.toy

model = pints.toy.LotkaVolterraModel()

times = np.linspace(0, 3, 50)
parameters = model.suggested_parameters()
model.set_initial_conditions([2, 2])
org_values = model.simulate(parameters, times)

# Add noise
sigma = 0.1
values = org_values + np.random.normal(0, sigma, org_values.shape)

# Create an object with links to the model and time series
problem = pints.MultiOutputProblem(model, times, values)

# Create a log posterior
log_prior = pints.UniformLogPrior([0, 0, 0, 0, 0, 0], [6, 6, 6, 6, 1, 1])
log_likelihood = pints.GaussianLogLikelihood(problem)
log_posterior = pints.LogPosterior(log_likelihood, log_prior)

# Run MCMC on the noisy data, using three chains started from the same point
# (four model parameters followed by the two noise sigmas)
x0 = [[4, 1, 2, 3, .1, .1]] * 3
mcmc = pints.MCMCController(log_posterior, 3, x0)
mcmc.set_max_iterations(4000)

print('Running')
chains = mcmc.run()
print('Done!')


Running
Using Haario-Bardenet adaptive covariance MCMC
Generating 3 chains.
Running in sequential mode.
Iter. Eval. Accept.   Accept.   Accept.   Time m:s
0     3      0         0         0          0:00.0
1     6      0         0         0.5        0:00.0
2     9      0.333     0.333     0.333      0:00.0
3     12     0.25      0.5       0.25       0:00.0
20    63     0.286     0.476     0.476      0:00.2
40    123    0.317     0.537     0.463      0:00.4
60    183    0.311     0.442623  0.328      0:00.5
80    243    0.247     0.432     0.284      0:00.6
100   303    0.198     0.386     0.228      0:00.8
120   363    0.174     0.331     0.19       0:00.9
140   423    0.156     0.291     0.163      0:01.1
160   483    0.143     0.261     0.143      0:01.2
180   543    0.127     0.238     0.127      0:01.3
Initial phase completed.
200   603    0.124     0.219     0.114      0:01.5
220   663    0.127     0.217     0.122      0:01.6
240   723    0.153527  0.245     0.166      0:01.8
260   783    0.183908  0.268     0.183908   0:01.9
280   843    0.221     0.292     0.199      0:02.1
300   903    0.226     0.289     0.209      0:02.2
320   963    0.234     0.29      0.227      0:02.3
340   1023   0.243     0.296     0.232      0:02.5
360   1083   0.249     0.291     0.235      0:02.6
380   1143   0.255     0.283     0.239      0:02.8
400   1203   0.249     0.276808  0.232      0:02.9
420   1263   0.249     0.278     0.228      0:03.1
440   1323   0.24      0.276644  0.238      0:03.2
460   1383   0.234     0.273     0.234      0:03.4
480   1443   0.235     0.266     0.231      0:03.5
500   1503   0.236     0.261477  0.23       0:03.6
520   1563   0.228     0.259     0.23       0:03.8
540   1623   0.226     0.261     0.226      0:03.9
560   1683   0.225     0.257     0.223      0:04.1
580   1743   0.225     0.258     0.224      0:04.2
600   1803   0.221     0.255     0.22       0:04.4
620   1863   0.217     0.254     0.215781   0:04.5
640   1923   0.223     0.254     0.214      0:04.6
660   1983   0.219     0.257     0.214826   0:04.8
680   2043   0.219     0.263     0.217      0:04.9
700   2103   0.22      0.262     0.215      0:05.1
720   2163   0.223301  0.260749  0.214      0:05.2
740   2223   0.224     0.256     0.208      0:05.4
760   2283   0.225     0.255     0.206      0:05.5
780   2343   0.223     0.254     0.205      0:05.6
800   2403   0.222     0.256     0.205      0:05.8
820   2463   0.219     0.255     0.201      0:05.9
840   2523   0.219     0.249     0.203      0:06.1
860   2583   0.216     0.242741  0.202      0:06.2
880   2643   0.215     0.24      0.199773   0:06.3
900   2703   0.216     0.235     0.198      0:06.5
920   2763   0.214     0.23      0.197      0:06.6
940   2823   0.218     0.227     0.196      0:06.8
960   2883   0.219     0.226847  0.197      0:06.9
980   2943   0.221     0.225     0.198      0:07.1
1000  3003   0.221     0.227     0.196      0:07.2
1020  3063   0.223     0.227     0.198      0:07.4
1040  3123   0.224     0.225     0.198      0:07.5
1060  3183   0.222     0.226     0.2        0:07.7
1080  3243   0.226642  0.223     0.201      0:07.8
1100  3303   0.228     0.222     0.203      0:08.0
1120  3363   0.227     0.224     0.202      0:08.1
1140  3423   0.227     0.222     0.204      0:08.2
1160  3483   0.229     0.22      0.205857   0:08.4
1180  3543   0.228     0.218     0.207      0:08.5
1200  3603   0.226     0.217     0.207      0:08.7
1220  3663   0.229     0.217     0.206      0:08.8
1240  3723   0.228     0.22      0.204      0:09.0
1260  3783   0.227     0.217     0.204      0:09.1
1280  3843   0.226     0.214676  0.205      0:09.2
1300  3903   0.22598   0.213     0.202      0:09.4
1320  3963   0.225     0.211     0.204      0:09.5
1340  4023   0.224     0.21      0.204      0:09.7
1360  4083   0.22      0.209     0.204      0:09.8
1380  4143   0.219     0.209     0.204      0:09.9
1400  4203   0.218     0.21      0.206      0:10.1
1420  4263   0.215     0.209     0.206      0:10.2
1440  4323   0.214     0.21      0.207      0:10.4
1460  4383   0.215     0.21013   0.207      0:10.5
1480  4443   0.214     0.209     0.209318   0:10.7
1500  4503   0.215     0.21      0.212      0:10.8
1520  4563   0.214     0.209073  0.213      0:11.0
1540  4623   0.213     0.21      0.213      0:11.1
1560  4683   0.215     0.21      0.212      0:11.2
1580  4743   0.214     0.211     0.213      0:11.4
1600  4803   0.214     0.211     0.212      0:11.5
1620  4863   0.215     0.210364  0.215      0:11.7
1640  4923   0.216     0.21      0.216      0:11.8
1660  4983   0.217     0.21      0.217339   0:11.9
1680  5043   0.214     0.211     0.218      0:12.1
1700  5103   0.216     0.211     0.216      0:12.2
1720  5163   0.216     0.212     0.216      0:12.4
1740  5223   0.215     0.209     0.217      0:12.5
1760  5283   0.215     0.207     0.216      0:12.6
1780  5343   0.214     0.206     0.217      0:12.8
1800  5403   0.214     0.204     0.216      0:12.9
1820  5463   0.215     0.202     0.215      0:13.1
1840  5523   0.216     0.202     0.215      0:13.2
1860  5583   0.215     0.199     0.214      0:13.4
1880  5643   0.215311  0.199     0.216      0:13.5
1900  5703   0.216728  0.198     0.216202   0:13.6
1920  5763   0.218     0.197     0.217      0:13.8
1940  5823   0.218     0.198     0.216      0:13.9
1960  5883   0.217746  0.199     0.218256   0:14.1
1980  5943   0.218     0.197     0.218      0:14.2
2000  6003   0.217     0.197     0.217      0:14.4
2020  6063   0.216     0.195     0.217714   0:14.5
2040  6123   0.215     0.195     0.218      0:14.6
2060  6183   0.217     0.194     0.216885   0:14.8
2080  6243   0.218     0.194618  0.218      0:14.9
2100  6303   0.218     0.196     0.218      0:15.1
2120  6363   0.218     0.196     0.217      0:15.2
2140  6423   0.219     0.195     0.215787   0:15.4
2160  6483   0.218     0.19528   0.215      0:15.5
2180  6543   0.216     0.195     0.215      0:15.7
2200  6603   0.215     0.195     0.218      0:15.8
2220  6663   0.215     0.196     0.22       0:15.9
2240  6723   0.215     0.194556  0.22       0:16.1
2260  6783   0.216276  0.194     0.219372   0:16.2
2280  6843   0.217     0.194     0.219      0:16.4
2300  6903   0.216     0.195     0.218166   0:16.5
2320  6963   0.217     0.196     0.218      0:16.7
2340  7023   0.217     0.196     0.21871    0:16.8
2360  7083   0.216     0.197     0.218      0:16.9
2380  7143   0.215     0.199496  0.218      0:17.1
2400  7203   0.215     0.2       0.217      0:17.2
2420  7263   0.216     0.203     0.216      0:17.4
2440  7323   0.218     0.204     0.218      0:17.5
2460  7383   0.216985  0.204     0.218204   0:17.7
2480  7443   0.217     0.206     0.218      0:17.8
2500  7503   0.216     0.206     0.217      0:17.9
2520  7563   0.217     0.207854  0.216      0:18.1
2540  7623   0.215     0.209     0.216      0:18.2
2560  7683   0.216     0.21      0.216      0:18.4
2580  7743   0.216     0.21      0.217      0:18.5
2600  7803   0.215     0.211     0.216      0:18.6
2620  7863   0.214422  0.211     0.216      0:18.8
2640  7923   0.214     0.210905  0.21507    0:18.9
2660  7983   0.214581  0.210823  0.215      0:19.1
2680  8043   0.214     0.21      0.216      0:19.2
2700  8103   0.214     0.211     0.215      0:19.3
2720  8163   0.214     0.212     0.213892   0:19.5
2740  8223   0.214     0.214     0.214      0:19.6
2760  8283   0.214     0.213     0.213      0:19.8
2780  8343   0.215     0.213     0.214      0:19.9
2800  8403   0.215     0.213     0.214      0:20.0
2820  8463   0.216     0.214     0.216      0:20.2
2840  8523   0.215     0.216     0.217      0:20.3
2860  8583   0.214     0.216     0.216      0:20.5
2880  8643   0.214     0.216     0.217      0:20.6
2900  8703   0.214     0.216     0.216      0:20.8
2920  8763   0.214     0.217049  0.215      0:20.9
2940  8823   0.215     0.216253  0.215      0:21.0
2960  8883   0.215     0.216     0.216      0:21.2
2980  8943   0.214     0.216     0.216      0:21.3
3000  9003   0.216     0.216     0.217      0:21.5
3020  9063   0.216     0.216     0.217      0:21.6
3040  9123   0.215     0.217     0.219      0:21.8
3060  9183   0.217     0.217     0.22       0:21.9
3080  9243   0.217     0.217     0.220383   0:22.1
3100  9303   0.217     0.217     0.22       0:22.2
3120  9363   0.218     0.217     0.219      0:22.3
3140  9423   0.217     0.217765  0.219      0:22.5
3160  9483   0.217     0.217     0.219      0:22.6
3180  9543   0.217     0.217     0.22       0:22.8
3200  9603   0.217     0.216     0.22       0:22.9
3220  9663   0.218     0.216     0.22       0:23.0
3240  9723   0.217     0.215     0.22       0:23.2
3260  9783   0.218     0.216498  0.219      0:23.3
3280  9843   0.218     0.217     0.219      0:23.5
3300  9903   0.218     0.216601  0.22       0:23.6
3320  9963   0.218     0.217     0.22       0:23.8
3340  10023  0.218     0.218     0.22       0:23.9
3360  10083  0.219     0.218     0.21928    0:24.0
3380  10143  0.219     0.22      0.219      0:24.2
3400  10203  0.219     0.221     0.217289   0:24.3
3420  10263  0.219     0.221     0.217      0:24.5
3440  10323  0.218     0.220866  0.217      0:24.6
3460  10383  0.22      0.221     0.216      0:24.8
3480  10443  0.221     0.221     0.216      0:24.9
3500  10503  0.22      0.221651  0.217      0:25.0
3520  10563  0.22      0.22096   0.217      0:25.2
3540  10623  0.219     0.22      0.217      0:25.3
3560  10683  0.218     0.219     0.217      0:25.5
3580  10743  0.217     0.22      0.217      0:25.6
3600  10803  0.218     0.22      0.216      0:25.7
3620  10863  0.218     0.22      0.216      0:25.9
3640  10923  0.218     0.221     0.215      0:26.0
3660  10983  0.217     0.222     0.213876   0:26.2
3680  11043  0.217     0.222     0.214      0:26.3
3700  11103  0.217779  0.222     0.214      0:26.4
3720  11163  0.218     0.223     0.213921   0:26.6
3740  11223  0.217     0.222     0.213      0:26.7
3760  11283  0.217     0.222     0.213507   0:26.9
3780  11343  0.217     0.221     0.214      0:27.0
3800  11403  0.217     0.222     0.214      0:27.1
3820  11463  0.217     0.222     0.214      0:27.3
3840  11523  0.216     0.222     0.214      0:27.4
3860  11583  0.217     0.223     0.214      0:27.6
3880  11643  0.217     0.224169  0.214      0:27.7
3900  11703  0.216     0.224     0.215      0:27.8
3920  11763  0.216     0.223     0.215      0:28.0
3940  11823  0.215935  0.224     0.216      0:28.1
3960  11883  0.216     0.224     0.217      0:28.3
3980  11943  0.217     0.224     0.217      0:28.4
4000  12000  0.21675   0.224     0.21675    0:28.5
Halting: Maximum number of iterations (4000) reached.
Done!

Visualisation of the data

As before, we plot the data and the inferred trajectories.


In [12]:
# Get the first MCMC chain
chain1 = chains[0]

# Cut off the burn-in samples
chain1 = chain1[2500:]

fig, ax = pints.plot.series(chain1, problem, ref_parameters=parameters)
fig.set_size_inches(15, 7.5)
plt.show()



In [13]:
# Plot the autocorrelation
fig = plot_residuals_autocorrelation(chain1, problem)

plt.show()


The plot_residuals_autocorrelation function generates one residuals plot for each output. In addition, because Bayesian inference was performed and an MCMC chain was passed to the function, it summarises the posterior distribution of the residual autocorrelations at each lag: each dot marks the median autocorrelation over the MCMC samples, and the bars show the extent of the 95% posterior interval.

In both outputs, no significant autocorrelation in the residuals is seen, as expected since independent Gaussian noise was used to generate the data.
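
To make explicit what these summaries are, the sketch below recomputes them directly from the chain: for each (thinned) posterior sample we evaluate the model, form the residuals for each output, take their sample autocorrelations, and then summarise across samples with the median and a 95% interval. This is a minimal reconstruction for illustration only; plot_residuals_autocorrelation may compute the plotted values slightly differently.

# Sketch only: recompute the per-lag summaries shown in the plot
def sample_acf(x, max_lag=10):
    # Sample autocorrelation of a 1d array at lags 1..max_lag
    x = x - np.mean(x)
    return np.array(
        [np.sum(x[:-k] * x[k:]) for k in range(1, max_lag + 1)]) / np.sum(x ** 2)

# Thin the chain for speed and keep only the model parameters (the noise
# sigmas are assumed to occupy the last two columns)
model_params = chain1[::50, :model.n_parameters()]

# Residual autocorrelations, shape (n_samples, n_outputs, n_lags)
acfs = np.array([
    [sample_acf(values[:, j] - problem.evaluate(p)[:, j])
     for j in range(model.n_outputs())]
    for p in model_params])

medians = np.median(acfs, axis=0)
lower, upper = np.percentile(acfs, [2.5, 97.5], axis=0)
print('Median lag-1 autocorrelation per output:', medians[:, 0])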