Ellipsoidal nested sampling

This example demonstrates how to use ellipsoidal nested rejection sampling [1] to sample from the posterior distribution for a logistic model fitted to model-simulated data.

[1] "A nested sampling algorithm for cosmological model selection", Pia Mukherjee, David Parkinson and Andrew R. Liddle, arXiv:astro-ph/0508461v2.
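At its core, nested sampling keeps a population of "active" (live) points drawn from the prior, repeatedly replaces the lowest-likelihood point with a new prior draw at higher likelihood, and accumulates the evidence from the geometrically shrinking prior volume. The following minimal 1-d sketch uses plain rejection sampling from the prior (it is an illustration of the general scheme, not PINTS code):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1-d problem: uniform prior on [0, 1], narrow Gaussian log-likelihood
def log_l(theta):
    return -0.5 * ((theta - 0.5) / 0.05) ** 2

n_active, n_iter = 100, 600
active = rng.uniform(0, 1, n_active)
log_z = -np.inf

for i in range(1, n_iter + 1):
    # The worst active point sets the new likelihood contour ...
    worst = np.argmin(log_l(active))
    l_min = log_l(active[worst])
    # ... and contributes weight w_i = X_{i-1} - X_i, where the prior
    # volume shrinks geometrically: X_i ~ exp(-i / n_active)
    log_w = -(i - 1) / n_active + np.log(1 - np.exp(-1 / n_active))
    log_z = np.logaddexp(log_z, l_min + log_w)
    # Replace it by plain rejection sampling from the prior above the
    # contour; the ellipsoidal method makes this step far more efficient
    while True:
        prop = rng.uniform(0, 1)
        if log_l(prop) > l_min:
            active[worst] = prop
            break

# Add the contribution of the remaining active points
log_z = np.logaddexp(
    log_z,
    -n_iter / n_active - np.log(n_active) + np.logaddexp.reduce(log_l(active)))
print(log_z)  # should land near the true value log(0.05 * sqrt(2 pi)) ~ -2.08
```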

First, create some fake data by simulating the logistic model and adding Gaussian noise.


In [1]:
import pints
import pints.toy as toy
import numpy as np
import matplotlib.pyplot as plt

# Load a forward model
model = toy.LogisticModel()

# Create some toy data
r = 0.015
k = 500
real_parameters = [r, k]
times = np.linspace(0, 1000, 100)
signal_values = model.simulate(real_parameters, times)

# Add independent Gaussian noise
sigma = 10
observed_values = signal_values + pints.noise.independent(sigma, signal_values.shape)

# Plot
plt.plot(times, signal_values, label='signal')
plt.plot(times, observed_values, label='observed')
plt.xlabel('Time')
plt.ylabel('Values')
plt.legend()
plt.show()
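
For reference, the toy logistic model solves the ODE dp/dt = r p (1 - p/k), which has a closed-form solution. A hand-rolled sketch of that solution is below; the initial population p0 = 2 is an assumption for illustration and may not match the toy model's default:

```python
import numpy as np

def logistic(t, r, k, p0=2.0):
    # Closed-form solution of the logistic ODE dp/dt = r * p * (1 - p / k)
    return k / (1.0 + (k / p0 - 1.0) * np.exp(-r * t))

t = np.linspace(0, 1000, 100)
y = logistic(t, 0.015, 500)  # same parameters as the toy data above
```

With r = 0.015 and k = 500, the trajectory rises monotonically from p0 towards the carrying capacity k, matching the 'signal' curve plotted above.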


Create the nested sampler that will be used to sample from the posterior.


In [2]:
# Create an object with links to the model and time series
problem = pints.SingleOutputProblem(model, times, observed_values)

# Create a log-likelihood function (adds an extra parameter!)
log_likelihood = pints.GaussianLogLikelihood(problem)

# Create a uniform prior over both the parameters and the new noise variable
log_prior = pints.UniformLogPrior(
    [0.01, 400, sigma * 0.5],
    [0.02, 600, sigma * 1.5])

# Create a nested ellipsoidal rejection sampler
sampler = pints.NestedController(log_likelihood, log_prior, method=pints.NestedEllipsoidSampler)

# Set number of iterations
sampler.set_iterations(8000)

# Set the number of posterior samples to generate
sampler.set_n_posterior_samples(1600)

# Do proposals in parallel
sampler.set_parallel(True)

# Use a dynamic enlargement factor (set on the underlying sampler object)
sampler._sampler.set_dynamic_enlargement_factor(1)
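
The "ellipsoidal" part of the method refers to how replacement points are proposed: instead of rejection sampling from the whole prior, the live points are enclosed in an enlarged bounding ellipsoid and candidates are drawn uniformly from within it. A minimal sketch of that proposal step (not PINTS's implementation) looks like this:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_in_ellipsoid(points, enlargement=1.1, n=1):
    # Fit a bounding ellipsoid to the live points via their sample mean
    # and covariance, inflate it, and draw uniform samples inside it.
    mean = points.mean(axis=0)
    cov = np.cov(points.T)
    inv = np.linalg.inv(cov)
    # Scale so the ellipsoid contains the furthest point (max Mahalanobis^2)
    d = points - mean
    r2 = np.einsum('ij,jk,ik->i', d, inv, d).max()
    scale = np.linalg.cholesky(cov) * np.sqrt(r2 * enlargement)
    dim = points.shape[1]
    # Uniform draws in the unit ball, mapped through the ellipsoid's scaling
    u = rng.normal(size=(n, dim))
    u /= np.linalg.norm(u, axis=1, keepdims=True)
    radii = rng.uniform(size=(n, 1)) ** (1.0 / dim)
    return mean + (u * radii) @ scale.T

# Example: propose 5 candidates from 400 stand-in live points in 3-d
live = rng.normal(size=(400, 3))
candidates = sample_in_ellipsoid(live, n=5)
```

Candidates drawn this way are then accepted or rejected against the current likelihood contour, which is far cheaper than drawing from the full prior once the contour has shrunk.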

Run the sampler!


In [3]:
samples = sampler.run()
print('Done!')


Running Nested ellipsoidal sampler
Number of active points: 400
Total number of iterations: 8000
Total number of posterior samples: 1600
Iter. Eval. Time m:s Delta_log(z) Acceptance rate
0     1       0:05.6 -inf          1             
0     2       0:05.6 -inf          1             
0     21      0:05.6 -inf          1             
...
1000  2360    0:18.0 -259.9878235  0.510204082   
...
3000  6692    0:31.3 -22.39463485  0.476795931   
...
5000  10044   0:40.7 -0.642811     0.518457072   
5020  10080   0:40.8 -0.610705     0.518595041   
5040  10128   0:40.8 -0.580168     0.518092105   
5060  10176   0:40.9 -0.551077     0.517594108   
5080  10212   0:40.9 -0.524139     0.517733388   
Convergence obtained with Delta_z = -0.4989821806669852
Done!

Plot posterior samples versus true parameter values (dashed lines)


In [4]:
# Plot output
import pints.plot

pints.plot.histogram([samples], ref_parameters=[r, k, sigma])

plt.show()


Plot posterior predictive simulations versus the observed data


In [5]:
pints.plot.series(samples[:100], problem)
plt.show()


Marginal likelihood estimate


In [6]:
print('marginal log-likelihood = ' + str(sampler.marginal_log_likelihood())
      + ' ± ' + str(sampler.marginal_log_likelihood_standard_deviation()))


marginal log-likelihood = -375.348895962891 ± 0.07908966826933428
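The marginal likelihood (evidence) is mainly useful for model comparison: the difference in log-evidence between two models is the log Bayes factor. A small illustration, with made-up numbers for the second model:

```python
import numpy as np

# Hypothetical log-evidence values for two competing models
log_z_m1 = -375.35   # e.g. the logistic model fitted above
log_z_m2 = -380.10   # a hypothetical alternative model (made-up number)

# Log Bayes factor in favour of model 1
log_bf = log_z_m1 - log_z_m2
bayes_factor = np.exp(log_bf)  # roughly 116: strong evidence for model 1
```

A Bayes factor this large would count as strong evidence on conventional scales (e.g. Jeffreys'), assuming equal prior model probabilities.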

Effective sample size


In [7]:
print('effective sample size = ' + str(sampler.effective_sample_size()))


effective sample size = 1597.4248390822784
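
For weighted posterior samples, the effective sample size can be estimated from the importance weights; a common choice is Kish's formula, sketched below (the exact definition PINTS uses internally may differ):

```python
import numpy as np

def effective_sample_size(weights):
    # Kish's effective sample size: (sum w)^2 / sum(w^2).
    # Equal weights give n; weight concentrated on one sample gives 1.
    w = np.asarray(weights, dtype=float)
    return w.sum() ** 2 / (w ** 2).sum()

# Example: mildly uneven weights over four samples
w = np.array([0.4, 0.3, 0.2, 0.1])
print(effective_sample_size(w))  # 3.333...
```

Here the ESS of roughly 1597 out of 1600 posterior samples indicates the weights are close to uniform.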