Simple Example

Approximating a simple intractable distribution: a Gaussian with a Gaussian prior on its log variance. The approximation is itself a Gaussian.

I'll be putting explanations of what's going on in each cell after the cell:


In [1]:
# %load ../examples/black_box_svi.py
from __future__ import absolute_import
from __future__ import print_function
import matplotlib.pyplot as plt
import holoviews as hv
hv.notebook_extension('bokeh')

import autograd.numpy as np
import autograd.numpy.random as npr
import autograd.scipy.stats.multivariate_normal as mvn
import autograd.scipy.stats.norm as norm

from autograd import grad


HoloViewsJS successfully loaded in this cell.

Just imports. We have to import numpy through autograd (hence autograd.numpy as np) so that we get automatic differentiation for all of our numpy code.
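
As a quick illustration (my own, not from the original notebook) of what that buys us, grad turns ordinary-looking numpy code into a function computing its gradient:

f = lambda x: np.sum(np.sin(x) ** 2)   # any scalar-valued numpy function of an array
df = grad(f)                           # autograd builds the gradient function for us
print(df(np.array([0.0, 1.0])))        # analytically sin(2x), so roughly [0.0, 0.909]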


In [3]:
# %load ../examples/optimizers.py
from __future__ import absolute_import

import autograd.numpy as np
from builtins import range

def adam(grad, x, callback=None, num_iters=100,
         step_size=0.001, b1=0.9, b2=0.999, eps=10**-8):
    """Adam as described in http://arxiv.org/pdf/1412.6980.pdf.
    It's basically RMSprop with momentum and some correction terms."""
    m = np.zeros(len(x))
    v = np.zeros(len(x))
    for i in range(num_iters):
        g = grad(x, i)
        if callback: callback(x, i, g)
        m = (1 - b1) * g      + b1 * m  # First  moment estimate.
        v = (1 - b2) * (g**2) + b2 * v  # Second moment estimate.
        mhat = m / (1 - b1**(i + 1))    # Bias correction.
        vhat = v / (1 - b2**(i + 1))
        x -= step_size*mhat/(np.sqrt(vhat) + eps)
    return x

This isn't really necessary (we could just use vanilla SGD), but Adam is probably slightly better. It defines the adaptive gradient updates of the parameters and runs the optimisation loop; optionally, we can pass in a callback function that gets called on every iteration.
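
As a tiny sanity check of the optimiser (again my own, not in the original): adam expects a gradient function of the form grad(x, i), where i is the iteration index, so minimising a simple quadratic should land us on its minimum.

toy_grad = lambda x, i: 2 * (x - 3.0)                               # gradient of ||x - 3||^2
print(adam(toy_grad, np.zeros(2), num_iters=500, step_size=0.1))    # approximately [3. 3.]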


In [4]:
def black_box_variational_inference(logprob, D, num_samples):
    """Implements http://arxiv.org/abs/1401.0118, and uses the
    local reparameterization trick from http://arxiv.org/abs/1506.02557"""

    def unpack_params(params):
        # Variational dist is a diagonal Gaussian.
        mean, log_std = params[:D], params[D:]
        return mean, log_std

    def gaussian_entropy(log_std):
        return 0.5 * D * (1.0 + np.log(2*np.pi)) + np.sum(log_std)

    rs = npr.RandomState(0)
    def variational_objective(params, t):
        """Provides a stochastic estimate of the variational lower bound."""
        mean, log_std = unpack_params(params)
        samples = rs.randn(num_samples, D) * np.exp(log_std) + mean
        lower_bound = gaussian_entropy(log_std) + np.mean(logprob(samples, t))
        return -lower_bound

    gradient = grad(variational_objective)

    return variational_objective, gradient, unpack_params

We sample from our variational distribution and use those samples to build a stochastic estimate of the variational lower bound (ELBO) on the marginal likelihood: the entropy of the Gaussian plus the average log probability of the samples under the target. Then we take the gradient of this stochastic estimate so we can perform gradient updates on the parameters we used to sample from the variational distribution; this is black box variational inference in 23 lines minus comments.
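
One way to convince yourself the estimator is sensible (a little check of mine, not part of the original): if the "unnormalised posterior" we hand it is actually a normalised standard Gaussian, then the ELBO of a matching q should be roughly zero, and so should its gradient, since the KL divergence vanishes and the log evidence is zero.

toy_logprob = lambda samples, t: norm.logpdf(samples[:, 0], 0, 1)
toy_obj, toy_gradient, _ = black_box_variational_inference(toy_logprob, D=1, num_samples=5000)
toy_params = np.array([0.0, 0.0])       # mean 0, log_std 0, i.e. q = N(0, 1)
print(-toy_obj(toy_params, 0))          # ELBO estimate, close to 0
print(toy_gradient(toy_params, 0))      # gradient estimate, close to [0, 0]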


In [5]:
# Specify an inference problem by its unnormalized log-posterior.
D = 2
def log_posterior(x, t):
    """An example 2D intractable distribution:
    a Gaussian evaluated at zero with a Gaussian prior on the log-variance."""
    mu, log_sigma = x[:, 0], x[:, 1]
    prior       = norm.logpdf(log_sigma, 0, 1.35)
    likelihood  = norm.logpdf(mu,        0, np.exp(log_sigma))
    return prior + likelihood

Here's our intractable distribution (it even says so in the docstring): a Gaussian prior on log_sigma, and mu Gaussian with standard deviation exp(log_sigma). We have a graph of it coming up!


In [6]:
# Build variational objective.
objective, gradient, unpack_params = \
    black_box_variational_inference(log_posterior, D, num_samples=2000)

Running the function we defined above to get the objective, its gradient, and the parameter-unpacking helper.


In [7]:
def isocontours(func, xlimits=[-2, 2], ylimits=[-4, 2], numticks=512):
    x = np.linspace(*xlimits, num=numticks)
    y = np.linspace(*ylimits, num=numticks)
    X, Y = np.meshgrid(x, y)
    zs = func(np.concatenate([np.atleast_2d(X.ravel()), np.atleast_2d(Y.ravel())]).T)
    Z = zs.reshape(X.shape)
    return Z

This evaluates a function over a grid so we can display it as an image and take contours of it. We're going to use it to graph the intractable distribution from above.


In [8]:
def callback(params, t, g):
    print("Iteration {} lower bound {}".format(t, -objective(params, t)))

    plt.cla()
    target_distribution = lambda x : np.exp(log_posterior(x, t))
    plot_isocontours(ax, target_distribution)

    mean, log_std = unpack_params(params)
    variational_contour = lambda x: mvn.pdf(x, mean, np.diag(np.exp(2*log_std)))
    plot_isocontours(ax, variational_contour)
    plt.draw()
    plt.pause(1.0/30.0)

Our callback function for monitoring the updates: it prints the lower bound at each iteration and, in the original matplotlib script, redraws the contour plots. Note that plot_isocontours and ax come from that script and aren't defined in this notebook, which is why we pass callback=None when optimising below.


In [9]:
target = lambda x: np.exp(log_posterior(x, 1))

Defining our target distribution.


In [12]:
%%output size=250
Z = isocontours(target)
target_distribution = hv.Image(Z)
hv.operation.contours(target_distribution, levels=np.linspace(0,max(Z.ravel()),10))


Out[12]:

It's that graph of the intractable distribution I told you about!


In [13]:
init_mean    = -1 * np.ones(D)
init_log_std = -5 * np.ones(D)

Initialising the parameters of the variational approximation.


In [14]:
variational_contour = lambda x: mvn.pdf(x, init_mean, np.diag(np.exp(2*init_log_std)))
Z = isocontours(variational_contour)
variational_distribution = hv.Image(Z)
hv.operation.contours(variational_distribution, levels=np.linspace(0,max(Z.ravel()),10))


Out[14]:

Graph of the initial variational approximation: a very tight Gaussian (the log standard deviations start at -5) centred at the initial mean of (-1, -1).


In [15]:
print("Optimizing variational parameters...")
init_var_params = np.concatenate([init_mean, init_log_std])
variational_params = adam(gradient, init_var_params, step_size=0.1, num_iters=2000, callback=None)


Optimizing variational parameters...

Then we optimise the parameters. With no callback we see no progress output, but it should finish fairly quickly.
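
If you do want to watch progress without the matplotlib plotting from the earlier callback, a print-only callback (a sketch of mine, not in the original) that reports every 100 iterations keeps the output manageable:

def print_callback(params, t, g):
    if t % 100 == 0:
        print("Iteration {} lower bound {}".format(t, -objective(params, t)))

# variational_params = adam(gradient, init_var_params, step_size=0.1,
#                           num_iters=2000, callback=print_callback)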


In [19]:
%%output size=200
mean, log_std = unpack_params(variational_params)
variational_contour = lambda x: mvn.pdf(x, mean, np.diag(np.exp(2*log_std)))
Z = isocontours(variational_contour)
variational_distribution = hv.Image(Z)
hv.operation.contours(variational_distribution, levels=np.linspace(0,max(Z.ravel()),10))+target_distribution


Out[19]:

Now we can graph the fitted diagonal-Gaussian approximation next to the intractable distribution and see that it's not a perfect match (a factorised Gaussian can't capture that shape), but at least it's doing its best, right?

Bayesian Neural Network

Next, fitting a Bayesian neural network in only a few more lines:


In [20]:
def make_nn_funs(layer_sizes, L2_reg, noise_variance, nonlinearity=np.tanh):
    """These functions implement a standard multi-layer perceptron,
    vectorized over both training examples and weight samples."""
    shapes = list(zip(layer_sizes[:-1], layer_sizes[1:]))
    num_weights = sum((m+1)*n for m, n in shapes)

    def unpack_layers(weights):
        num_weight_sets = len(weights)
        for m, n in shapes:
            yield weights[:, :m*n]     .reshape((num_weight_sets, m, n)),\
                  weights[:, m*n:m*n+n].reshape((num_weight_sets, 1, n))
            weights = weights[:, (m+1)*n:]

    def predictions(weights, inputs):
        """weights is shape (num_weight_samples x num_weights)
           inputs  is shape (num_datapoints x D)"""
        inputs = np.expand_dims(inputs, 0)
        for W, b in unpack_layers(weights):
            outputs = np.einsum('mnd,mdo->mno', inputs, W) + b
            inputs = nonlinearity(outputs)
        return outputs

    def logprob(weights, inputs, targets):
        log_prior = -L2_reg * np.sum(weights**2, axis=1)
        preds = predictions(weights, inputs)
        log_lik = -np.sum((preds - targets)**2, axis=1)[:, 0] / noise_variance
        return log_prior + log_lik

    return num_weights, predictions, logprob

These nested functions are kind of complicated. One of the stranger parts is the unpack_layers function. It's partly a quirk of how autograd works and partly down to the authors wanting to write this in as few lines as possible: autograd's grad differentiates a function with respect to a single numpy array argument (you state which argument, or it defaults to the first), so in order to call grad only once they pack all the weights into one large flat array and unpack it layer by layer. Alternatively, they could have kept the parameters in lists and iterated over grad calls (this is roughly what Lasagne does, keeping everything in lists which are assumed to stay in the correct order). But they already wrote the black_box_variational_inference function above, and they're not going to rewrite it.
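
To make the bookkeeping concrete, here's the same flat-vector unpacking written out on its own (a sketch of mine, since unpack_layers is hidden inside the closure and never returned). For layer_sizes = [1, 10, 10, 1] the flat vector has (1+1)*10 + (10+1)*10 + (10+1)*1 = 141 entries, and each layer peels off its own block of weights and biases:

layer_sizes = [1, 10, 10, 1]
shapes = list(zip(layer_sizes[:-1], layer_sizes[1:]))     # [(1, 10), (10, 10), (10, 1)]
flat = np.arange(141.0)[None, :]                          # one "sample" of 141 weights
for m, n in shapes:
    W = flat[:, :m * n].reshape((1, m, n))                # this layer's weight matrix
    b = flat[:, m * n:(m + 1) * n].reshape((1, 1, n))     # this layer's biases
    flat = flat[:, (m + 1) * n:]                          # move on to the next layer's block
    print(W.shape, b.shape)                               # W is (1, 1, 10), then (1, 10, 10), then (1, 10, 1)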

In predictions, we get the output of the network for a given set of inputs and weights. They do this in as few lines as possible using the somewhat opaque np.einsum, which expresses tensor contractions in Einstein summation notation. It seems like a really useful thing to understand well but, like regular expressions, not something you'll remember without regular use.
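
If the einsum string is opaque, it can help to see that 'mnd,mdo->mno' is just an ordinary matrix multiply done independently for each weight sample m (a little check of mine, not from the original):

rng = npr.RandomState(1)
a = rng.randn(3, 5, 4)                          # (weight samples, datapoints, inputs to the layer)
w = rng.randn(3, 4, 6)                          # (weight samples, inputs to the layer, outputs of the layer)
batched = np.einsum('mnd,mdo->mno', a, w)
looped = np.array([np.dot(a[m], w[m]) for m in range(3)])
print(np.allclose(batched, looped))             # True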

Finally, logprob defines the unnormalised log joint: an L2 penalty on the weights (a Gaussian prior) plus a squared-error log-likelihood scaled by the noise variance, i.e. roughly -L2_reg * ||w||^2 - ||predictions - targets||^2 / noise_variance.


In [21]:
def build_toy_dataset(n_data=40, noise_std=0.1):
    D = 1
    rs = npr.RandomState(0)
    inputs  = np.concatenate([np.linspace(0, 2, num=n_data // 2),
                              np.linspace(6, 8, num=n_data // 2)])
    targets = np.cos(inputs) + rs.randn(n_data) * noise_std
    inputs = (inputs - 4.0) / 4.0
    inputs  = inputs.reshape((len(inputs), D))
    targets = targets.reshape((len(targets), D))
    return inputs, targets

Defining a simple 1D regression problem, so we can visualise the output.


In [22]:
# Specify inference problem by its unnormalized log-posterior.
rbf = lambda x: norm.pdf(x, 0, 1)
sq = lambda x: np.sin(x)
num_weights, predictions, logprob = \
    make_nn_funs(layer_sizes=[1, 10, 10, 1], L2_reg=0.01,
                 noise_variance = 0.01, nonlinearity=rbf)

Instantiate the network, with an unconventional RBF nonlinearity.
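
It's worth a quick look at what that nonlinearity is (my addition): norm.pdf(x, 0, 1) is just a Gaussian bump, so each hidden unit responds to a localised region of its input rather than saturating like tanh.

xs = np.linspace(-4, 4, 200)
hv.Curve(zip(xs, rbf(xs)))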


In [23]:
inputs, targets = build_toy_dataset()
log_posterior = lambda weights, t: logprob(weights, inputs, targets)

Define our log posterior, and build the toy dataset.


In [24]:
hv.Curve(zip(inputs,targets))


Out[24]:

Looking at our toy dataset.


In [25]:
# Build variational objective.
objective, gradient, unpack_params = \
    black_box_variational_inference(log_posterior, num_weights,
                                    num_samples=20)

def callback(params, t, g):
    print("Iteration {} lower bound {}".format(t, -objective(params, t)))
    
# Initialize variational parameters
rs = npr.RandomState(0)
init_mean    = rs.randn(num_weights)
init_log_std = -1 * np.ones(num_weights)
init_var_params = np.concatenate([init_mean, init_log_std])

Now we call the same black_box_variational_inference function from above, so we're using a fully factorised (diagonal) Gaussian variational distribution over all the weights in our network. Then, we initialise all the variational parameters we're going to need.
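
Just to spell out the sizes involved (my own aside): with the [1, 10, 10, 1] architecture there are 141 network weights, and we keep a mean plus a log standard deviation for each, so the optimiser works in a 282-dimensional parameter space.

print(num_weights)               # 141 weights in the network
print(init_var_params.shape)     # (282,): a mean and a log-std per weight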


In [26]:
def sample_predictions(params, inputs):
    # Sample functions from posterior.
    #rs = npr.RandomState(0)
    mean, log_std = unpack_params(params)
    sample_weights = rs.randn(10, num_weights) * np.exp(log_std) + mean
    return predictions(sample_weights, inputs)
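
This samples 10 sets of weights from the current variational distribution and pushes the inputs through the network with each of them, giving 10 plausible regression functions to plot.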

In [28]:
%%output size=200
preds = sample_predictions(init_var_params, inputs)
hv.Overlay([hv.Curve(zip(inputs.ravel(), p.ravel())) for p in preds])


Out[28]:

Looking at some of the outputs from the network before training. Obviously, the match to the dataset above isn't great.


In [29]:
print("Optimizing variational parameters...")
variational_params = adam(gradient, init_var_params,
                          step_size=0.1, num_iters=1000, callback=callback)


Optimizing variational parameters...
Iteration 0 lower bound -23837.8412344
Iteration 1 lower bound -12701.1404043
Iteration 2 lower bound -6494.77525443
Iteration 3 lower bound -3199.99333708
Iteration 4 lower bound -2128.84844724
Iteration 5 lower bound -1348.38349564
Iteration 6 lower bound -1597.58282418
Iteration 7 lower bound -1132.5283615
Iteration 8 lower bound -1552.14377848
Iteration 9 lower bound -1335.781635
Iteration 10 lower bound -1806.91382332
Iteration 11 lower bound -1033.30200933
Iteration 12 lower bound -1322.81101889
Iteration 13 lower bound -985.28167169
Iteration 14 lower bound -1243.54369636
Iteration 15 lower bound -1134.87404683
Iteration 16 lower bound -1007.13710981
Iteration 17 lower bound -1402.09653752
Iteration 18 lower bound -1203.00791227
Iteration 19 lower bound -1015.57169272
Iteration 20 lower bound -903.451640578
[... iterations 21 through 998 elided; the (noisy) lower bound estimate keeps climbing, reaching roughly +190 by the end of the run ...]
Iteration 999 lower bound 183.432407797

That's 1000 iterations of training output (most of it elided above). You should be able to see the lower bound increasing fairly consistently, with plenty of noise from the stochastic estimates.


In [32]:
%output size=350
plot_inputs = np.linspace(-4,4,200).reshape(-1,1)
preds = sample_predictions(variational_params, plot_inputs)
hv.Overlay([hv.Curve(zip(plot_inputs.ravel(), p.ravel())) for p in preds])*hv.Points(zip(inputs,targets))


Out[32]:

Now that it's trained, we can sample weights from the variational distribution as if it were the posterior over weights and look at the resulting predictions of our model. We can see that we've captured uncertainty where the model has no data, and that the sampled functions fit the data where it is present.
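
Another way to summarise the same thing (a sketch of my own using the pieces defined above, not from the original notebook) is to draw a larger number of weight samples and plot the pointwise predictive mean together with a rough two-standard-deviation band:

mean, log_std = unpack_params(variational_params)
w_samples = rs.randn(200, num_weights) * np.exp(log_std) + mean
p = predictions(w_samples, plot_inputs)[:, :, 0]           # (200 weight samples, 200 input locations)
mu, sd = p.mean(axis=0), p.std(axis=0)
band = [hv.Curve(zip(plot_inputs.ravel(), c)) for c in (mu - 2 * sd, mu, mu + 2 * sd)]
hv.Overlay(band) * hv.Points(zip(inputs.ravel(), targets.ravel()))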