We want to know more about optimal step sizes.
In [1]:
import h5py
f = h5py.File('../data/mc.hdf5', mode='r')
list(f.keys())
In [7]:
d = f['samples']
In [10]:
d.shape
Out[10]:
(50000, 6)
It's a $50000 \times 6$ dataset.
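For quick sanity checks it can be handy to pull the samples into a plain NumPy array and look at per-parameter summaries. A minimal sketch (reusing the `d` dataset handle from above; nothing here is specific to this model):

samples = d[:]                 # read the full dataset into memory as a NumPy array
print(samples.shape)           # (50000, 6)
print(samples.mean(axis=0))    # per-parameter posterior means
print(samples.std(axis=0))     # per-parameter posterior spreads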
In [11]:
import matplotlib.pyplot as plt
%matplotlib inline
%config InlineBackend.figure_format = 'retina'
import seaborn as sns
In [33]:
from IPython.display import Image
In [41]:
!chain.py --files ../data/mc.hdf5 --chain --burn=10000
In [42]:
Image(filename='walkers.png')
Out[42]:
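chain.py writes the walker plot to disk, but equivalent trace plots can be drawn directly from the samples. A rough sketch, discarding the same 10000 burn-in iterations (the parameter labels are placeholders, not the model's actual names):

import matplotlib.pyplot as plt

burn = 10000
chain = d[burn:]                               # drop the first 10000 samples as burn-in

fig, axes = plt.subplots(chain.shape[1], 1, figsize=(8, 10), sharex=True)
for i, ax in enumerate(axes):
    ax.plot(chain[:, i], lw=0.5)
    ax.set_ylabel('param {}'.format(i))        # placeholder labels
axes[-1].set_xlabel('iteration')
plt.show()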
In [30]:
!chain.py --files ../data/mc.hdf5 --triangle --burn=10000
In [43]:
Image(filename='triangle.png')
Out[43]:
Not bad! There is little constraint on $\log g$ or $Z$, but we expected that.
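A similar figure can also be made directly with the corner package (the successor to triangle.py); this is just a sketch, and the labels below are placeholders to be replaced with the real parameter names (e.g. $T_\mathrm{eff}$, $\log g$, $Z$, ...):

import corner

burn = 10000
chain = d[burn:]
# placeholder parameter names; swap in the actual labels for this model
labels = ['p{}'.format(i) for i in range(chain.shape[1])]
fig = corner.corner(chain, labels=labels, show_titles=True)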
In [49]:
!chain.py --files ../data/mc.hdf5 --cov
In [51]:
Image(filename='cor_coefficient.png')
Out[51]:
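The correlation structure plotted above can be cross-checked directly from the post-burn-in samples with NumPy (rowvar=False because the parameters are in columns):

import numpy as np

burn = 10000
chain = d[burn:]

cov = np.cov(chain, rowvar=False)         # (6, 6) sample covariance of the parameters
cor = np.corrcoef(chain, rowvar=False)    # correlation coefficients, as in the figure above
print(np.round(cor, 2))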
In [52]:
import numpy as np
In [54]:
ojs = np.load('opt_jump.npy')
In [55]:
ojs.shape
Out[55]:
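Presumably opt_jump.npy encodes proposal step sizes derived from these samples. Whether chain.py uses this exact recipe is an assumption, but a standard choice for a random-walk Metropolis proposal is to scale the sample covariance by $2.38^2/d$ (Roberts, Gelman & Gilks); a sketch of that calculation:

import numpy as np

burn = 10000
chain = d[burn:]
ndim = chain.shape[1]                                    # 6 parameters

# Roberts, Gelman & Gilks scaling for a random-walk Metropolis proposal
opt_cov = (2.38 ** 2 / ndim) * np.cov(chain, rowvar=False)
opt_jump = np.sqrt(np.diag(opt_cov))                     # per-parameter jump standard deviations
print(opt_jump)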