In [37]:
# configure plotting
%matplotlib inline
%config InlineBackend.figure_format = 'svg'
import matplotlib
matplotlib.rcParams['figure.figsize'] = (8, 5)

Gaussian process regression tutorial

Nicolas Durrande 2013

with edits by James Hensman

In this tutorial we will see the basics of building 1-dimensional and 2-dimensional Gaussian process regression models, also known as kriging models.

We first import the libraries we will need:


In [38]:
import numpy as np
from matplotlib import pyplot as plt
import GPy

1-dimensional model

For this toy example, we assume we have the following inputs and outputs:


In [39]:
X = np.random.uniform(-3.,3.,(20,1))
Y = np.sin(X) + np.random.randn(20,1)*0.05

Note that the observations Y include some noise.

The first step is to define the covariance kernel we want to use for the model. Here we choose a Gaussian kernel (also known as the RBF or squared exponential kernel):


In [40]:
kernel = GPy.kern.RBF(input_dim=1, variance=1., lengthscale=1.)
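
In GPy's parameterisation this kernel is

$$k(x, x') = \sigma^2 \exp\left(-\frac{(x - x')^2}{2\ell^2}\right),$$

where $\sigma^2$ is the variance parameter and $\ell$ the lengthscale.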

The parameter input_dim stands for the dimension of the input space. The parameters variance and lengthscale are optional and default to 1. Many other kernels are implemented; type GPy.kern.<tab> to see a list:


In [41]:
# GPy.kern.<tab>   # pressing <tab> after the dot lists the available kernels
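
Kernels can also be combined: sums and products of kernels are themselves valid kernels. A quick sketch (Matern32 and Bias are among the kernels shipped with GPy):

k1 = GPy.kern.Matern32(input_dim=1, variance=1., lengthscale=1.)
k2 = GPy.kern.Bias(input_dim=1)
k_sum = k1 + k2    # the sum of two kernels is a kernel
k_prod = k1 * k2   # so is their product
print(k_sum)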

The inputs required for building the model are the observations and the kernel:


In [42]:
m = GPy.models.GPRegression(X,Y,kernel)

By default, some observation noise is added to the model. The functions print and plot give an insight into the model we have just built:


In [43]:
print(m)


Name                 : GP regression
Log-likelihood       : -22.9100898977
Number of Parameters : 3
Parameters:
  GP_regression.           |  Value  |  Constraint  |  Prior  |  Tied to
  rbf.variance             |    1.0  |     +ve      |         |         
  rbf.lengthscale          |    1.0  |     +ve      |         |         
  Gaussian_noise.variance  |    1.0  |     +ve      |         |         
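
The Gaussian_noise.variance parameter above is the observation noise variance. If the noise level is known, it can be set by hand and, optionally, fixed so the optimizer leaves it alone. A minimal sketch of GPy's parameter interface:

m.Gaussian_noise.variance = 0.05   # set the noise variance directly
m.Gaussian_noise.variance.fix()    # hold it fixed during optimization
m.Gaussian_noise.variance.unfix()  # free it again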

In [44]:
m.plot()


Out[44]:
{'dataplot': [<matplotlib.lines.Line2D at 0x7911bd0>],
 'gpplot': [[<matplotlib.lines.Line2D at 0x790d790>],
  [<matplotlib.patches.Polygon at 0x790d9d0>],
  [<matplotlib.lines.Line2D at 0x7911050>],
  [<matplotlib.lines.Line2D at 0x7911590>]]}

The above cell shows our GP regression model before optimization of the parameters. The shaded region corresponds to ~95% confidence intervals (i.e. ±2 standard deviations).
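
The same intervals can be computed numerically: m.predict returns the posterior mean and variance at new inputs (a short sketch; by default the returned variance includes the observation noise):

Xnew = np.linspace(-3., 3., 100)[:, None]
mean, var = m.predict(Xnew)
lower = mean - 2*np.sqrt(var)   # ~95% interval
upper = mean + 2*np.sqrt(var)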

The default values of the kernel parameters may not be optimal for the current data (for example, the confidence intervals seem too wide in the previous figure). A common approach is to find the values of the parameters that maximize the likelihood of the data. It is as easy as calling m.optimize() in GPy:


In [45]:
m.optimize()
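
optimize also accepts some useful keyword arguments; for instance (a sketch, assuming a reasonably recent GPy version):

m.optimize(messages=True)      # print the optimizer's progress
m.optimize(optimizer='lbfgs')  # choose the optimization routine explicitly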

If we want to perform some restarts to try to improve the result of the optimization, we can use the optimize_restarts function. This selects random (drawn from $N(0,1)$) initializations for the parameter values, optimizes each, and sets the model to the best solution found.


In [46]:
m.optimize_restarts(num_restarts = 10)


Optimization restart 1/10, f = -20.0429453747
Optimization restart 2/10, f = -20.0429453747
Optimization restart 3/10, f = -20.0429453747
Optimization restart 4/10, f = -20.0429453746
Optimization restart 5/10, f = -20.0429453747
Optimization restart 6/10, f = -20.0429453742
Optimization restart 7/10, f = -20.0429453747
Optimization restart 8/10, f = -20.0429453747
Optimization restart 9/10, f = -20.0429453747
Optimization restart 10/10, f = -20.0429453747

In this simple example, the objective function (usually!) has only one local minimum, and all of the solutions found are the same.

Once again, we can use print(m) and m.plot() to look at the resulting model. This time, the parameter values have been optimized against the log likelihood (aka the log marginal likelihood): the fit should be much better.


In [47]:
print(m)
_ = m.plot()


Name                 : GP regression
Log-likelihood       : 20.0429453747
Number of Parameters : 3
Parameters:
  GP_regression.           |       Value        |  Constraint  |  Prior  |  Tied to
  rbf.variance             |    0.848836422258  |     +ve      |         |         
  rbf.lengthscale          |     1.83251490926  |     +ve      |         |         
  Gaussian_noise.variance  |  0.00140539899957  |     +ve      |         |         

2-dimensional example

Here is a 2-dimensional example:


In [30]:
# sample inputs and outputs
X = np.random.uniform(-3.,3.,(50,2))
Y = np.sin(X[:,0:1]) * np.sin(X[:,1:2])+np.random.randn(50,1)*0.05

# define kernel
ker = GPy.kern.Matern52(2,ARD=True) + GPy.kern.White(2)

# create simple GP model
m = GPy.models.GPRegression(X,Y,ker)

# optimize and plot
m.optimize(max_f_eval = 1000)
m.plot()
print(m)


Name                 : GP regression
Log-likelihood       : 18.9327625016
Number of Parameters : 5
Parameters:
  GP_regression.           |       Value        |  Constraint  |  Prior  |  Tied to
  add.Mat52.variance       |    0.283004003921  |     +ve      |         |         
  add.Mat52.lengthscale    |              (2,)  |     +ve      |         |         
  add.white.variance       |  0.00114317815501  |     +ve      |         |         
  Gaussian_noise.variance  |  0.00114317815501  |     +ve      |         |         

The flag ARD=True in the definition of the Matern kernel specifies that we want one lengthscale parameter per dimension (i.e. the GP is not isotropic). Note that for 2-d plotting, only the mean is shown.
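
To make this concrete, here is a small sketch comparing the isotropic and ARD versions of the kernel:

k_iso = GPy.kern.Matern52(2, ARD=False)  # a single lengthscale shared by both inputs
k_ard = GPy.kern.Matern52(2, ARD=True)   # one lengthscale per input dimension
print(k_iso.lengthscale)  # one value
print(k_ard.lengthscale)  # two values, matching the (2,) entry in the table above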

Plotting slices

To see the uncertainty associated with the above predictions, we can plot slices through the surface. This is done by passing the optional fixed_inputs argument to the plot function. fixed_inputs is a list of tuples containing which of the inputs to fix, and to which value.

To get horizontal slices of the above GP, we'll fix the second (index 1) input to -1, 0, and 1.5:


In [34]:
figure, axes = plt.subplots(3,1)
for ax, y in zip(axes, [-1, 0, 1.5]):
    m.plot(fixed_inputs=[(1,y)], ax=ax)
    ax.set_ylabel('y=%g' % y)
plt.suptitle('horizontal slices through the function')


Out[34]:
<matplotlib.text.Text at 0x6386350>

A few things to note:

  • we've also passed the optional ax argument, to make the GP plot on a particular subplot
  • the data look strange here: we're seeing slices of the GP, but all the data are displayed, even though they might not be close to the current slice.

To get vertical slices, we simply fix the other (index 0) input. We'll also turn off the display of data:


In [35]:
figure, axes = plt.subplots(3,1)
for ax, x in zip(axes, [-1, 0, 1.5]):
    m.plot(fixed_inputs=[(0,x)], ax=ax, which_data_rows=[])
    ax.set_ylabel('x=%g' % x)
plt.suptitle('vertical slices through the function')


Out[35]:
<matplotlib.text.Text at 0x73f7910>

You can find a host of other plotting options in the m.plot docstring. Type m.plot?<enter> to see.