Jupyter notebooks

This is a Jupyter notebook using Python. You can install Jupyter locally to edit and interact with this notebook.

Interpolation and Regression

Interpolation and regression address the problem of approximating functions using their (possibly noisy) values at a finite set of points. There is usually an underlying process from which the observed data are obtained, but this process is impractical to evaluate every time a function value is needed. Examples of underlying processes include:

  • direct field observations/measurement of a physical or social system
  • numerically processed observations, perhaps by applying physical principles
  • output from an expensive "exact" numerical computation
  • output from an approximate numerical computation

We would like an inexpensive deterministic surrogate that we can use instead. The most common surrogate functions are polynomials and rational functions (ratios of polynomials) because they are convenient to compute with. Other choices are often made when there is prior knowledge about the behavior of the system, such as using

  • $\sin kx$ and $\cos kx$ to represent periodic functions
  • powers/exponentials ($a^x$ or $a^{1/x}$) for material properties or reaction rates (e.g., Arrhenius relations).

We start our discussion by building surrogate functions that exactly match the observations at a number of points, either given or specially chosen, using polynomials.

Polynomial Interpolation

In the Linear Algebra notebook, we discussed Vandermonde matrices, which we could use to solve for polynomial coefficients. It is also possible to write down the interpolating polynomial explicitly (rather than by solving a linear system).

Lagrange Interpolating Polynomials

Suppose we are given function values $y_0, \dotsc, y_m$ at the distinct points $x_0, \dotsc, x_m$ and we would like to build a polynomial of degree $m$ that goes through all these points. This explicit construction is attributed to Lagrange (though he was not first):

$$ p(x) = \sum_{i=0}^m y_i \prod_{j \ne i} \frac{x - x_j}{x_i - x_j} $$
  • What is the degree of this polynomial?
  • Why is $p(x_i) = y_i$?
  • How expensive (in terms of $m$) is it to evaluate $p(x)$?
  • How expensive (in terms of $m$) is it to convert to standard form $p(x) = \sum_{i=0}^m a_i x^i$?
  • Can we easily evaluate the derivative $p'(x)$?
  • What can go wrong? Is this formulation numerically stable?

In [63]:
%matplotlib notebook
import numpy
from matplotlib import pyplot

def lagrange(x, y):
    # Return a (vectorized) function evaluating the Lagrange interpolating
    # polynomial through the points (x[i], y[i]).
    def p(t):
        from numpy import prod
        m = len(x) - 1
        w = 0
        for i in range(m):
            # i-th Lagrange basis function at t, scaled by y[i]
            w += y[i] * (prod(t - x[:i]) * prod(t - x[i+1:])
                / (prod(x[i] - x[:i]) * prod(x[i] - x[i+1:])))
        # Last term: the products over the empty range x[m+1:] are 1
        w += y[m] * prod(t - x[:m]) / prod(x[m] - x[:m])
        return w
    return numpy.vectorize(p)

x = numpy.linspace(-2,2,4)
y = numpy.sin(x)
p = lagrange(x, y)
xx = numpy.linspace(-3,3)
pyplot.style.use('ggplot')
pyplot.rcParams['figure.max_open_warning'] = False
pyplot.figure()
pyplot.plot(x, y, '*')
pyplot.plot(xx, p(xx), label='p(x)')
pyplot.plot(xx, numpy.sin(xx), label='sin(x)')
pyplot.legend(loc='upper left')
pyplot.show()


Uniqueness

Is the polynomial $p(x)$ of degree $m$ that interpolates $m+1$ points unique? Why?
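
One quick numerical check (an added sketch, not a proof): the Lagrange form above and the monomial coefficients obtained from the Vandermonde system interpolate the same data, so by uniqueness they should agree everywhere we evaluate them.

In [ ]:
# Compare the Lagrange interpolant p from above with the monomial-form interpolant
coef = numpy.linalg.solve(numpy.vander(x), y)   # monomial coefficients (decreasing powers)
print(numpy.allclose(p(xx), numpy.vander(xx, len(x)).dot(coef)))  # expect True (the interpolant is unique)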

Vandermonde matrices

We have used the Vandermonde matrix with a monomial basis for polynomial interpolation.


In [2]:
p = numpy.linalg.solve(numpy.vander(x), y)
pyplot.figure()
pyplot.plot(x, y, '*')
pyplot.plot(xx, numpy.vander(xx, 4).dot(p), label='p(x)')
pyplot.plot(xx, numpy.sin(xx), label='sin(x)')
pyplot.legend(loc='upper left')


Out[2]:
<matplotlib.legend.Legend at 0x7f5f778f2828>

Vandermonde matrices are often ill-conditioned, and this approach requires solving an $(m+1)\times(m+1)$ linear system, at a cost of $O(m^3)$.

Newton polynomials

Newton polynomials are polynomials

$$ n_k(x) = \prod_{i=0}^{k-1} (x - x_i) $$

How does the Vandermonde procedure change if we replace $x^k$ with $n_k(x)$?


In [3]:
def vander_newton(x, abscissa=None):
    """Newton 'Vandermonde' matrix: column k is n_k(x) = prod_{i<k} (x - abscissa[i])."""
    if abscissa is None:
        abscissa = x
    n = len(abscissa)
    A = numpy.zeros((len(x), n))
    A[:,0] = 1
    for i in range(1,n):
        A[:,i] = A[:,i-1] * (x - abscissa[i-1])
    return A

A = vander_newton(numpy.linspace(-1,1,5))
print(A)


[[ 1.    0.   -0.    0.   -0.  ]
 [ 1.    0.5   0.   -0.    0.  ]
 [ 1.    1.    0.5   0.   -0.  ]
 [ 1.    1.5   1.5   0.75  0.  ]
 [ 1.    2.    3.    3.    1.5 ]]
  • Does this affect the cost of solving for the coefficients?
  • How does the condition number depend on the number and position of the points?

In [4]:
# First, let's check that it works.
p = numpy.linalg.solve(vander_newton(x), y)
pyplot.figure()
pyplot.plot(x, y, '*')
pyplot.plot(xx, vander_newton(xx, x).dot(p), label='p(x)')
pyplot.plot(xx, numpy.sin(xx), label='sin(x)')
pyplot.legend(loc='upper left')


Out[4]:
<matplotlib.legend.Legend at 0x7f5f77794588>

In [5]:
def cond(mat, points, interval=(-1,1), nmax=20):
    degree = numpy.arange(2, nmax)
    return degree, numpy.array([numpy.linalg.cond(mat(points(*interval,n))) for n in degree])

pyplot.figure()
pyplot.semilogy(*cond(numpy.vander, numpy.linspace), label='monomial')
pyplot.semilogy(*cond(vander_newton, numpy.linspace), label='newton')
pyplot.legend(loc='upper left')


Out[5]:
<matplotlib.legend.Legend at 0x7f5f77769c88>

In [6]:
pyplot.figure()
pyplot.semilogy(*cond(numpy.vander, numpy.linspace, (10,12)), label='monomial')
pyplot.semilogy(*cond(vander_newton, numpy.linspace, (10,12)), label='newton')
pyplot.legend(loc='upper left')


Out[6]:
<matplotlib.legend.Legend at 0x7f5f776cbc18>

Conclusions

  • Vandermonde matrices are typically ill-conditioned: as the number of points (and thus the polynomial degree) grows, the columns become nearly linearly dependent.
  • Interpolation using an arbitrary basis requires $O(n^3)$ operations for $n$ data points because we must solve with a full Vandermonde matrix.
  • Newton polynomials cause the Vandermonde matrix to be triangular, thus $O(n^2)$ for interpolation.
  • Newton polynomials can incrementally assimilate new observations: adding a point appends one row and column to the triangular system, so only one new coefficient must be computed (a sketch of this update follows below).
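
A minimal sketch of that incremental update (the helpers newton_eval and newton_add_point are illustrative additions, not part of the original notebook): since the new basis function $n_{m+1}(x)$ vanishes at all existing points, the previously computed coefficients are untouched and the new coefficient follows from one forward-substitution step.

In [ ]:
def newton_eval(coef, abscissa, t):
    # Evaluate sum_k coef[k] * n_k(t), where n_k(t) = prod_{i<k} (t - abscissa[i])
    w, basis = 0.0, 1.0
    for k, c in enumerate(coef):
        w += c * basis
        basis *= t - abscissa[k]
    return w

def newton_add_point(coef, abscissa, xnew, ynew):
    # One forward-substitution step: existing coefficients are unchanged
    nk = numpy.prod(xnew - numpy.asarray(abscissa))   # new basis function at xnew
    cnew = (ynew - newton_eval(coef, abscissa, xnew)) / nk
    return list(coef) + [cnew], list(abscissa) + [xnew]

# Interpolate sin on 4 points, then assimilate a 5th observation
xn = list(numpy.linspace(-2, 2, 4))
cn = list(numpy.linalg.solve(vander_newton(numpy.array(xn)), numpy.sin(xn)))
cn, xn = newton_add_point(cn, xn, 3.0, numpy.sin(3.0))
print(newton_eval(cn, xn, 3.0), numpy.sin(3.0))  # the new point is interpolated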

Polynomial bases

We have seen that the monomial and Newton bases are ill-conditioned, but we have a procedure (orthogonalization via QR factorization) for constructing well-conditioned bases that span the same space.


In [7]:
def vander_q(x, n=None, interval=None, print_basis=False):
    if n is None:
        n = len(x)
    if interval is None:
        a, b = min(x), max(x)
    else:
        a, b = interval
    # Set up integration on the interval [a,b] using the midpoint rule
    w = b - a
    V = numpy.vander(numpy.linspace(a + 0.5*w/100, b - 0.5*w/100, 100), n, increasing=True)
    V *= numpy.sqrt(w/100)
    Q, R = numpy.linalg.qr(V)
    if print_basis:
        print('R', R)
    A = numpy.vander(x, n, increasing=True)
    return numpy.linalg.solve(R.T, A.T).T

p = numpy.linalg.solve(vander_q(x), y)
pyplot.figure()
pyplot.plot(x, y, '*')
pyplot.plot(xx, vander_q(xx, 4, interval=(min(x), max(x))).dot(p), label='p(x)')
pyplot.plot(xx, numpy.sin(xx), label='sin(x)')
pyplot.legend(loc='upper left')


Out[7]:
<matplotlib.legend.Legend at 0x7f5f776b1668>

In [8]:
pyplot.figure()
pyplot.semilogy(*cond(numpy.vander, numpy.linspace, (-1,1)), label='monomial')
pyplot.semilogy(*cond(vander_newton, numpy.linspace, (-1,1)), label='newton')
pyplot.semilogy(*cond(vander_q, numpy.linspace, (-1,1)), label='q')
pyplot.legend(loc='upper left')


Out[8]:
<matplotlib.legend.Legend at 0x7f5f77661a90>

In [9]:
def cosspace(a, b, n=50):
    """Cosine-spaced (Chebyshev-type) points on [a, b], clustered near the endpoints."""
    return (a + b)/2 + (b - a)/2 * (numpy.cos(numpy.linspace(0, numpy.pi, n)))

pyplot.figure()
pyplot.semilogy(*cond(numpy.vander, cosspace, (-1,1)), label='monomial')
pyplot.semilogy(*cond(vander_newton, cosspace, (-1,1)), label='newton')
pyplot.semilogy(*cond(vander_q, cosspace, (-1,1)), label='q')
pyplot.legend(loc='upper left')


Out[9]:
<matplotlib.legend.Legend at 0x7f5f775eca58>

In [10]:
for n in range(2,5):
    vander_q(numpy.linspace(-1,1,n), print_basis=True)


R [[ -1.41421356e+00  -1.11022302e-16]
 [  0.00000000e+00   8.16455755e-01]]
R [[ -1.41421356e+00  -1.11022302e-16  -4.71357380e-01]
 [  0.00000000e+00   8.16455755e-01   1.38777878e-17]
 [  0.00000000e+00   0.00000000e+00  -4.21531607e-01]]
R [[ -1.41421356e+00  -1.11022302e-16  -4.71357380e-01  -5.55111512e-17]
 [  0.00000000e+00   8.16455755e-01   1.38777878e-17   4.89759149e-01]
 [  0.00000000e+00   0.00000000e+00  -4.21531607e-01  -6.59194921e-17]
 [  0.00000000e+00   0.00000000e+00   0.00000000e+00   2.13659327e-01]]

Observations

  • Orthogonalizing the monomials makes for a much better conditioned basis.
  • That basis has much smaller condition number for interpolation on equally spaced points.
  • The condition number still grows exponentially.
  • Using cosspace for interpolation with monomials or Newton basis does not qualitatively change their ill-conditioning.
  • Using cosspace with orthogonal polynomials gives a small condition number.
  • The orthogonal polynomials can be written as linear combinations of monomials; that is, they are a different sequence of constant, linear, quadratic, etc., polynomials spanning the same space (checked numerically below).
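
A quick numerical check of the last point (a sketch; it re-derives the same R factor that vander_q computes internally, so the comparison is against our own construction rather than an external reference):

In [ ]:
# The orthogonalized basis equals the monomial basis times R^{-1}
n = 4
a, b = -1, 1
w = b - a
V = numpy.vander(numpy.linspace(a + 0.5*w/100, b - 0.5*w/100, 100), n, increasing=True)
V *= numpy.sqrt(w/100)
_, R = numpy.linalg.qr(V)
t = numpy.linspace(-1, 1, 7)
print(numpy.allclose(vander_q(t, n, interval=(a, b)),
                     numpy.vander(t, n, increasing=True).dot(numpy.linalg.inv(R))))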

Another look at these polynomials


In [11]:
x = numpy.linspace(-1, 1, 100)
M = numpy.vander(x, 8, increasing=True)
N = vander_newton(x, abscissa=numpy.linspace(-1,1,8))
Q = vander_q(x, 8)

pyplot.figure()
pyplot.plot(x, M)
pyplot.title('Monomials')

pyplot.figure()
pyplot.plot(x, N)
pyplot.title('Newton Polynomials')

pyplot.figure()
pyplot.plot(x, Q)
pyplot.title('Orthogonal Polynomials')


Out[11]:
<matplotlib.text.Text at 0x7f5f774ed438>

Constructing these "good" (orthogonal) polynomials using QR factorization is a bit cumbersome and depends on a finite-accuracy parameter (the number of quadrature points used to approximate the inner product).

Legendre Polynomials

Classical (19th century) mathematics identified the polynomials we have been constructing: they are eigenfunctions (resonant modes) of a differential operator, $$ \frac{d}{d x} (1 - x^2) \frac{d P_n(x)}{dx} = -n(n+1) P_n(x) . $$ This theory led to a recursive definition $$\begin{split} P_0(x) &= 1 \\ P_1(x) &= x \\ (n+1) P_{n+1}(x) &= (2n+1) x P_n(x) - n P_{n-1}(x) \end{split}$$

We can implement this recurrence in code to efficiently evaluate Legendre polynomials.


In [12]:
def vander_legendre(x, n=None):
    if n is None:
        n = len(x)
    P = numpy.ones((len(x), n))
    if n > 1:
        P[:,1] = x
    for k in range(1,n-1):
        P[:,k+1] = ((2*k+1) * x * P[:,k] - k * P[:,k-1]) / (k + 1)
    return P

P = vander_legendre(x, 8)
pyplot.figure()
pyplot.plot(x, P)
pyplot.title('Legendre Polynomials')


Out[12]:
<matplotlib.text.Text at 0x7f5f77449c88>

In [13]:
# Legendre polynomials are well-conditioned

pyplot.figure()
pyplot.semilogy(*cond(vander_q, cosspace, (-1,1)), label='q')
pyplot.semilogy(*cond(vander_legendre, cosspace, (-1,1)), label='legendre')
pyplot.legend(loc='upper left')


Out[13]:
<matplotlib.legend.Legend at 0x7f5f773cfcf8>

Chebyshev polynomials

Define $$ T_n(x) = \cos (n \arccos(x)) .$$ This turns out to be a polynomial, but it's not obvious why. Recall $$ \cos(a + b) = \cos a \cos b - \sin a \sin b .$$ Let $y = \arccos x$ and check $$ \begin{split} T_{n+1}(x) &= \cos (n+1) y = \cos ny \cos y - \sin ny \sin y \\ T_{n-1}(x) &= \cos (n-1) y = \cos ny \cos y + \sin ny \sin y \end{split}$$ Adding these together produces a similar recurrence: $$\begin{split} T_0(x) &= 1 \\ T_1(x) &= x \\ T_{n+1}(x) &= 2 x T_n(x) - T_{n-1}(x) \end{split}$$ which we can also implement in code


In [14]:
def vander_chebyshev(x, n=None):
    if n is None:
        n = len(x)
    T = numpy.ones((len(x), n))
    if n > 1:
        T[:,1] = x
    for k in range(1,n-1):
        T[:,k+1] = 2* x * T[:,k] - T[:,k-1]
    return T

T = vander_chebyshev(x, 8)
pyplot.figure()
pyplot.plot(x, T)
pyplot.title('Chebyshev Polynomials')


Out[14]:
<matplotlib.text.Text at 0x7f5f7738bdd8>

In [15]:
# Chebyshev polynomials are also well-conditioned

pyplot.figure()
pyplot.semilogy(*cond(vander_q, cosspace, (-1,1)), label='q')
pyplot.semilogy(*cond(vander_legendre, cosspace, (-1,1)), label='legendre')
pyplot.semilogy(*cond(vander_chebyshev, cosspace, (-1,1)), label='chebyshev')
pyplot.legend(loc='upper left')


Out[15]:
<matplotlib.legend.Legend at 0x7f5f7730be80>

In [16]:
print(cond(vander_chebyshev, cosspace, (-1,1))[1])


[ 1.          1.41421356  1.5         1.81129136  1.6553889   1.73205081
  1.61576683  1.68614066  1.59066729  1.6553889   1.57298184  1.63299316
  1.55967145  1.61576683  1.54919334  1.60199598  1.54067117  1.59066729]

This is actually amazing: converting from the values at some special points to the coefficients of some specially crafted polynomials has a roughly constant condition number of about 1.6, independent of the number of points. As we will find later, Chebyshev interpolation has a number of other remarkable properties.

Runge Effect

We've seen before that the accuracy of an interpolating polynomial is often poor near (and beyond) the ends of the interval. We've also found "good" bases for representing the polynomials and found that when "good" points are used, the condition number can be small independent of polynomial order/number of points. When points are poorly spaced, however, the condition number grows rapidly.


In [17]:
print(cond(vander_chebyshev, numpy.linspace, (-1,1))[1])


[  1.00000000e+00   1.41421356e+00   1.85420334e+00   2.21525044e+00
   2.93455079e+00   3.87489586e+00   5.84747351e+00   8.69278249e+00
   1.45809411e+01   2.36653261e+01   4.21307493e+01   7.12649335e+01
   1.31546312e+02   2.25731724e+02   4.26917827e+02   7.40862750e+02
   1.42216979e+03   2.50187219e+03]

In [26]:
def chebyshev_interp_and_eval(x, xx):
    """Matrix mapping from values at points x to values
    of Chebyshev interpolating polynomial at points xx"""
    A = vander_chebyshev(x)
    B = vander_chebyshev(xx, len(x))
    return B.dot(numpy.linalg.inv(A))

print(numpy.linalg.cond(chebyshev_interp_and_eval(cosspace(-1,1,20),
                                                  numpy.linspace(-1,1,1000))))


5.36030029497

In [27]:
print(numpy.linalg.cond(chebyshev_interp_and_eval(cosspace(-1,1,20),
                                                  cosspace(-1,1,100))))


1.45754700986

In [28]:
print(numpy.linalg.cond(chebyshev_interp_and_eval(numpy.linspace(-1,1,20),
                                                  numpy.linspace(-1,1,100))))


4115.2853068

In [21]:
print(numpy.linalg.cond(chebyshev_interp_and_eval(cosspace(-1,1,20),
                                                  numpy.linspace(-2,2,100))))


11716704807.7

What are we seeing?

  • Constructing the polynomials from x=cosspace points is good (well conditioned)
  • Constructing the polynomial from x=linspace points is bad (ill conditioned)
  • We can evaluate anywhere within the interval (-1,1) accurately; it doesn't matter whether we use linspace or cosspace.
  • Evaluating outside the interval ("extrapolation") is terrible

What does this ill-conditioning look like in practice?


In [22]:
def runge1(x):
    return 1 / (1 + 10*x**2)
x = numpy.linspace(-1,1,20)
xx = numpy.linspace(-1,1,100)
pyplot.figure()
pyplot.plot(x, runge1(x), '*')
pyplot.plot(xx, chebyshev_interp_and_eval(x, xx).dot(runge1(x)))


Out[22]:
[<matplotlib.lines.Line2D at 0x7f5f7771a400>]

In [23]:
x = numpy.linspace(-1,1,21)
pyplot.figure()
pyplot.plot(x, runge1(x), '*')
pyplot.plot(xx, chebyshev_interp_and_eval(x, xx).dot(runge1(x)))


Out[23]:
[<matplotlib.lines.Line2D at 0x7f5f772f6550>]

In [24]:
def runge2(x):
    return numpy.exp(-(4*x)**2)

pyplot.figure()
pyplot.plot(x, runge2(x), '*')
pyplot.plot(xx, chebyshev_interp_and_eval(x, xx).dot(runge2(x)))


Out[24]:
[<matplotlib.lines.Line2D at 0x7f5f77248e80>]

In [25]:
x = cosspace(-1,1,20)
pyplot.figure()
pyplot.plot(x, runge1(x), '*')
pyplot.plot(x, runge2(x), '^')
pyplot.plot(xx, chebyshev_interp_and_eval(x, xx).dot(runge1(x)))
pyplot.plot(xx, chebyshev_interp_and_eval(x, xx).dot(runge2(x)))


Out[25]:
[<matplotlib.lines.Line2D at 0x7f5f771e7ac8>]

In [35]:
def runge3(x):
    return 1.*(x > 0)

pyplot.figure()
pyplot.plot(x, runge3(x), '*')
pyplot.plot(xx, chebyshev_interp_and_eval(x, xx).dot(runge3(x)))


Out[35]:
[<matplotlib.lines.Line2D at 0x7f5f7510df98>]

Questions

  • The condition number is the ratio of largest singular value ($\sigma_{\max} = \lVert A \rVert$) to the smallest $\sigma_{\min}$. Is the condition number large because the norm is large or because the smallest is tiny?
  • Does the condition number of the interp_and_eval procedure above depend on the basis used to represent the polynomials? (A quick check follows below.)
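
To probe the second question, here is a sketch (the helper monomial_interp_and_eval is an illustrative addition): the interpolate-then-evaluate map is the same linear operator no matter which polynomial basis is used to construct it, so its condition number should agree to within roundoff effects, even though the intermediate Vandermonde solve is much worse conditioned for the monomial basis.

In [ ]:
def monomial_interp_and_eval(x, xx):
    # Same map as chebyshev_interp_and_eval, but built with the monomial basis
    A = numpy.vander(x, increasing=True)
    B = numpy.vander(xx, len(x), increasing=True)
    return B.dot(numpy.linalg.inv(A))

xm = numpy.linspace(-1, 1, 20)
xxm = numpy.linspace(-1, 1, 100)
print(numpy.linalg.cond(chebyshev_interp_and_eval(xm, xxm)),
      numpy.linalg.cond(monomial_interp_and_eval(xm, xxm)))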

In [103]:
A = chebyshev_interp_and_eval(numpy.linspace(-1,1,20), xx)
print(A.shape, numpy.linalg.cond(A))
U, S, Vh = numpy.linalg.svd(A, full_matrices=False)  # numpy returns the right factor transposed (Vh)
print(U.shape, S.shape, Vh.shape)
print(S[:4])
pyplot.figure()
pyplot.plot(x, Vh[:2].T)  # right singular vectors for the two largest singular values
pyplot.title('Worst input functions')


(100, 20) 4115.2853068
(100, 20) (20,) (20, 20)
[ 4141.82178207   756.18861167    11.20139444     5.20036681]
Out[103]:
<matplotlib.text.Text at 0x7f5f740c0f98>

In [110]:
pyplot.figure()
pyplot.plot(x, Vh[5])
pyplot.plot(xx, chebyshev_interp_and_eval(x, xx).dot(Vh[5]))


Out[110]:
[<matplotlib.lines.Line2D at 0x7f5f6fe0bd30>]

In [43]:
x = cosspace(-1,1,80)
numpy.linalg.norm(chebyshev_interp_and_eval(x, xx).dot(numpy.abs(x)) - numpy.abs(xx), numpy.inf)


Out[43]:
0.0044875206560248855

In [111]:
pyplot.figure()
pyplot.plot(xx, numpy.sin(1/xx))


Out[111]:
[<matplotlib.lines.Line2D at 0x7f5f6fd072b0>]

Accuracy

Up to this point, we have been primarily concerned with stability. That is, we have been looking for formulations in which small changes to the input do not produce large changes to the output. Intrinsically, this problem "should" have a norm of about 1 (with suitable scaling, or using the max ($\infty$) norm): changing the input function should change the output function by about the same amount.

Now we explore accuracy: the dependence of error on the cost of interpolation. We'll start with piecewise constant interpolation.


In [120]:
def find_nearest(x, xx):
    """For each target (xx), find the index of the nearest source point (x)"""
    i = x.argsort()  # Indices that sort x
    x = x[i]         # x sorted
    j = numpy.arange(len(i))[i] # Index into the original x
    loc = x.searchsorted(xx)
    loc -= abs(xx - x[loc-1]) < abs(xx - x.take(loc, mode='wrap'))
    return j[loc]

def piecewise_constant_interp_and_eval(x, xx):
    """Construct matrix for interpolation of data at x, evaluating at xx"""
    nearest = find_nearest(x, xx)
    A = numpy.zeros((len(xx), len(x)))
    A[numpy.arange(len(xx)), nearest] = 1
    return A

x = numpy.linspace(-1,1,15)
xx = numpy.linspace(-1, 1, 100)
pyplot.figure()
pyplot.ylim(-0.2, 1.2)
pyplot.plot(x, runge3(x), '*')
pyplot.plot(xx, piecewise_constant_interp_and_eval(x, xx).dot(runge3(x)), label='interp')
pyplot.plot(xx, runge3(xx), label='exact')


Out[120]:
[<matplotlib.lines.Line2D at 0x7f5f6fa4a748>]

In [121]:
print(numpy.linalg.cond(piecewise_constant_interp_and_eval(numpy.linspace(-1,1,20),
                                                           numpy.linspace(-1,1,100))))


1.41421356237

In [126]:
def maxerror(interp_and_eval, f, xspace, interval, npoints):
    error = []
    for n in npoints:
        x = xspace(*interval, n)
        xx = numpy.linspace(*interval, 300)
        A = interp_and_eval(x, xx)
        error.append(numpy.linalg.norm(A.dot(f(x)) - f(xx), numpy.inf))
    return error

npoints = numpy.arange(2, 30)
pyplot.figure()
pyplot.loglog(npoints, maxerror(piecewise_constant_interp_and_eval, runge1, numpy.linspace, (-1,1), npoints), label='piecewise_constant')
pyplot.loglog(npoints, maxerror(chebyshev_interp_and_eval, runge1, cosspace, (-1,1), npoints), label='chebyshev cos')
pyplot.loglog(npoints, maxerror(chebyshev_interp_and_eval, runge1, numpy.linspace, (-1,1), npoints), label='chebyshev lin')
pyplot.loglog(npoints, 1/npoints, label='slope=-1')
pyplot.loglog(npoints, npoints**(-2.), label='slope=-2')
pyplot.legend(loc='lower left')
pyplot.xlabel('Number of points')
pyplot.ylabel('Max error')


Out[126]:
<matplotlib.text.Text at 0x7f5f6f467518>

In [190]:
pyplot.figure()
pyplot.semilogy(npoints, maxerror(piecewise_constant_interp_and_eval, runge1, numpy.linspace, (-1,1), npoints), label='piecewise_constant')
pyplot.semilogy(npoints, maxerror(chebyshev_interp_and_eval, runge1, cosspace, (-1,1), npoints), label='chebyshev cos')
pyplot.semilogy(npoints, maxerror(chebyshev_interp_and_eval, runge1, numpy.linspace, (-1,1), npoints), label='chebyshev lin')
pyplot.semilogy(npoints, 1/npoints, label='$n^{-1}$')
pyplot.semilogy(npoints, npoints**(-2.), label='$n^{-2}$')
pyplot.legend(loc='lower left')
pyplot.xlabel('Number of points')
pyplot.ylabel('Max error')


Out[190]:
<matplotlib.text.Text at 0x7f5f6e2f8978>

Observations

  • Piecewise constant interpolation is stable on any set of points
  • Piecewise constant interpolation converges very slowly (needs many points to increase accuracy)
  • Chebyshev/polynomial interpolation requires special input points, otherwise it is not stable
  • Chebyshev/polynomial interpolation has "exponential" convergence

Splines

If we are given an arbitrary distribution of points, interpolation with a single polynomial is not robust. Piecewise constant interpolation is not very accurate and gives a rough function. We could improve the accuracy by using a piecewise linear function (see HW3), but the accuracy is still limited and the function still isn't smooth (there is a "corner" where the slope changes at each data point). Splines are a way to guarantee an arbitrary amount of smoothness. The idea is that given sorted input points $\{x_i\}_{i=0}^n$, we compute an interpolating polynomial $s_i(x)$ on every interval $(x_i, x_{i+1})$.

Interpolation

Given a function value $y_i$ at each $x_i$, we require $$\begin{split} s_i(x_i) &= y_i \\ s_i(x_{i+1}) &= y_{i+1} \end{split} $$ so that the polynomial interpolates our data. If the polynomial has order greater than 1, we are left with some extra degrees of freedom. To provide a unique solution, we'll need to add conditions.

Smoothness

The conditions above guarantee continuity, but not smoothness. We use our extra degree(s) of freedom to impose smoothness conditions of the form $$\begin{split} s_i'(x_{i+1}) &= s_{i+1}'(x_{i+1}) \\ s_i''(x_{i+1}) &= s_{i+1}''(x_{i+1}) . \end{split}$$ These conditions, which are applied at the interior nodes ($x_i$ for $i=1,\dotsc,n-1$), couple the splines from adjacent intervals and cause the spline approximation to be globally coupled.

End-point conditions

The conditions above are still not enough to guarantee a unique spline. Suppose we use quadratic polynomials for each $s_i$. Then with $n$ intervals, we have $n$ degrees of freedom after imposing the interpolation condition. Meanwhile, there are only $n-1$ internal nodes. If we impose continuity of the first derivative, we have $n - (n-1) = 1$ undetermined degree of freedom. We could fix this by imposing a boundary condition, such as requiring the slope at one end-point (e.g., $s_0'(x_0)$) to equal a known value. This is not symmetric and is often an unnatural condition.

Suppose we use cubic polynomials. Now we have two degrees of freedom per interval after imposing the interpolation condition. If we impose continuity of the first and second derivatives, we have $2n - 2(n-1) = 2$ remaining degrees of freedom. A common choice here is the "natural spline", $s_0''(x_0) = 0$ and $s_{n-1}''(x_n) = 0$. Cubic splines are the most popular spline in this family.

Solving spline interpolation problems

We need to choose a basis for the polynomials $s_i(x)$. We could choose $$ s_i(x) = a_i + b_i x + c_i x^2 + d_i x^3 $$ but this would be very ill-conditioned when the interval $(x_i,x_{i+1})$ is far from zero. A better-conditioned choice is $$ s_i(x) = a_i + b_i(x - x_i) + c_i(x - x_i)^2 + d_i(x - x_i)^3 . $$ The interpolation property gives $$\begin{split} a_i &= y_i \\ a_i + b_i(x_{i+1} - x_i) + c_i(x_{i+1} - x_i)^2 + d_i(x_{i+1} - x_i)^3 &= y_{i+1} \end{split}$$ and continuity of the first and second derivatives gives $$\begin{split} b_i + 2c_i(x_{i+1} - x_i) + 3d_i(x_{i+1}-x_i)^2 &= b_{i+1} \\ 2c_i + 6d_i(x_{i+1} - x_i) &= 2 c_{i+1} . \end{split}$$ After trivially eliminating the $a_i$, this is a block bidiagonal system ($3\times 3$ blocks). We can reduce this to a scalar tridiagonal system. Define $\delta_i = x_{i+1} - x_i$ and $\Delta_i = y_{i+1} - y_i$. Then eliminate $d_i$ using $$ d_i = \frac{c_{i+1} - c_i}{3\delta_i} $$ and $b_i$ using $$\begin{split} b_i\delta_i &= \Delta_i - c_i\delta_i^2 - \underbrace{\frac{c_{i+1} - c_i}{3\delta_i}}_{d_i} \delta_i^3 \\ b_i &= \frac{\Delta_i}{\delta_i} - \frac{\delta_i}{3}(c_{i+1} + 2c_i) . \end{split}$$ Substituting into the equation for continuity of the first derivative gives $$ \frac{\Delta_i}{\delta_i} - \frac{\delta_i}{3}(c_{i+1} + 2c_i) + 2c_i\delta_i + (c_{i+1} - c_i)\delta_i = \frac{\Delta_{i+1}}{\delta_{i+1}} - \frac{\delta_{i+1}}{3}(c_{i+2} + 2c_{i+1}) $$ which reduces to $$ \delta_i c_i + 2(\delta_i + \delta_{i+1}) c_{i+1} + \delta_{i+1} c_{i+2} = 3\left(\frac{\Delta_{i+1}}{\delta_{i+1}} - \frac{\Delta_i}{\delta_i}\right) . $$ To impose boundary conditions, we add a dummy interval on the right end (the actual value of $x_{n+1}>x_n$ cancels out) so that the equation above is valid for $i = 0,\dotsc,n-2$, and the natural right boundary condition $s_{n-1}''(x_n) = 0$ becomes $c_n = 0$ (by construction, $s_{n-1}''(x_n) = 2c_n$). The left boundary condition $s_0''(x_0) = 0$ yields $c_0 = 0$, so we must solve $$\begin{bmatrix} 1 & & & & & & \\ \delta_0 & 2(\delta_0+\delta_1) & \delta_1 & & & & \\ & \delta_1 & 2(\delta_1+\delta_2) & \delta_2 & & & \\ & & \ddots & \ddots & \ddots & & \\ & & & & \delta_{n-2} & 2(\delta_{n-2}+\delta_{n-1}) & \delta_{n-1} \\ & & & & & & 1 \\ \end{bmatrix} \begin{bmatrix} c_0 \\ c_1 \\ \vdots \\ \\ c_{n-1} \\ c_n \end{bmatrix} = \begin{bmatrix} 0 \\ 3\left(\frac{\Delta_1}{\delta_1} - \frac{\Delta_0}{\delta_0} \right) \\ \vdots \\ 3\left(\frac{\Delta_{n-1}}{\delta_{n-1}} - \frac{\Delta_{n-2}}{\delta_{n-2}}\right) \\ 0 \end{bmatrix} .$$ After solving this equation for $c_i$, we will recover $b_i$ and $d_i$, and then can evaluate the spline at arbitrary points.


In [197]:
def spline_cubic_interp_and_eval(x, xx):
    n = len(x) - 1
    def s_interp(y):
        perm = numpy.argsort(x)
        xs = x[perm]
        ys = y[perm]
        delta = xs[1:] - xs[:-1]
        Delta = ys[1:] - ys[:-1]
        
        # Assemble tridiagonal system (this can be optimized)
        T = numpy.zeros((n+1, n+1))
        T[0,0] = 1
        for i in range(1,n):
            T[i,i-1:i+2] = [delta[i-1], 2*(delta[i-1]+delta[i]), delta[i]]
        T[-1,-1] = 1
        rhs = numpy.zeros(n+1)
        rhs[1:-1] = 3*(Delta[1:]/delta[1:] - Delta[:-1]/delta[:-1])
        c = numpy.linalg.solve(T, rhs)
        S = numpy.zeros((n, 5))    # Matrix to hold splines as [xs,d,c,b,a]
        S[:,0] = xs[:-1]                                  # Sorted interval left ends
        S[:,2] = c[:-1]                                   # From tridiagonal solve
        S[:,4] = ys[:-1]                                  # Interpolation property
        S[:,1] = (c[1:] - c[:-1])/(3*delta)               # Recover d
        S[:,3] = Delta/delta - delta/3*(2*c[:-1] + c[1:]) # Recover b
        return S
    def s_eval(S, xx):
        left = S[:,0].searchsorted(xx) - 1
        left[left<0] = 0 # Use the leftmost interval even if xx<=x
        f = numpy.zeros_like(xx)
        for i,t in enumerate(xx):
            f[i] = numpy.polyval(S[left[i],1:], xx[i] - S[left[i],0])
        return f
    
    # Build an explicit matrix for the spline fit evaluated at xx
    A = numpy.zeros((len(xx), len(x)))
    for i,e in enumerate(numpy.eye(len(x), len(x))):
        S = s_interp(e)
        A[:,i] = s_eval(S, xx)
    return A

spline_cubic_interp_and_eval(numpy.linspace(-1,1,3), numpy.linspace(-1,1,6))


Out[197]:
array([[ 1.   ,  0.   ,  0.   ],
       [ 0.516,  0.568, -0.084],
       [ 0.128,  0.944, -0.072],
       [-0.072,  0.944,  0.128],
       [-0.084,  0.568,  0.516],
       [ 0.   ,  0.   ,  1.   ]])
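
As a sanity check (assuming SciPy is available in this environment; it is not otherwise used in this notebook), the same matrix can be reproduced with scipy.interpolate.CubicSpline using natural boundary conditions:

In [ ]:
from scipy.interpolate import CubicSpline

xs = numpy.linspace(-1, 1, 3)
xxs = numpy.linspace(-1, 1, 6)
A = numpy.zeros((len(xxs), len(xs)))
for i, e in enumerate(numpy.eye(len(xs))):
    # Column i is the natural cubic spline through the i-th unit vector
    A[:, i] = CubicSpline(xs, e, bc_type='natural')(xxs)
print(A)  # should match spline_cubic_interp_and_eval(xs, xxs) up to rounding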

In [184]:
x = numpy.linspace(-1,1,10)
xx = numpy.linspace(-1, 1, 100)
pyplot.figure()
pyplot.ylim(-0.2, 1.2)
pyplot.plot(x, runge3(x), '*')
pyplot.plot(xx, spline_cubic_interp_and_eval(x, xx).dot(runge3(x)), label='spline')
pyplot.plot(xx, runge3(xx), label='exact')
pyplot.legend(loc='upper left')


Out[184]:
<matplotlib.legend.Legend at 0x7f5f6ebbfc18>

In [192]:
# Check the accuracy of cubic splines for smooth functions

npoints = numpy.arange(2,30)
pyplot.figure()
pyplot.loglog(npoints, maxerror(piecewise_constant_interp_and_eval, runge1, numpy.linspace, (-1,1), npoints), label='piecewise_constant')
pyplot.loglog(npoints, maxerror(chebyshev_interp_and_eval, runge1, cosspace, (-1,1), npoints), label='chebyshev cos')
pyplot.loglog(npoints, maxerror(spline_cubic_interp_and_eval, runge1, numpy.linspace, (-1,1), npoints), label='spline_cubic')
pyplot.loglog(npoints, 1/npoints, label='$n^{-1}$')
pyplot.loglog(npoints, npoints**(-2.), label='$n^{-2}$')
pyplot.legend(loc='lower left')
pyplot.xlabel('Number of points')
pyplot.ylabel('Max error')


Out[192]:
<matplotlib.text.Text at 0x7f5f6e00cf28>

Observations

  • Splines are smooth by construction
  • Splines can have artifacts at sharp transitions
  • Fitting with splines requires solving a tridiagonal system, $O(n)$ cost
  • Moving, adding, or removing any control point $x_i$ requires rebuilding the tridiagonal matrix (not a local operation)
  • Splines appear to be reasonably accurate, at least for smooth functions
  • Evaluating splines requires finding the interval in which to evaluate
  • Storing splines appears to require more data than the input function (store $n\times 5$ matrix S)

In [195]:
# Are splines well conditioned?

print(numpy.linalg.cond(
        spline_cubic_interp_and_eval(numpy.linspace(-1,1,20),
                                     numpy.linspace(-1,1,1000))))


2.05003425874

Outlook

  • The artifacts at sharp transitions can be relieved by relaxing continuity. For example, splines can be broken at any point by replacing derivative continuity with endpoint conditions. This also decouples the tridiagonal system. Such methods produce Bezier curves, which are used for many applications including fonts.
  • B-splines are a more computationally convenient representation of a piecewise polynomial curve, but are not inherently interpolatory. They are local in the sense that modifying control points has only local effects (see the basis-function sketch after this list).
  • NURBS (Non-Uniform Rational B-Splines) are an extension of B-splines to use rational functions. This allows exact representation of conic sections. NURBS are used heavily for Computer Aided Design (CAD) and other engineering applications.
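
To illustrate the locality of B-splines, here is a small sketch (an illustrative addition, not part of the original notes) that evaluates cubic B-spline basis functions directly from the Cox-de Boor recurrence; each basis function is nonzero on only a few knot intervals, which is why moving one control point has only a local effect.

In [ ]:
def bspline_basis(t, k, knots):
    """Evaluate all degree-k B-spline basis functions on `knots` at the points t
    using the Cox-de Boor recurrence."""
    t = numpy.asarray(t)
    knots = numpy.asarray(knots, dtype=float)
    # Degree 0: indicator functions of the knot intervals
    B = numpy.array([(knots[i] <= t) & (t < knots[i+1])
                     for i in range(len(knots) - 1)], dtype=float).T
    for d in range(1, k+1):
        Bnew = numpy.zeros((len(t), len(knots) - 1 - d))
        for i in range(len(knots) - 1 - d):
            left = knots[i+d] - knots[i]
            right = knots[i+d+1] - knots[i+1]
            if left > 0:
                Bnew[:,i] += (t - knots[i]) / left * B[:,i]
            if right > 0:
                Bnew[:,i] += (knots[i+d+1] - t) / right * B[:,i+1]
        B = Bnew
    return B

knots = numpy.concatenate(([0, 0, 0], numpy.linspace(0, 1, 6), [1, 1, 1]))
tt = numpy.linspace(0, 1 - 1e-12, 200)   # stay inside the half-open last interval
pyplot.figure()
pyplot.plot(tt, bspline_basis(tt, 3, knots))
pyplot.title('Cubic B-spline basis functions')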

Regression

Long ago, we solved an over-determined linear system to compute a polynomial approximation of a function. Sometimes we make approximations because we are interested in the polynomial coefficients. That is usually only for very low order (like linear or quadratic fits). Inferring higher coefficients is ill-conditioned (as we saw with Vandermonde matrices) and probably not meaningful. We now know that Chebyshev bases are good for representing high-degree polynomials, but if the points are arbitrarily spaced, how many do we need for the Chebyshev approximation to be well-conditioned?


In [201]:
def chebyshev_regress_eval(x, xx, n):
    V = vander_chebyshev(x, n)
    Q, R = numpy.linalg.qr(V)
    return vander_chebyshev(xx, n).dot(numpy.linalg.inv(R).dot(Q.T))


print(chebyshev_regress_eval(x, xx, 5))
pyplot.figure()
pyplot.plot(xx, chebyshev_regress_eval(x, xx, 5).dot(runge1(x)), label='p(x)')
pyplot.plot(xx, runge1(xx), label='runge1(x)')
pyplot.legend(loc='upper left')


[[  9.37062937e-01   1.74825175e-01  -8.74125874e-02  -8.74125874e-02
   -2.77555756e-17   6.29370629e-02   5.24475524e-02  -1.74825175e-02
   -6.99300699e-02   3.49650350e-02]
 [  8.39651606e-01   2.25743828e-01  -2.08663397e-02  -6.20855649e-02
   -1.95267815e-02   2.54908104e-02   3.19417452e-02  -9.05709276e-04
   -3.34895508e-02   1.40459568e-02]
 [  7.48567224e-01   2.71099424e-01   4.01948244e-02  -3.75683272e-02
   -3.56460583e-02  -7.52867104e-03   1.32592574e-02   1.31588751e-02
   -1.42294470e-03  -4.11360377e-03]
 [  6.63566677e-01   3.11156412e-01   9.59964267e-02  -1.38681185e-02
   -4.85536593e-02  -3.63382264e-02  -3.66144567e-03   2.48754615e-02
    2.65096787e-02  -1.96832059e-02]
 [  5.84411149e-01   3.46173986e-01   1.46759929e-01   9.00853353e-03
   -5.84411149e-02  -6.11504016e-02  -1.88811826e-02   3.44042156e-02
    5.05429957e-02  -3.28281104e-02]
 [  5.10866123e-01   3.76406088e-01   1.92702734e-01   3.10558178e-02
   -6.54956568e-02  -8.21734443e-02  -3.24600553e-02   4.19012436e-02
    7.09064289e-02  -4.37092789e-02]
 [  4.42701380e-01   4.02101405e-01   2.34038183e-01   5.22686395e-02
   -6.99002179e-02  -9.96113033e-02  -4.44574493e-02   4.75185914e-02
    8.78241468e-02  -5.24833748e-02]
 [  3.79691001e-01   4.23503369e-01   2.70975558e-01   7.26426205e-02
   -7.18334326e-02  -1.13663628e-01  -5.49320338e-02   5.14042454e-02
    1.01515064e-01  -5.93027627e-02]
 [  3.21613364e-01   4.40850159e-01   3.03720083e-01   9.21740987e-02
   -7.14696364e-02  -1.24525771e-01  -6.39417614e-02   5.37021319e-02
    1.12192841e-01  -6.43155083e-02]
 [  2.68251146e-01   4.54374701e-01   3.32472919e-01   1.10860129e-01
   -6.89788662e-02  -1.32388785e-01  -7.15438683e-02   5.45521175e-02
    1.20065885e-01  -6.76653788e-02]
 [  2.19391325e-01   4.64304666e-01   3.57431170e-01   1.28698482e-01
   -6.45268603e-02  -1.37439422e-01  -7.77948743e-02   5.40900088e-02
    1.25337347e-01  -6.94918428e-02]
 [  1.74825175e-01   4.70862471e-01   3.78787879e-01   1.45687646e-01
   -5.82750583e-02  -1.39860140e-01  -8.27505828e-02   5.24475524e-02
    1.28205128e-01  -6.99300699e-02]
 [  1.34348269e-01   4.74265279e-01   3.96732027e-01   1.61826824e-01
   -5.03806009e-02  -1.39829094e-01  -8.64660804e-02   4.97524355e-02
    1.28861872e-01  -6.91109314e-02]
 [  9.77604801e-02   4.74725000e-01   4.11448539e-01   1.77115937e-01
   -4.09963304e-02  -1.37520142e-01  -8.89957378e-02   4.61282849e-02
    1.27494969e-01  -6.71609996e-02]
 [  6.48659789e-02   4.72448288e-01   4.23118278e-01   1.91555621e-01
   -3.02707902e-02  -1.33102844e-01  -9.03932088e-02   4.16946678e-02
    1.24286557e-01  -6.42025484e-02]
 [  3.54732352e-02   4.67636546e-01   4.31918047e-01   2.05147231e-01
   -1.83482251e-02  -1.26742460e-01  -9.07114310e-02   3.65670916e-02
    1.19413519e-01  -6.03535526e-02]
 [  9.39501725e-03   4.60485920e-01   4.38020589e-01   2.17892835e-01
   -5.36858129e-03  -1.18599953e-01  -9.00026254e-02   3.08570036e-02
    1.13047483e-01  -5.57276887e-02]
 [ -1.35516079e-02   4.51187305e-01   4.41594588e-01   2.29795220e-01
    8.53249385e-03  -1.08831985e-01  -8.83182967e-02   2.46717913e-02
    1.05354825e-01  -5.04343345e-02]
 [ -3.35452745e-02   4.39926340e-01   4.42804668e-01   2.40857889e-01
    2.32236516e-02  -9.75909208e-02  -8.57092331e-02   1.81147826e-02
    9.64966664e-02  -4.45785688e-02]
 [ -5.07603182e-02   4.26883410e-01   4.41811393e-01   2.51085060e-01
    3.85778418e-02  -8.50248265e-02  -8.22255062e-02   1.12852450e-02
    8.66288733e-02  -3.82611719e-02]
 [ -6.53667758e-02   4.12233647e-01   4.38771267e-01   2.60481669e-01
    5.44723132e-02  -7.12774692e-02  -7.79164714e-02   4.27838673e-03
    7.59020594e-02  -3.15786256e-02]
 [ -7.75303857e-02   3.96146928e-01   4.33836733e-01   2.69053369e-01
    7.07886131e-02  -5.64883174e-02  -7.28307676e-02  -2.81464432e-03
    6.44615840e-02  -2.46231127e-02]
 [ -8.74125874e-02   3.78787879e-01   4.27156177e-01   2.76806527e-01
    8.74125874e-02  -4.07925408e-02  -6.70163170e-02  -9.90675991e-03
    5.24475524e-02  -1.74825175e-02]
 [ -9.51705217e-02   3.60315867e-01   4.18873923e-01   2.83748228e-01
    1.04234381e-01  -2.43210106e-02  -6.05203257e-02  -1.69149317e-02
    3.99948161e-02  -1.02404255e-02]
 [ -1.00957031e-01   3.40885010e-01   4.09130234e-01   2.89886274e-01
    1.21148437e-01  -7.20029919e-03  -5.33892831e-02  -2.37601913e-02
    2.72329724e-02  -2.97612366e-03]
 [ -1.04920658e-01   3.20644169e-01   3.98061316e-01   2.95229183e-01
    1.38053498e-01   1.04473197e-02  -4.56689623e-02  -3.03676300e-02
    1.42863648e-02   4.23539987e-03]
 [ -1.07205649e-01   2.99736952e-01   3.85799314e-01   2.99786188e-01
    1.54852604e-01   2.85038710e-02  -3.74044199e-02  -3.66663992e-02
    1.27408279e-03   1.13234556e-02]
 [ -1.07951948e-01   2.78301713e-01   3.72472313e-01   3.03567241e-01
    1.71453094e-01   4.68556783e-02  -2.86399960e-02  -4.25897100e-02
   -1.16900380e-02   1.82216527e-02]
 [ -1.07295205e-01   2.56471552e-01   3.58204337e-01   3.06583008e-01
    1.87766608e-01   6.53933640e-02  -1.94193144e-02  -4.80748334e-02
   -2.44974159e-02   2.48678990e-02]
 [ -1.05366766e-01   2.34374316e-01   3.43115352e-01   3.08844873e-01
    2.03709081e-01   8.40118491e-02  -9.78528222e-03  -5.30631004e-02
   -3.70447234e-02   3.12044011e-02]
 [ -1.02293683e-01   2.12132596e-01   3.27321263e-01   3.10364936e-01
    2.19200750e-01   1.02610353e-01   2.19909635e-04  -5.74999017e-02
   -4.92338866e-02   3.71776642e-02]
 [ -9.81987072e-02   1.89863730e-01   3.10933915e-01   3.11156014e-01
    2.34166148e-01   1.21092395e-01   1.05547868e-02  -6.13346879e-02
   -6.09720858e-02   4.27384923e-02]
 [ -9.32002906e-02   1.67679803e-01   2.94061095e-01   3.11231639e-01
    2.48534108e-01   1.39365791e-01   2.11785915e-02  -6.45209695e-02
   -7.21717551e-02   4.78419879e-02]
 [ -8.74125874e-02   1.45687646e-01   2.76806527e-01   3.10606061e-01
    2.62237762e-01   1.57342657e-01   3.20512821e-02  -6.70163170e-02
   -8.27505828e-02   5.24475524e-02]
 [ -8.09454530e-02   1.23988834e-01   2.59269878e-01   3.09294245e-01
    2.75214540e-01   1.74939408e-01   4.31335336e-02  -6.87823605e-02
   -9.26315107e-02   5.65188858e-02]
 [ -7.39044440e-02   1.02679689e-01   2.41546753e-01   3.07311874e-01
    2.87406171e-01   1.92076757e-01   5.43867375e-02  -6.97847903e-02
   -1.01742735e-01   6.00239867e-02]
 [ -6.63908184e-02   8.18512817e-02   2.23728699e-01   3.04675347e-01
    2.98758683e-01   2.08679716e-01   6.57730017e-02  -6.99933561e-02
   -1.10017706e-01   6.29351524e-02]
 [ -5.85015353e-02   6.15894249e-02   2.05903203e-01   3.01401778e-01
    3.09222401e-01   2.24677595e-01   7.72551504e-02  -6.93818680e-02
   -1.17395127e-01   6.52289791e-02]
 [ -5.03292555e-02   4.19746795e-02   1.88153689e-01   2.97508999e-01
    3.18751951e-01   2.40004003e-01   8.87967245e-02  -6.79281957e-02
   -1.23818956e-01   6.68863614e-02]
 [ -4.19623406e-02   2.30823522e-02   1.70559525e-01   2.93015559e-01
    3.27306257e-01   2.54596848e-01   1.00361981e-01  -6.56142687e-02
   -1.29238405e-01   6.78924927e-02]
 [ -3.34848541e-02   4.98249559e-03   1.53196017e-01   2.87940721e-01
    3.34848541e-01   2.68398337e-01   1.11915894e-01  -6.24260765e-02
   -1.33607940e-01   6.82368652e-02]
 [ -2.49765602e-02  -1.22600915e-02   1.36134413e-01   2.82304467e-01
    3.41346323e-01   2.81354975e-01   1.23424153e-01  -5.83536685e-02
   -1.36887280e-01   6.79132697e-02]
 [ -1.65129250e-02  -2.85858644e-02   1.19441898e-01   2.76127494e-01
    3.46771424e-01   2.93417565e-01   1.34853166e-01  -5.33911538e-02
   -1.39041399e-01   6.69197956e-02]
 [ -8.16511540e-03  -4.39405323e-02   1.03181601e-01   2.69431215e-01
    3.51099962e-01   3.04541212e-01   1.46170054e-01  -4.75367016e-02
   -1.40040525e-01   6.52588311e-02]
 [  6.93889390e-17  -5.82750583e-02   8.74125874e-02   2.62237762e-01
    3.54312354e-01   3.14685315e-01   1.57342657e-01  -4.07925408e-02
   -1.39860140e-01   6.29370629e-02]
 [  7.91985148e-03  -7.15456594e-02   7.21898653e-02   2.54569981e-01
    3.56393316e-01   3.23813575e-01   1.68339532e-01  -3.31649602e-02
   -1.38480978e-01   5.99654768e-02]
 [  1.55361680e-02  -8.37138069e-02   5.75643820e-02   2.46451435e-01
    3.57331863e-01   3.31893990e-01   1.79129951e-01  -2.46643084e-02
   -1.35889030e-01   5.63593568e-02]
 [  2.27949770e-02  -9.47462255e-02   4.35830249e-02   2.37906403e-01
    3.57121307e-01   3.38898858e-01   1.89683902e-01  -1.53049942e-02
   -1.32075538e-01   5.21382859e-02]
 [  2.96466050e-02  -1.04614894e-01   3.02886217e-02   2.28959882e-01
    3.55759260e-01   3.44804775e-01   1.99972090e-01  -5.10548578e-03
   -1.27037000e-01   4.73261456e-02]
 [  3.60456769e-02  -1.13297046e-01   1.77199403e-02   2.19637584e-01
    3.53247633e-01   3.49592636e-01   2.09965938e-01   5.91168843e-03
   -1.20775168e-01   4.19511163e-02]
 [  4.19511163e-02  -1.20775168e-01   5.91168843e-03   2.09965938e-01
    3.49592636e-01   3.53247633e-01   2.19637584e-01   1.77199403e-02
   -1.13297046e-01   3.60456769e-02]
 [  4.73261456e-02  -1.27037000e-01  -5.10548578e-03   1.99972090e-01
    3.44804775e-01   3.55759260e-01   2.28959882e-01   3.02886217e-02
   -1.04614894e-01   2.96466050e-02]
 [  5.21382859e-02  -1.32075538e-01  -1.53049942e-02   1.89683902e-01
    3.38898858e-01   3.57121307e-01   2.37906403e-01   4.35830249e-02
   -9.47462255e-02   2.27949770e-02]
 [  5.63593568e-02  -1.35889030e-01  -2.46643084e-02   1.79129951e-01
    3.31893990e-01   3.57331863e-01   2.46451435e-01   5.75643820e-02
   -8.37138069e-02   1.55361680e-02]
 [  5.99654768e-02  -1.38480978e-01  -3.31649602e-02   1.68339532e-01
    3.23813575e-01   3.56393316e-01   2.54569981e-01   7.21898653e-02
   -7.15456594e-02   7.91985148e-03]
 [  6.29370629e-02  -1.39860140e-01  -4.07925408e-02   1.57342657e-01
    3.14685315e-01   3.54312354e-01   2.62237762e-01   8.74125874e-02
   -5.82750583e-02   6.93889390e-17]
 [  6.52588311e-02  -1.40040525e-01  -4.75367016e-02   1.46170054e-01
    3.04541212e-01   3.51099962e-01   2.69431215e-01   1.03181601e-01
   -4.39405323e-02  -8.16511540e-03]
 [  6.69197956e-02  -1.39041399e-01  -5.33911538e-02   1.34853166e-01
    2.93417565e-01   3.46771424e-01   2.76127494e-01   1.19441898e-01
   -2.85858644e-02  -1.65129250e-02]
 [  6.79132697e-02  -1.36887280e-01  -5.83536685e-02   1.23424153e-01
    2.81354975e-01   3.41346323e-01   2.82304467e-01   1.36134413e-01
   -1.22600915e-02  -2.49765602e-02]
 [  6.82368652e-02  -1.33607940e-01  -6.24260765e-02   1.11915894e-01
    2.68398337e-01   3.34848541e-01   2.87940721e-01   1.53196017e-01
    4.98249559e-03  -3.34848541e-02]
 [  6.78924927e-02  -1.29238405e-01  -6.56142687e-02   1.00361981e-01
    2.54596848e-01   3.27306257e-01   2.93015559e-01   1.70559525e-01
    2.30823522e-02  -4.19623406e-02]
 [  6.68863614e-02  -1.23818956e-01  -6.79281957e-02   8.87967245e-02
    2.40004003e-01   3.18751951e-01   2.97508999e-01   1.88153689e-01
    4.19746795e-02  -5.03292555e-02]
 [  6.52289791e-02  -1.17395127e-01  -6.93818680e-02   7.72551504e-02
    2.24677595e-01   3.09222401e-01   3.01401778e-01   2.05903203e-01
    6.15894249e-02  -5.85015353e-02]
 [  6.29351524e-02  -1.10017706e-01  -6.99933561e-02   6.57730017e-02
    2.08679716e-01   2.98758683e-01   3.04675347e-01   2.23728699e-01
    8.18512817e-02  -6.63908184e-02]
 [  6.00239867e-02  -1.01742735e-01  -6.97847903e-02   5.43867375e-02
    1.92076757e-01   2.87406171e-01   3.07311874e-01   2.41546753e-01
    1.02679689e-01  -7.39044440e-02]
 [  5.65188858e-02  -9.26315107e-02  -6.87823605e-02   4.31335336e-02
    1.74939408e-01   2.75214540e-01   3.09294245e-01   2.59269878e-01
    1.23988834e-01  -8.09454530e-02]
 [  5.24475524e-02  -8.27505828e-02  -6.70163170e-02   3.20512821e-02
    1.57342657e-01   2.62237762e-01   3.10606061e-01   2.76806527e-01
    1.45687646e-01  -8.74125874e-02]
 [  4.78419879e-02  -7.21717551e-02  -6.45209695e-02   2.11785915e-02
    1.39365791e-01   2.48534108e-01   3.11231639e-01   2.94061095e-01
    1.67679803e-01  -9.32002906e-02]
 [  4.27384923e-02  -6.09720858e-02  -6.13346879e-02   1.05547868e-02
    1.21092395e-01   2.34166148e-01   3.11156014e-01   3.10933915e-01
    1.89863730e-01  -9.81987072e-02]
 [  3.71776642e-02  -4.92338866e-02  -5.74999017e-02   2.19909635e-04
    1.02610353e-01   2.19200750e-01   3.10364936e-01   3.27321263e-01
    2.12132596e-01  -1.02293683e-01]
 [  3.12044011e-02  -3.70447234e-02  -5.30631004e-02  -9.78528222e-03
    8.40118491e-02   2.03709081e-01   3.08844873e-01   3.43115352e-01
    2.34374316e-01  -1.05366766e-01]
 [  2.48678990e-02  -2.44974159e-02  -4.80748334e-02  -1.94193144e-02
    6.53933640e-02   1.87766608e-01   3.06583008e-01   3.58204337e-01
    2.56471552e-01  -1.07295205e-01]
 [  1.82216527e-02  -1.16900380e-02  -4.25897100e-02  -2.86399960e-02
    4.68556783e-02   1.71453094e-01   3.03567241e-01   3.72472313e-01
    2.78301713e-01  -1.07951948e-01]
 [  1.13234556e-02   1.27408279e-03  -3.66663992e-02  -3.74044199e-02
    2.85038710e-02   1.54852604e-01   2.99786188e-01   3.85799314e-01
    2.99736952e-01  -1.07205649e-01]
 [  4.23539987e-03   1.42863648e-02  -3.03676300e-02  -4.56689623e-02
    1.04473197e-02   1.38053498e-01   2.95229183e-01   3.98061316e-01
    3.20644169e-01  -1.04920658e-01]
 [ -2.97612366e-03   2.72329724e-02  -2.37601913e-02  -5.33892831e-02
   -7.20029919e-03   1.21148437e-01   2.89886274e-01   4.09130234e-01
    3.40885010e-01  -1.00957031e-01]
 [ -1.02404255e-02   3.99948161e-02  -1.69149317e-02  -6.05203257e-02
   -2.43210106e-02   1.04234381e-01   2.83748228e-01   4.18873923e-01
    3.60315867e-01  -9.51705217e-02]
 [ -1.74825175e-02   5.24475524e-02  -9.90675991e-03  -6.70163170e-02
   -4.07925408e-02   8.74125874e-02   2.76806527e-01   4.27156177e-01
    3.78787879e-01  -8.74125874e-02]
 [ -2.46231127e-02   6.44615840e-02  -2.81464432e-03  -7.28307676e-02
   -5.64883174e-02   7.07886131e-02   2.69053369e-01   4.33836733e-01
    3.96146928e-01  -7.75303857e-02]
 [ -3.15786256e-02   7.59020594e-02   4.27838673e-03  -7.79164714e-02
   -7.12774692e-02   5.44723132e-02   2.60481669e-01   4.38771267e-01
    4.12233647e-01  -6.53667758e-02]
 [ -3.82611719e-02   8.66288733e-02   1.12852450e-02  -8.22255062e-02
   -8.50248265e-02   3.85778418e-02   2.51085060e-01   4.41811393e-01
    4.26883410e-01  -5.07603182e-02]
 [ -4.45785688e-02   9.64966664e-02   1.81147826e-02  -8.57092331e-02
   -9.75909208e-02   2.32236516e-02   2.40857889e-01   4.42804668e-01
    4.39926340e-01  -3.35452745e-02]
 [ -5.04343345e-02   1.05354825e-01   2.46717913e-02  -8.83182967e-02
   -1.08831985e-01   8.53249385e-03   2.29795220e-01   4.41594588e-01
    4.51187305e-01  -1.35516079e-02]
 [ -5.57276887e-02   1.13047483e-01   3.08570036e-02  -9.00026254e-02
   -1.18599953e-01  -5.36858129e-03   2.17892835e-01   4.38020589e-01
    4.60485920e-01   9.39501725e-03]
 [ -6.03535526e-02   1.19413519e-01   3.65670916e-02  -9.07114310e-02
   -1.26742460e-01  -1.83482251e-02   2.05147231e-01   4.31918047e-01
    4.67636546e-01   3.54732352e-02]
 [ -6.42025484e-02   1.24286557e-01   4.16946678e-02  -9.03932088e-02
   -1.33102844e-01  -3.02707902e-02   1.91555621e-01   4.23118278e-01
    4.72448288e-01   6.48659789e-02]
 [ -6.71609996e-02   1.27494969e-01   4.61282849e-02  -8.89957378e-02
   -1.37520142e-01  -4.09963304e-02   1.77115937e-01   4.11448539e-01
    4.74725000e-01   9.77604801e-02]
 [ -6.91109314e-02   1.28861872e-01   4.97524355e-02  -8.64660804e-02
   -1.39829094e-01  -5.03806009e-02   1.61826824e-01   3.96732027e-01
    4.74265279e-01   1.34348269e-01]
 [ -6.99300699e-02   1.28205128e-01   5.24475524e-02  -8.27505828e-02
   -1.39860140e-01  -5.82750583e-02   1.45687646e-01   3.78787879e-01
    4.70862471e-01   1.74825175e-01]
 [ -6.94918428e-02   1.25337347e-01   5.40900088e-02  -7.77948743e-02
   -1.37439422e-01  -6.45268603e-02   1.28698482e-01   3.57431170e-01
    4.64304666e-01   2.19391325e-01]
 [ -6.76653788e-02   1.20065885e-01   5.45521175e-02  -7.15438683e-02
   -1.32388785e-01  -6.89788662e-02   1.10860129e-01   3.32472919e-01
    4.54374701e-01   2.68251146e-01]
 [ -6.43155083e-02   1.12192841e-01   5.37021319e-02  -6.39417614e-02
   -1.24525771e-01  -7.14696364e-02   9.21740987e-02   3.03720083e-01
    4.40850159e-01   3.21613364e-01]
 [ -5.93027627e-02   1.01515064e-01   5.14042454e-02  -5.49320338e-02
   -1.13663628e-01  -7.18334326e-02   7.26426205e-02   2.70975558e-01
    4.23503369e-01   3.79691001e-01]
 [ -5.24833748e-02   8.78241468e-02   4.75185914e-02  -4.44574493e-02
   -9.96113033e-02  -6.99002179e-02   5.22686395e-02   2.34038183e-01
    4.02101405e-01   4.42701380e-01]
 [ -4.37092789e-02   7.09064289e-02   4.19012436e-02  -3.24600553e-02
   -8.21734443e-02  -6.54956568e-02   3.10558178e-02   1.92702734e-01
    3.76406088e-01   5.10866123e-01]
 [ -3.28281104e-02   5.05429957e-02   3.44042156e-02  -1.88811826e-02
   -6.11504016e-02  -5.84411149e-02   9.00853353e-03   1.46759929e-01
    3.46173986e-01   5.84411149e-01]
 [ -1.96832059e-02   2.65096787e-02   2.48754615e-02  -3.66144567e-03
   -3.63382264e-02  -4.85536593e-02  -1.38681185e-02   9.59964267e-02
    3.11156412e-01   6.63566677e-01]
 [ -4.11360377e-03  -1.42294470e-03   1.31588751e-02   1.32592574e-02
   -7.52867104e-03  -3.56460583e-02  -3.75683272e-02   4.01948244e-02
    2.71099424e-01   7.48567224e-01]
 [  1.40459568e-02  -3.34895508e-02  -9.05709276e-04   3.19417452e-02
    2.54908104e-02  -1.95267815e-02  -6.20855649e-02  -2.08663397e-02
    2.25743828e-01   8.39651606e-01]
 [  3.49650350e-02  -6.99300699e-02  -1.74825175e-02   5.24475524e-02
    6.29370629e-02   5.55111512e-17  -8.74125874e-02  -8.74125874e-02
    1.74825175e-01   9.37062937e-01]]
Out[201]:
<matplotlib.legend.Legend at 0x7f5f6de63da0>
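
Returning to the question posed at the start of this section, here is a small experiment (an added sketch; output not shown) that tracks the conditioning of a fixed-degree Chebyshev least-squares fit as the number of equally spaced sample points grows:

In [ ]:
# Condition number of the n=10 (degree 9) Chebyshev regression map as the
# number of equally spaced sample points m increases; m == n is interpolation.
xxr = numpy.linspace(-1, 1, 300)
for m in (10, 20, 40, 80, 160):
    xr = numpy.linspace(-1, 1, m)
    print(m, numpy.linalg.cond(chebyshev_regress_eval(xr, xxr, 10)))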

In [ ]: