Lagrange Polynomials

This tutorial uses the same example as in the introductory example tutorial.

Lagrange polynomials are not a method for creating orthogonal polynomials. Rather, they form an interpolation method: a polynomial expansion in which each polynomial takes the value 1 at exactly one of the interpolation points and the value 0 at all the others.
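In one dimension this Kronecker-delta property follows directly from the classical Lagrange basis construction $\ell_j(x) = \prod_{k \ne j} (x - x_k)/(x_j - x_k)$. A minimal plain-numpy sketch (the node locations here are hypothetical, chosen only for illustration):

```python
import numpy as np

def lagrange_basis(nodes, j, x):
    """1D Lagrange basis: ell_j(x) = prod_{k != j} (x - x_k) / (x_j - x_k)."""
    return np.prod([(x - xk) / (nodes[j] - xk)
                    for k, xk in enumerate(nodes) if k != j], axis=0)

nodes = np.array([0.0, 0.5, 1.0])  # hypothetical interpolation nodes
values = np.array([[lagrange_basis(nodes, j, x) for x in nodes]
                   for j in range(len(nodes))])
# Each basis polynomial is 1 at its own node and 0 at the others,
# so `values` is the 3x3 identity matrix.
```

The chaospy constructor used below generalizes this construction to multiple dimensions.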

To summarize, we need to do the following:

  • Generate $Q_1, ..., Q_N = (I_1, R_1), ..., (I_N, R_N)$, using (pseudo-)random samples or otherwise.
  • Construct a Lagrange polynomial expansion $P_1, ..., P_N$ from the samples $Q_1, ..., Q_N$.
  • Evaluate the model predictions $U_1, ..., U_N$ for each sample.
  • Analyze the model approximation $u(t; I, R) = \sum_n U_n(t) P_n(I, R)$ instead of the actual model solver.

Generating Samples

In chaospy, low-discrepancy sequences can be created using the distribution.sample method, and quadrature points using chaospy.generate_quadrature. For example:


In [1]:
import numpy
from matplotlib import pyplot
from chaospy.example import distribution
%matplotlib inline

# Generate samples:
numpy.random.seed(1000)
samples = distribution.sample(5, rule="halton")

# Show samples as 2-dimensional scatter plot
pyplot.scatter(*samples, marker="x", color="k")
    
# Make scatter figure pretty
pyplot.xlabel("Parameter $I$ (normal)")
pyplot.ylabel("Parameter $R$ (uniform)")
pyplot.axis([1, 2, 0.1, 0.2])

pyplot.show()


Create Lagrange Polynomial Expansion

An expansion of Lagrange polynomial terms can be created using the following constructor:


In [2]:
import chaospy

polynomial_expansion = chaospy.lagrange_polynomial(samples)
polynomial_expansion[0].round(2)


Out[2]:
polynomial(-694.24*q1**2+28.97*q0*q1+170.1*q1-6.65*q0-5.96)

One can verify that the returned polynomials satisfy the defining property: each evaluates to 1 at one of the samples used in the construction and to 0 at all the others:


In [3]:
polynomial_expansion(*samples).round(8)


Out[3]:
array([[ 1., -0., -0., -0., -0.],
       [ 0.,  1.,  0.,  0.,  0.],
       [ 0.,  0.,  1.,  0.,  0.],
       [-0., -0., -0.,  1.,  0.],
       [-0., -0., -0., -0.,  1.]])

Evaluate Model Predictions and Create Model Approximation

Fitting the polynomials to the evaluations does not require regression. Instead, it is enough to multiply the Lagrange polynomial expansion with the evaluations and sum the terms:


In [4]:
from chaospy.example import exponential_model

model_evaluations = numpy.array([
    exponential_model(sample) for sample in samples.T])
model_approximation = chaospy.sum(
    model_evaluations.T*polynomial_expansion, axis=-1).T

model_approximation.shape


Out[4]:
(1000,)

Assess Statistics

The results are close to what we are used to from the other methods:


In [5]:
from chaospy.example import coordinates

# Estimate mean and variance
expected = chaospy.E(model_approximation, distribution)
variance = chaospy.Var(model_approximation, distribution)

# Create band one standard deviation away from mean 
pyplot.fill_between(coordinates, expected-variance**0.5,
                    expected+variance**0.5, alpha=0.3,
                    color="k", label="standard deviation")
pyplot.plot(coordinates, expected, "k-",
            label="Estimated expected value")

# Make plot pretty
pyplot.axis([0, 10, 0, 2])
pyplot.xlabel("coordinates $t$")
pyplot.ylabel("Model evaluations $u$")

pyplot.show()


Avoiding Matrix Inversion Issues with Quadrature Nodes

It is worth noting that the construction of Lagrange polynomials is not always numerically stable. For example, when using nodes from a structured grid, the expansion construction most often fails:


In [6]:
nodes, _ = chaospy.generate_quadrature(1, distribution)

try:
    chaospy.lagrange_polynomial(nodes)
except numpy.linalg.LinAlgError as err:
    error = err.args[0]
error


Out[6]:
'Lagrange abscissas resulted in invertible matrix'

It is impossible to avoid the issue entirely, but in the case of structured grids it is often possible to work around the problem. To do so, the following steps can be taken:

  • Create multiple Lagrange polynomial expansions, one for each marginal distribution.
  • Rotate dimensions so each expansion gets the dimension corresponding to the marginal it was created for.
  • Use an outer product of the expansions to combine them into a single expansion.
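As a plain-numpy sanity check of the idea (with hypothetical marginal node sets, not the quadrature nodes above), the tensor-product basis inherits the Kronecker-delta property from its marginals:

```python
import numpy as np

# Hypothetical marginal node sets for two dimensions
nodes_x = np.array([1.0, 2.0])
nodes_y = np.array([0.1, 0.2])

def basis(nodes, j, x):
    """1D Lagrange basis polynomial for node j."""
    return np.prod([(x - xk) / (nodes[j] - xk)
                    for k, xk in enumerate(nodes) if k != j], axis=0)

# Evaluate every tensor-product basis function on every grid point
grid = [(x, y) for x in nodes_x for y in nodes_y]
values = np.array([
    [basis(nodes_x, i, x) * basis(nodes_y, j, y) for (x, y) in grid]
    for i in range(len(nodes_x)) for j in range(len(nodes_y))])
# `values` equals the 4x4 identity: each product polynomial is 1 on
# exactly one grid point and 0 on the rest.
```

The chaospy cells below carry out the same construction symbolically, so the combined expansion can be evaluated anywhere, not just on the grid.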

We begin by making the marginal Lagrange polynomials:


In [7]:
nodes, _ = list(zip(*[chaospy.generate_quadrature(1, marginal)
                      for marginal in distribution]))
expansions = [chaospy.lagrange_polynomial(nodes_)
              for nodes_ in nodes]

[expansion_.round(4) for expansion_ in expansions]


Out[7]:
[polynomial([-0.3333*q0+1.0, 0.3333*q0]),
 polynomial([-10.0*q0+2.0, 10.0*q0-1.0])]

Then we rotate the dimensions:


In [8]:
vars_ = chaospy.variable(len(distribution))
expansions = [
    expans(q0=var)
    for expans, var in zip(expansions, vars_)]

[expans.round(4) for expans in expansions]


Out[8]:
[polynomial([-0.3333*q0+1.0, 0.3333*q0]),
 polynomial([-10.0*q1+2.0, 10.0*q1-1.0])]

And finally construct the final expansion using an outer product:


In [9]:
expansion = chaospy.outer(*expansions).flatten()
expansion.round(2)


Out[9]:
polynomial([3.33*q0*q1-10.0*q1-0.67*q0+2.0,
            -3.33*q0*q1+10.0*q1+0.33*q0-1.0, -3.33*q0*q1+0.67*q0,
            3.33*q0*q1-0.33*q0])

The end result can be verified to have the properties we are looking for:


In [10]:
nodes, _ = chaospy.generate_quadrature(1, distribution)
expansion(*nodes).round(8)


Out[10]:
array([[ 1.,  0., -0.,  0.],
       [ 0.,  1.,  0., -0.],
       [ 0.,  0.,  1.,  0.],
       [ 0.,  0.,  0.,  1.]])