As we saw in the previous notebook, the rate of chemical reactions can be described by ordinary differential equations.
In this notebook we will look at a classic example in chemical kinetics: Robertson's example. It is a 3-species reaction system:
$$ A \overset{k_1}{\rightarrow} B \\ B + C \overset{k_2}{\rightarrow} A + C \\ 2 B \overset{k_3}{\rightarrow} B + C $$

where A, B and C represent three different chemical species (e.g. reactive molecules dissolved in water). What is special about this system is that, when solving over long time scales, the numerical stepping needs to be performed with an implicit method (which requires the Jacobian).
The rate of each process follows the law of mass action, i.e. the rate is proportional to the concentration of each reacting species (raised to the power of its multiplicity). The proportionality constant is known as the rate constant of the reaction ($k_1,\ k_2\ \&\ k_3$ in our case). If we denote the rate of each reaction:
$$ r_1 = k_1[A] \\ r_2 = k_2[B][C] \\ r_3 = k_3[B]^2 $$

where $[A]$, $[B]$ and $[C]$ denote the concentrations of the respective species, we can formulate a system of ordinary differential equations describing how the concentrations evolve over time:
$$ \frac{d[A]}{dt} = r_2 - r_1 \\ \frac{d[B]}{dt} = r_1 - r_2 - r_3 \\ \frac{d[C]}{dt} = r_3 $$

We will now express these differential equations (and their Jacobian) symbolically using SymPy:
In [ ]:
import sympy as sym
sym.init_printing()
In [ ]:
t = sym.symbols('t')  # time; not used in the rate expressions, only a placeholder for the odeint signature
c = cA, cB, cC = sym.symbols('[A] [B] [C]') # concentrations
k = k1, k2, k3 = sym.symbols('k_1 k_2 k_3')
r1, r2, r3 = k1*cA, k2*cB*cC, k3*cB**2
ydot = r2 - r1, r1 - r2 - r3, r3
ydot
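Note that the right-hand sides sum to zero: the total concentration $[A] + [B] + [C]$ is conserved. As a quick sanity check (a small addition here, not required for what follows) we can verify this symbolically:
In [ ]:
sym.simplify(sum(ydot))  # sums to 0: total concentration is conserved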
$\dot{\mathbf{y}}$ now represents our ODE system, where $\mathbf{y}$ is our state vector (the concentrations). We will need a callback evaluating $\dot{\mathbf{y}}$ when we integrate this ODE system numerically (using scipy.integrate.odeint). As we have seen, SymPy can provide us with this callback:
In [ ]:
import numpy as np
from scipy.integrate import odeint
In [ ]:
f = sym.lambdify((c, t) + k, ydot)
def integrate(tend=3600, t0=1e-6, **kwargs):
    tout = np.logspace(np.log10(t0), np.log10(tend))  # logarithmically spaced output times
    k_vals = (0.04, 1e4, 3e7)  # from the literature; k1 has different units than k2 & k3
    c0 = [1, 0, 0]  # we start with unit concentration of A (number density)
    yout, info = odeint(f, c0, tout, args=k_vals, full_output=True, **kwargs)
    return tout, yout, info
The function created by lambdify has the signature expected by odeint. Passing full_output=True makes odeint return a dictionary with information about the numerical integration.
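As a quick check that the signature is right, we can evaluate f directly at the initial state (using the same literature rate constants as in integrate; at $t = 0$ only the first reaction proceeds):
In [ ]:
f([1, 0, 0], 0, 0.04, 1e4, 3e7)  # expect (-0.04, 0.04, 0.0): only A -> B contributes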
In [ ]:
tout, yout, info = integrate(1e6, atol=1e-9, rtol=1e-9)
In [ ]:
import matplotlib.pyplot as plt
%matplotlib inline
In [ ]:
def plot_result(tout, yout, info=None):
    fig, axes = plt.subplots(1, 2, figsize=(14, 4))
    for i, label in enumerate('ABC'):
        for ax in axes:
            ax.plot(tout, yout[:, i], label=label)
    axes[1].set_xscale('log')
    axes[1].set_yscale('log')
    for ax in axes:
        # raw strings avoid invalid escape sequences in the LaTeX labels
        ax.set_ylabel(r'$\mathrm{concentration / mol \cdot dm^{-3}}$')
        ax.set_xlabel(r'$\mathrm{time / s}$')
        ax.legend(loc='best')
    if info:
        print("The Jacobian was evaluated %d times" % info['nje'][-1])
In [ ]:
plot_result(tout, yout, info)
If we look closer at the info dictionary we will see that odeint (or rather LSODA, the underlying package) switched from an explicit Adams method to an implicit Backward Differentiation Formula (BDF). It is common for chemical kinetics problems to become stiff, and the larger the number of species, the bigger the Jacobian matrix.
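We can confirm the switch from the info dictionary: odeint records which method was used at each output point in info['mused'] (1 for Adams, 2 for BDF). A minimal check, assuming the integration above has been run:
In [ ]:
np.unique(info['mused'])  # both 1 (Adams) and 2 (BDF) appear if LSODA switched method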
By default the solver approximates the elements of the Jacobian matrix by taking finite differences of $\mathbf{f}$. This often works quite satisfactorily, but for larger systems it sometimes fails. A more robust (and faster) approach is to provide a callback which evaluates an analytic Jacobian. Using SymPy we can do this quite effortlessly:
In [ ]:
J = sym.Matrix(ydot).jacobian(c)
J
In [ ]:
J_cb = sym.lambdify((c, t) + k, J)
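Before handing J_cb to the solver we can sanity-check it against a crude forward-difference approximation; this is just a sketch, with an arbitrarily chosen test state and step size eps:
In [ ]:
y_test, k_vals, eps = np.array([1.0, 0.01, 0.01]), (0.04, 1e4, 3e7), 1e-8
J_num = np.empty((3, 3))
for j in range(3):  # build the numerical Jacobian column by column
    y_pert = y_test.copy()
    y_pert[j] += eps
    J_num[:, j] = (np.asarray(f(y_pert, 0, *k_vals)) - np.asarray(f(y_test, 0, *k_vals))) / eps
np.allclose(J_cb(y_test, 0, *k_vals), J_num, rtol=1e-5, atol=1e-6)  # expect True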
In [ ]:
tout, yout, info = integrate(1e6, Dfun=J_cb)
In [ ]:
plot_result(tout, yout, info)
We see that the solver needed to evaluate the Jacobian fewer times (since the analytic Jacobian is essentially exact). For larger systems the impact of an analytic Jacobian is often even greater; it can be the difference between a failed and a successful integration.
Benchmarking with and without the analytic Jacobian callback:
In [ ]:
%timeit integrate(1e6)
In [ ]:
%timeit integrate(1e6, Dfun=J_cb)