Math216 Introduction to Differential Equations

Deniz Bilman, Department of Mathematics, University of Michigan

Lecture 10: Theory of systems of 2 linear differential equations

We begin with an observation. Given a system of 2 linear autonomous differential equations

$$ \mathbf{x}' = \mathbf{A}\mathbf{x} + \mathbf{b}, $$

with a nonsingular coefficient matrix, i.e. with $\det(\mathbf{A})\neq 0$, we showed earlier that there exists a unique equilibrium solution $\mathbf{x}_{\text{eq}}$, which is of course the unique constant vector that satisfies

$$ \mathbf{A}\mathbf{x}_{\text{eq}} + \mathbf{b} =\mathbf{0}, $$

and since $\mathbf{A}$ is nonsingular, we can express this equilibrium solution as

$$ \mathbf{x}_{\text{eq}} = -\mathbf{A}^{-1}\mathbf{b}. $$

Note again that we do not need to solve a differential equation to find an equilibrium solution. Now define a new unknown vector by subtracting this equilibrium solution from the old unknown:

$$ \mathbf{y}:= \mathbf{x}-\mathbf{x}_{\text{eq}}. $$

Doing so, we shift the solution space so that if expressed in the new $\mathbf{y}$ variable, the system above has the unique equilibrium solution at the origin: $\mathbf{y}_{\text{eq}}=\mathbf{0}$. (Intuitively, you may think of this as "shifting the carpet" under the solution $\mathbf{x}$ so that the constant solution is at the origin instead of at $\mathbf{x}_{\text{eq}}$.) Now, if we express the old variable in terms of the new variable:

$$ \mathbf{x}= \mathbf{y}+\mathbf{x}_{\text{eq}} $$

and substitute this into the ODE at the beginning, we get

\begin{align} \mathbf{y}' + \mathbf{x}_\text{eq}' &= \mathbf{A}\mathbf{y} + \mathbf{A}\mathbf{x}_\text{eq}+ \mathbf{b}\\ \mathbf{y}' &= \mathbf{A}\mathbf{y}, \end{align}

where the second line follows because $\mathbf{x}_\text{eq}$ is constant, so $\mathbf{x}_\text{eq}'=\mathbf{0}$, and because $\mathbf{A}\mathbf{x}_\text{eq}+\mathbf{b}=\mathbf{0}$ by definition of the equilibrium.

Expressed in the new variable $\mathbf{y}$, the system becomes homogeneous. As you can see, an autonomous non-homogeneous system (constant coefficient matrix and constant source vector $\mathbf{b}$) with a nonsingular coefficient matrix can always be reduced to a homogeneous system.

Clearly, if we can solve the resulting homogeneous system, we can shift back and get the solution of the original system: $\mathbf{x}= \mathbf{y}+\mathbf{x}_{\text{eq}}$.
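For concreteness, here is a minimal NumPy sketch of this reduction; the matrix $\mathbf{A}$ and the vector $\mathbf{b}$ below are made-up example data, not anything from the lecture.

```python
import numpy as np

# Hypothetical example data: any nonsingular A and any constant b will do.
A = np.array([[-1.0, 2.0],
              [0.0, 4.0]])
b = np.array([3.0, -8.0])

# Equilibrium solution x_eq = -A^{-1} b; solve the linear system
# A x_eq = -b rather than forming the inverse explicitly.
x_eq = np.linalg.solve(A, -b)

print(x_eq)          # the unique equilibrium
print(A @ x_eq + b)  # sanity check: should be ~ [0, 0]
```

Once $\mathbf{x}_{\text{eq}}$ is in hand, one solves the homogeneous system $\mathbf{y}'=\mathbf{A}\mathbf{y}$ and shifts back.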

Given this fact, we should understand how to solve constant-coefficient homogeneous systems

$$ \mathbf{x}' = \mathbf{A}\mathbf{x}. $$

But before going into that topic, we introduce some notions and terminology.

Superposition Principle:

Given two solutions $\mathbf{x}_1(t)$ and $\mathbf{x}_2(t)$ of the system $\mathbf{x}'(t) = \mathbf{A}\mathbf{x}(t)$, for any choice of constants $c_1$ and $c_2$, the vector

$$ \mathbf{x}(t) = c_1 \mathbf{x}_1(t) + c_2 \mathbf{x}_2(t) $$

is also a solution. Indeed, $\mathbf{x}' = c_1\mathbf{x}_1' + c_2\mathbf{x}_2' = c_1\mathbf{A}\mathbf{x}_1 + c_2\mathbf{A}\mathbf{x}_2 = \mathbf{A}\left(c_1\mathbf{x}_1 + c_2\mathbf{x}_2\right) = \mathbf{A}\mathbf{x}$.

Therefore, if $\mathbf{x}_1(t)$ and $\mathbf{x}_2(t)$ are "sufficiently independent" (to be made precise soon), one can generate all of the solutions to this system, infinitely many of them. The question is then: What if we are also given an initial value $\mathbf{x}(t_0)=\mathbf{x}_0$? Which constants comply with it?

Determining $c_1$ and $c_2$ amounts to solving the linear system obtained by evaluating the general solution above at the initial value. We want:

$$ c_1\mathbf{x}_{1}(t_0) + c_2\mathbf{x}_{2}(t_0) = \mathbf{x}_0, $$

which is

$$ \begin{bmatrix}x_{11}(t_0) & x_{12}(t_0) \\ x_{21}(t_0) & x_{22}(t_0)\end{bmatrix} \begin{bmatrix} c_1 \\ c_2 \end{bmatrix} = \begin{bmatrix} x_{10} \\ x_{20} \end{bmatrix} $$

This system determines $c_1$ and $c_2$ uniquely if and only if the coefficient matrix is nonsingular, i.e.

$$ \left\lvert\begin{matrix}x_{11}(t_0) & x_{12}(t_0) \\ x_{21}(t_0) & x_{22}(t_0) \end{matrix}\right\rvert\neq 0. $$

In that case the initial value problem has a unique solution. This determinant is called the Wronskian determinant of the solutions $\mathbf{x}_1$ and $\mathbf{x}_2$ evaluated at $t_0$. We write

$$ W[\mathbf{x}_1,\mathbf{x}_2](t_0):=\left\lvert\begin{matrix}x_{11}(t_0) & x_{12}(t_0) \\ x_{21}(t_0) & x_{22}(t_0) \end{matrix}\right\rvert. $$
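In code, determining $c_1$ and $c_2$ is just a $2\times 2$ linear solve. Here is a minimal NumPy sketch; the values of $\mathbf{x}_1(t_0)$, $\mathbf{x}_2(t_0)$, and $\mathbf{x}_0$ are hypothetical placeholders.

```python
import numpy as np

# Hypothetical values of two solutions evaluated at t0, and an initial value.
x1_t0 = np.array([1.0, 0.0])   # x_1(t0)
x2_t0 = np.array([0.0, 1.0])   # x_2(t0)
x0 = np.array([2.0, 3.0])      # prescribed initial value x(t0)

# Coefficient matrix whose columns are the solutions at t0.
M = np.column_stack([x1_t0, x2_t0])

# The Wronskian at t0 is the determinant of M.
W = np.linalg.det(M)
if abs(W) > 1e-12:
    c = np.linalg.solve(M, x0)   # c = [c1, c2]
    print("W =", W, " c1, c2 =", c)
else:
    print("Wronskian vanishes at t0: not a fundamental set of solutions.")
```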

Linear Independence:

$\mathbf{x}_1$ and $\mathbf{x}_2$ are said to be linearly dependent if there exists a constant $k$ such that $\mathbf{x}_1=k \mathbf{x}_2$ (i.e. if they are proportional). Otherwise, $\mathbf{x}_1$ and $\mathbf{x}_2$ are said to be linearly independent.

Fact:

$\mathbf{x}_1(t)$ and $\mathbf{x}_2(t)$ are linearly independent if and only if their Wronskian determinant is nonzero: $W[\mathbf{x}_1,\mathbf{x}_2](t)\neq 0$. Two such solutions are called a fundamental set of solutions.

General solution of an autonomous homogeneous system of ODEs:

Given a fundamental set of solutions $\mathbf{x}_1(t)$ and $\mathbf{x}_2(t)$ of $\mathbf{x}' = \mathbf{A}\mathbf{x}$, the general solution is

$$ \mathbf{x}(t) =c_1\mathbf{x}_{1}(t) + c_2\mathbf{x}_{2}(t) $$

A prescribed initial condition uniquely determines the constants $c_1$ and $c_2$ and gives the unique solution of the initial value problem under consideration.

How to solve systems?

Suppose we had

$$ \begin{bmatrix} x_1'(t)\\ x_2'(t)\end{bmatrix} = \begin{bmatrix}-1 & 0 \\ 0 & 4 \end{bmatrix}\begin{bmatrix} x_1(t)\\x_2(t)\end{bmatrix},\quad \mathbf{x}(0)=\begin{bmatrix}2\\3 \end{bmatrix}. $$

Since the coefficient matrix is diagonal, the system is decoupled: $x_1'(t)$ does not depend on $x_2(t)$, and $x_2'(t)$ does not depend on $x_1(t)$. We can solve each scalar equation separately, apply the corresponding initial value, and get

\begin{align} x_1(t) &=2 e^{-t}\\ x_2(t) &=3 e^{4t}. \end{align}

Observe that this can also be written in a form similar to the general solution of a system of ODEs that we introduced above:

$$ \mathbf{x}(t) = c_1 e^{-t}\begin{bmatrix}1\\0\end{bmatrix}+c_2 e^{4t}\begin{bmatrix}0\\1\end{bmatrix} $$

and we determine $c_1=2$, $c_2=3$ using the initial value to get:

$$ \mathbf{x}(t) = 2 e^{-t}\begin{bmatrix}1\\0\end{bmatrix}+3 e^{4t}\begin{bmatrix}0\\1\end{bmatrix}. $$
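As a sanity check, one can integrate this initial value problem numerically and compare against the closed-form solution. A minimal sketch using SciPy's `solve_ivp`:

```python
import numpy as np
from scipy.integrate import solve_ivp

A = np.array([[-1.0, 0.0],
              [0.0, 4.0]])

# Right-hand side of x' = A x.
def rhs(t, x):
    return A @ x

# Integrate from t = 0 to t = 1 with the initial value x(0) = [2, 3].
sol = solve_ivp(rhs, (0.0, 1.0), [2.0, 3.0], rtol=1e-10, atol=1e-12)

# Closed-form solution evaluated at the final time.
t = sol.t[-1]
exact = np.array([2.0 * np.exp(-t), 3.0 * np.exp(4.0 * t)])
print(np.max(np.abs(sol.y[:, -1] - exact)))  # should be tiny
```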

What about systems whose coefficient matrices are not diagonal?

We finished the lecture with the following observation:

Given

$$ \mathbf{x}'(t)=\mathbf{A}\mathbf{x}(t), $$

let $\lambda$ be an eigenvalue of the coefficient matrix $\mathbf{A}$ and let $\mathbf{u}$ be an associated eigenvector. Recall that this means $\mathbf{A}\mathbf{u}=\lambda \mathbf{u}$. Then the vector function

$$ \mathbf{x}(t) = e^{\lambda t}\mathbf{u} $$

is a solution of $\mathbf{x}'(t)=\mathbf{A}\mathbf{x}(t)$. This can be verified by a direct calculation (done in class): $\mathbf{x}'(t) = \lambda e^{\lambda t}\mathbf{u} = e^{\lambda t}(\lambda\mathbf{u}) = e^{\lambda t}\mathbf{A}\mathbf{u} = \mathbf{A}\mathbf{x}(t)$. We will solve such systems in the next class.
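One can also check this observation numerically. Below is a minimal NumPy sketch; the non-diagonal matrix $\mathbf{A}$ here is made-up example data, not one from class.

```python
import numpy as np

# Hypothetical non-diagonal coefficient matrix with real eigenvalues.
A = np.array([[1.0, 2.0],
              [3.0, 2.0]])

# Eigenvalues and eigenvectors: A u = lambda u, one eigenvector per column.
lam, U = np.linalg.eig(A)
lam0, u0 = lam[0], U[:, 0]

t = 0.7                                # any fixed time
x = np.exp(lam0 * t) * u0              # candidate solution x(t) = e^{lambda t} u
xprime = lam0 * np.exp(lam0 * t) * u0  # its derivative

# x'(t) should equal A x(t); the difference should be ~ 0.
print(np.max(np.abs(xprime - A @ x)))
```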

