So what really happens when we multiply a matrix $A$ with a vector $x$?
Let's say we have a vector $x$:
$$ x = \begin{bmatrix} -1 \\ 1 \end{bmatrix} $$
What happens when we multiply it by a matrix $A$?
$$ A = \begin{bmatrix} 6 & 2 \\ 2 & 6 \end{bmatrix} $$
$$ Ax = \begin{bmatrix} 6 & 2 \\ 2 & 6 \end{bmatrix} \begin{bmatrix} -1 \\ 1 \end{bmatrix} = \begin{bmatrix} -4 \\ 4 \end{bmatrix} $$
$$ Ax = 4Ix $$
$$ Ax = 4x $$
So this particular matrix has simply scaled our original vector by a factor of 4: for this $x$, multiplication by $A$ is a pure scaling. Other matrices can produce reflections, rotations and arbitrary linear transformations of the same 2-d space (here $n = 2$).
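For contrast, here is a minimal sketch (the two matrices below are illustrative examples, not part of this notebook's derivation) of a reflection and a 90° rotation applied to the same vector; neither is a pure scaling:
In [ ]:
import numpy as np

x = np.array([[-1],
              [1]])

# Reflection across the y-axis: flips the sign of the first coordinate
reflect = np.array([[-1, 0],
                    [ 0, 1]])

# Counter-clockwise rotation by 90 degrees
rotate = np.array([[0, -1],
                   [1,  0]])

print(reflect.dot(x))   # equals [[1], [1]]  -- reflected, not a multiple of x
print(rotate.dot(x))    # equals [[-1], [-1]] -- rotated, same length as x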
Let's see what happened in the scaling example through code.
In [3]:
import numpy as np
In [4]:
import matplotlib.pyplot as plt
%matplotlib inline
plt.style.use('fivethirtyeight')
plt.rcParams['figure.figsize'] = (10, 6)
In [5]:
def vector_plot(vectors):
    # Each entry of `vectors` is [x_start, y_start, x_component, y_component]
    X, Y, U, V = zip(*vectors)
    C = [1, 1, 2, 2]  # colours: one pair for the matrix columns, one for x and Ax
    plt.figure()
    ax = plt.gca()
    ax.quiver(X, Y, U, V, C, angles='xy', scale_units='xy', scale=1)
    ax.set_xlim([-6, 6])
    ax.set_ylim([-6, 6])
    plt.axhline(0, color='grey', linewidth=1)
    plt.axvline(0, color='grey', linewidth=1)
    ax.set_aspect('equal')
    plt.draw()
In [6]:
A = np.array([[6, 2],
              [2, 6]])
In [7]:
x = np.array([[-1],
              [1]])
In [8]:
v = A.dot(x)
In [9]:
# All the vectors start at (0, 0)
vAX = np.r_[[0, 0], A[:, 0]]   # first column of A
vAY = np.r_[[0, 0], A[:, 1]]   # second column of A
vx = np.r_[[0, 0], x[:, 0]]    # original vector x
vv = np.r_[[0, 0], v[:, 0]]    # transformed vector Ax
In [10]:
vector_plot([vAX, vAY, vx, vv])
In the plot above, $Ax$ points along the same line as $x$; it has only been stretched. So far we have been solving the equation $Ax = b$. Let us look at the special case when $b = 0$:
$$ Ax = 0 $$
If $A^{-1}$ exists (the matrix is non-singular, i.e. invertible), then the solution is trivial:
$$ A^{-1}Ax = 0 $$
$$ x = 0 $$
If $A^{-1}$ does not exist, then there are infinitely many other solutions $x$. Since $A$ is then a singular matrix,
$$ \det(A) = 0 $$
The second part of linear algebra is solving, for a given $A$, the equation
$$ Ax = \lambda x $$
Note that both $x$ and $\lambda$ are unknown in this equation. The solutions are:
$$ \text{eigenvalues} = \lambda $$
$$ \text{eigenvectors} = x $$
Let us use a sample matrix $A$:
$$ A = \begin{bmatrix}3 & 1\\ 1 & 3\end{bmatrix} $$
So our equation becomes:
$$ \begin{bmatrix}3 & 1\\ 1 & 3\end{bmatrix} \begin{bmatrix}x \\ y\end{bmatrix} = \begin{bmatrix}\lambda & 0\\ 0 & \lambda \end{bmatrix} \begin{bmatrix}x \\ y\end{bmatrix} $$
$$ \begin{bmatrix}3 - \lambda & 1\\ 1 & 3 - \lambda \end{bmatrix} \begin{bmatrix}x \\ y\end{bmatrix} = 0 $$
A non-zero solution exists only if the matrix on the left is singular, i.e. its determinant is zero:
$$ \begin{vmatrix}3 - \lambda & 1\\ 1 & 3 - \lambda \end{vmatrix} = 0 $$
$$ (3 - \lambda)^2 - 1 = 0 $$
$$ \lambda^2 - 6\lambda + 8 = 0 $$
$$ (\lambda - 4)(\lambda - 2) = 0 $$
$$ \lambda_1 = 2, \quad \lambda_2 = 4 $$
$$ \det(A) = \lambda_{1} \lambda_{2} = 8 $$
For $\lambda = 2$,
$$ \begin{bmatrix}3 - \lambda & 1\\ 1 & 3 - \lambda \end{bmatrix} \begin{bmatrix}x \\ y\end{bmatrix} = \begin{bmatrix}1 & 1\\ 1 & 1 \end{bmatrix} \begin{bmatrix}x \\ y\end{bmatrix} = 0 $$
So one simple solution is:
$$ \begin{bmatrix}x \\ y\end{bmatrix} = \begin{bmatrix}-1 \\ 1\end{bmatrix} $$
For $\lambda = 4$,
$$ \begin{bmatrix}3 - \lambda & 1\\ 1 & 3 - \lambda \end{bmatrix} \begin{bmatrix}x \\ y\end{bmatrix} = \begin{bmatrix}-1 & 1\\ 1 & -1 \end{bmatrix} \begin{bmatrix}x \\ y\end{bmatrix} = 0 $$
So one simple solution is:
$$ \begin{bmatrix}x \\ y\end{bmatrix} = \begin{bmatrix}1 \\ 1\end{bmatrix} $$
The two eigenvectors are orthogonal to each other in this case.
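Before handing the same problem to NumPy, here is a small sanity check of the hand computation (a sketch; the variable names below are just for illustration):
In [ ]:
import numpy as np

A = np.array([[3, 1],
              [1, 3]])

# det(A) equals the product of the eigenvalues: 2 * 4 = 8
print(np.linalg.det(A))

# A v = lambda v for the hand-derived eigenpairs
v1, lam1 = np.array([-1, 1]), 2
v2, lam2 = np.array([ 1, 1]), 4
print(np.allclose(A.dot(v1), lam1 * v1))   # True
print(np.allclose(A.dot(v2), lam2 * v2))   # True

# The two eigenvectors are orthogonal
print(v1.dot(v2))                          # 0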
In [11]:
A = np.array([[3, 1],
              [1, 3]])
In [12]:
eigen_val, eigen_vec = np.linalg.eig(A)
In [13]:
eigen_val
Out[13]:
In [14]:
eigen_vec
Out[14]:
In [15]:
eigen_vec[:,0]
Out[15]:
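np.linalg.eig returns the eigenvalues in eigen_val and the corresponding unit-length eigenvectors as the columns of eigen_vec, so column i pairs with eigen_val[i]. A quick check (reusing the variables defined above) that each column satisfies $Ax = \lambda x$:
In [ ]:
for i in range(len(eigen_val)):
    print(np.allclose(A.dot(eigen_vec[:, i]), eigen_val[i] * eigen_vec[:, i]))   # True for each pair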
In [16]:
# All the vectors start at (0, 0)
vX1 = np.r_[[0, 0], A[:, 0]]               # first column of A
vY1 = np.r_[[0, 0], A[:, 1]]               # second column of A
vE1 = np.r_[[0, 0], eigen_vec[:, 0]] * 2   # first eigenvector, scaled up so it is easier to see
vE2 = np.r_[[0, 0], eigen_vec[:, 1]] * 2   # second eigenvector, scaled up so it is easier to see
In [17]:
vector_plot([vX1, vY1, vE1, vE2])
In [18]:
f = np.matrix([[1, 1, 1],
               [3, 8, 1],
               [5, -4, 3]])
In [19]:
np.linalg.eig(f)
Out[19]:
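np.linalg.eig accepts the np.matrix above, but NumPy now discourages np.matrix in favour of plain arrays. An equivalent sketch using np.array with the same matrix:
In [ ]:
import numpy as np

f = np.array([[1, 1, 1],
              [3, 8, 1],
              [5, -4, 3]])

eigen_val, eigen_vec = np.linalg.eig(f)
print(eigen_val)   # the three eigenvalues
print(eigen_vec)   # the corresponding eigenvectors as columns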
Now write a matrix as np.matrix and find its eigenvalues and eigenvectors.
In [ ]: