An eigenvector $\boldsymbol{x}$ and corresponding eigenvalue $\lambda$ of a square matrix $\boldsymbol{A}$ satisfy
$$ \boldsymbol{A} \boldsymbol{x} = \lambda \boldsymbol{x} $$
Rearranging this expression,
$$ \left( \boldsymbol{A} - \lambda \boldsymbol{I}\right) \boldsymbol{x} = \boldsymbol{0} $$
The above equation has solutions other than $\boldsymbol{x} = \boldsymbol{0}$ if and only if
$$ \det \left( \boldsymbol{A} - \lambda \boldsymbol{I}\right) = 0 $$
For an $n \times n$ matrix, expanding this determinant yields a polynomial of degree $n$ in $\lambda$ (the characteristic polynomial), so computing eigenvalues amounts to finding the roots of an $n$th degree polynomial. Closed-form expressions for the roots exist only for polynomials up to and including degree four (e.g., see http://en.wikipedia.org/wiki/Quartic_function). For matrices with $n > 4$, numerical methods must be used to compute eigenvalues and eigenvectors.
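For example, for a $2 \times 2$ symmetric matrix the characteristic polynomial can be found by hand:
$$ \boldsymbol{A} = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}, \qquad \det \left( \boldsymbol{A} - \lambda \boldsymbol{I}\right) = (2 - \lambda)^{2} - 1 = \lambda^{2} - 4\lambda + 3 = (\lambda - 1)(\lambda - 3) $$
so the eigenvalues are $\lambda = 1$ and $\lambda = 3$.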
An $n \times n$ matrix has $n$ eigenvalue/eigenvector pairs (eigenpairs), with eigenvalues counted by multiplicity.
NumPy provides a function to compute eigenvalues and eigenvectors. To demonstrate how to compute eigenpairs, we first create a $5 \times 5$ symmetric matrix:
In [1]:
# Import NumPy and seed random number generator to make generated matrices deterministic
import numpy as np
np.random.seed(1)
# Create a matrix with random entries
A = np.random.rand(5, 5)
# Make the matrix symmetric by adding its transpose
A = A + A.T
print(A)
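As a quick sanity check (a one-line sketch), we can confirm that $\boldsymbol{A}$ is indeed symmetric:
# Verify symmetry: A should equal its transpose
print(np.allclose(A, A.T))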
We can compute the eigenvalues and eigenvectors using the NumPy function linalg.eig:
In [2]:
# Compute eigenvalues and eigenvectors of A
evalues, evectors = np.linalg.eig(A)
print("Eigenvalues: {}".format(evalues))
print("Eigenvectors: {}".format(evectors))
The $i$th column of evectors is the $i$th eigenvector.
Note that the above eigenvalues and eigenvectors are real valued. This is always the case for real symmetric matrices. Another feature of symmetric matrices is that the eigenvectors are orthogonal. Below we first verify this for the above matrix, and then check that the second eigenpair is indeed an eigenpair (Python/NumPy use base 0 indexing, so the second eigenpair has index 1):
In [3]:
import itertools

# Build the pairs (0, 0), (0, 1), ..., (0, n-1), (1, 1), (1, 2), ...
pairs = itertools.combinations_with_replacement(range(len(evectors)), 2)

# Compute the dot products of the eigenvectors x_{i} \cdot x_{j}
for p in pairs:
    e0, e1 = p[0], p[1]
    print("Dot product of eigenvectors {}, {}: {}".format(e0, e1, evectors[:, e0].dot(evectors[:, e1])))
In [4]:
print("Testing Ax and (lambda)x: \n {}, \n {}".format(A.dot(evectors[:,1]), evalues[1]*evectors[:,1]))
Eigenvalue decompositions are not restricted to symmetric matrices. We now create a non-symmetric matrix with random entries and compute its eigenpairs:
In [5]:
# Create a non-symmetric matrix with random entries
B = np.random.rand(5, 5)

# Compute eigenvalues and eigenvectors of B
evalues, evectors = np.linalg.eig(B)
print("Eigenvalues: {}".format(evalues))
print("Eigenvectors: {}".format(evectors))
Note that the eigenvalues and eigenvectors of a non-symmetric real matrix are in general complex. Moreover, unlike for symmetric matrices, the eigenvectors are in general not orthogonal, which we can test:
In [6]:
# Build the pairs of eigenvector indices, as before
pairs = itertools.combinations_with_replacement(range(len(evectors)), 2)

# Compute the dot products of the eigenvectors x_{i} \cdot x_{j}
for p in pairs:
    e0, e1 = p[0], p[1]
    print("Dot product of eigenvectors {}, {}: {}".format(e0, e1, evectors[:, e0].dot(evectors[:, e1])))