This lecture considered rotation matrices, which led naturally to the topic of orthogonal matrices.
A matrix $\boldsymbol{A}$ is orthogonal if $\boldsymbol{A}^{-1} = \boldsymbol{A}^{T}$. Equivalently, an $n \times n$ matrix is orthogonal if its columns form an orthonormal basis.
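As a small illustrative sketch (not part of the lecture code below), a 2D rotation matrix satisfies this definition: its transpose equals its inverse.

```python
import numpy as np

# A 2D rotation matrix by angle theta (the value is arbitrary)
theta = np.pi / 3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Check the defining property of an orthogonal matrix: R^T = R^{-1}
print(np.allclose(R.T, np.linalg.inv(R)))
```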
In [1]:
# Import NumPy and seed random number generator to make generated matrices deterministic
import numpy as np
np.random.seed(1)
# Create a matrix with random entries
A = np.random.rand(4, 4)
# Use QR factorisation of A to create an orthogonal matrix Q (QR is covered in IB)
Q, R = np.linalg.qr(A, mode='complete')
We can now verify that Q is an orthogonal matrix. We first check that $\boldsymbol{Q}^{-1} = \boldsymbol{Q}^{T}$ by computing $\boldsymbol{Q}\boldsymbol{Q}^{T}$
In [2]:
print(Q.dot(Q.T))
We can see that $\boldsymbol{Q}\boldsymbol{Q}^{T} = \boldsymbol{I}$ (within numerical precision). We can also check that the columns of $\boldsymbol{Q}$ are orthonormal.
In [3]:
import itertools
# Build pairs (0, 0), (0, 1), ... (0, n-1), (1, 1), (1, 2), ...
pairs = itertools.combinations_with_replacement(range(len(Q)), 2)
# Compute dot product of column vectors q_{i} \cdot q_{j}
for p in pairs:
    col0, col1 = p[0], p[1]
    print("Dot product of column vectors {}, {}: {}".format(col0, col1, Q[:, col0].dot(Q[:, col1])))
The columns of $\boldsymbol{Q}$ are orthonormal. Since $\boldsymbol{Q}^{T}$ is also an orthogonal matrix, its columns (which are the rows of $\boldsymbol{Q}$) are orthonormal too.
In [4]:
# Compute dot product of row vectors q_{i} \cdot q_{j}
pairs = itertools.combinations_with_replacement(range(len(Q)), 2)
for p in pairs:
    row0, row1 = p[0], p[1]
    print("Dot product of row vectors {}, {}: {}".format(row0, row1, Q[row0, :].dot(Q[row1, :])))
Finally, we check the determinant of $\boldsymbol{Q}$, which for an orthogonal matrix must be $\pm 1$ (since $\det(\boldsymbol{Q}\boldsymbol{Q}^{T}) = \det(\boldsymbol{Q})^{2} = 1$):
In [5]:
print("Determinant of Q: {}".format(np.linalg.det(Q)))
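As a quick follow-up sketch (not part of the lecture code), we can confirm that the determinant of an orthogonal matrix produced by QR factorisation always has magnitude one, regardless of the random input matrix:

```python
import numpy as np
np.random.seed(1)

# Q from the QR factorisation of any full-rank matrix is orthogonal,
# so its determinant must be +1 or -1
A = np.random.rand(4, 4)
Q, R = np.linalg.qr(A, mode='complete')
print(abs(np.linalg.det(Q)))  # magnitude is 1 (within numerical precision)
```

Note that the sign may be either $+1$ or $-1$: only orthogonal matrices with determinant $+1$ are rotations; those with determinant $-1$ include a reflection.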