In this unit we discuss matrices with special properties – symmetric, possibly complex, and positive definite. The central topic of this unit is converting matrices to nice form (diagonal or nearly-diagonal) through multiplication by other matrices. Generally, this process requires some knowledge of the eigenvectors and eigenvalues of the matrix.
Special matrices have special eigenvalues and eigenvectors. Symmetric and positive definite matrices have extremely nice properties, and studying these matrices brings together everything we've learned about pivots, determinants and eigenvalues. In this session we also practice doing linear algebra with complex numbers and learn how the pivots give information about the eigenvalues of a symmetric matrix.
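To make the pivot–eigenvalue connection concrete, here is a minimal NumPy sketch (the matrix is just an example): for a symmetric matrix, the number of positive pivots from plain elimination matches the number of positive eigenvalues.

```python
import numpy as np

def pivots(A):
    """Pivots from elimination with no row exchanges (assumes none are zero)."""
    U = np.array(A, dtype=float)
    n = U.shape[0]
    for k in range(n - 1):
        U[k + 1:] -= np.outer(U[k + 1:, k] / U[k, k], U[k])
    return np.diag(U)

# A symmetric, indefinite example matrix
A = np.array([[1.0, 3.0, 0.0],
              [3.0, 2.0, 1.0],
              [0.0, 1.0, 1.0]])

p = pivots(A)
e = np.linalg.eigvalsh(A)
print("pivots:", p, "  eigenvalues:", e)
print("positive pivots:", np.sum(p > 0), "  positive eigenvalues:", np.sum(e > 0))
```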
The Fourier matrices have complex valued entries and many nice properties. This session covers the basics of working with complex matrices and vectors, and concludes with a description of the fast Fourier transform.
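As a small illustration, here is a NumPy sketch of the n-by-n Fourier matrix (using the same sign convention as numpy.fft): multiplying a vector by it gives the same answer as the FFT, which computes that product in O(n log n) steps instead of O(n²).

```python
import numpy as np

n = 8
omega = np.exp(-2j * np.pi / n)                  # a primitive n-th root of unity
j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
F = omega ** (j * k)                             # Fourier matrix: F[j, k] = omega^(j*k)

x = np.random.rand(n)
print(np.allclose(F @ x, np.fft.fft(x)))         # the FFT computes F @ x, just faster

# The columns of F are orthogonal: F^H F = n I, so F / sqrt(n) is unitary.
print(np.allclose(F.conj().T @ F, n * np.eye(n)))
```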
In calculus, the second derivative decides whether a critical point of y(x) is a minimum. For functions of multiple variables, the test is whether a matrix of second derivatives is positive definite. In this session we learn several ways of testing for positive definiteness and also how the shape of the graph of f(x) = xᵀAx is determined by the entries of A.
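Here is a short NumPy sketch of several equivalent tests on an example matrix; each one answers "is A positive definite?"

```python
import numpy as np

A = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])             # a standard positive definite example

# Test 1: all eigenvalues are positive
print(np.all(np.linalg.eigvalsh(A) > 0))

# Test 2: all upper-left determinants are positive
print(all(np.linalg.det(A[:k, :k]) > 0 for k in range(1, A.shape[0] + 1)))

# Test 3: Cholesky (A = L L^T) succeeds exactly when A is positive definite
try:
    np.linalg.cholesky(A)
    print(True)
except np.linalg.LinAlgError:
    print(False)

# Spot check of the definition: x^T A x > 0 for a random nonzero x
x = np.random.randn(A.shape[0])
print(x @ A @ x > 0)
```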
After a final discussion of positive definite matrices, we learn about "similar" matrices: B = M⁻¹AM for some invertible matrix M. Square matrices can be grouped by similarity, and each group has a "nicest" representative in Jordan normal form. This form tells at a glance the eigenvalues and the number of eigenvectors.
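A small SymPy sketch (the matrices are made up for illustration): similarity preserves the eigenvalues, and jordan_form recovers the "nicest" representative.

```python
import sympy as sp

A = sp.Matrix([[4, 1], [0, 4]])      # a single Jordan block: eigenvalue 4, one eigenvector
M = sp.Matrix([[1, 2], [1, 3]])      # any invertible M
B = M.inv() * A * M                  # B = M^(-1) A M is similar to A

print(B.eigenvals())                 # {4: 2} -- the eigenvalues are unchanged
P, J = B.jordan_form()
print(J)                             # Matrix([[4, 1], [0, 4]]): the Jordan form of B is A itself
```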
If A is symmetric and positive definite, there is an orthogonal matrix Q for which A = QΛQᵀ. Here Λ is the diagonal matrix of eigenvalues. Singular Value Decomposition lets us write any matrix A as a product UΣVᵀ where U and V are orthogonal and Σ is a diagonal matrix whose non-zero entries are square roots of the eigenvalues of AᵀA. The columns of U and V give bases for the four fundamental subspaces.
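A minimal NumPy sketch of both factorizations on small example matrices:

```python
import numpy as np

# Symmetric positive definite: A = Q Λ Q^T with orthogonal Q
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, Q = np.linalg.eigh(A)
print(np.allclose(A, Q @ np.diag(lam) @ Q.T))

# Any matrix: B = U Σ V^T, singular values = sqrt(eigenvalues of B^T B)
B = np.array([[3.0, 0.0],
              [4.0, 5.0]])
U, s, Vt = np.linalg.svd(B)
print(np.allclose(B, U @ np.diag(s) @ Vt))
print(np.allclose(np.sort(s**2), np.sort(np.linalg.eigvalsh(B.T @ B))))
```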
When we multiply a matrix by an input vector we get an output vector, often in a new space. We can ask what this "linear transformation" does to all the vectors in a space. In fact, matrices were originally invented for the study of linear transformations.
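As a tiny example (assuming NumPy), a rotation is a linear transformation: multiplying by the matrix below rotates every input vector by 90 degrees.

```python
import numpy as np

theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation by 90 degrees

v = np.array([1.0, 0.0])
print(R @ v)            # ~[0, 1]: the x-axis is sent to the y-axis
print(R @ (2 * v))      # linearity: doubling the input doubles the output
```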
Video cameras record data in a format that is poorly suited to broadcasting. To transmit video efficiently, linear algebra is used to change the basis. But which basis is best for video compression is an important question that has not been fully answered!
We'd like to be able to "invert A" to solve Ax = b, but A may have only a left inverse or right inverse (or no inverse). This discussion of how and when matrices have inverses improves our understanding of the four fundamental subspaces and of many other key topics in the course.
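A short NumPy sketch with a made-up tall matrix: full column rank gives a left inverse (AᵀA)⁻¹Aᵀ, but no right inverse.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])                       # 3x2, full column rank

A_left = np.linalg.inv(A.T @ A) @ A.T            # left inverse: (A^T A)^(-1) A^T
print(np.allclose(A_left @ A, np.eye(2)))        # True: A_left @ A = I
print(np.allclose(A @ A_left, np.eye(3)))        # False: no right inverse exists
print(np.allclose(A_left, np.linalg.pinv(A)))    # the pseudoinverse agrees here
```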
Note: In the Fall of 1999, when the lecture videos were recorded, this lecture was given after exam 3. For the OCW Scholar version of the course we have moved it into the main body of material.
Exam 3 covers the understanding of matrices through their eigenvalues. Elimination is history now, because it does NOT preserve eigenvalues. Notice the special types of matrices (like positive definite) and their factorizations.
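A quick NumPy check of that claim (the matrix is just an example): one elimination step changes the eigenvalues, even though it preserves the determinant.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
E = np.array([[ 1.0, 0.0],
              [-0.5, 1.0]])                  # subtract half of row 1 from row 2
U = E @ A

print(np.linalg.eigvals(A))                  # 3 and 1
print(np.linalg.eigvals(U))                  # 2 and 1.5 -- eigenvalues not preserved
print(np.linalg.det(A), np.linalg.det(U))    # both 3.0: the determinant survives
```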
Even the "easy" material from the first third of the class is useful in answering these surprisingly complicated questions! Your understanding of linear algebra has been built up using the four subspaces, the eigenvalues, and all the special types of matrices: symmetric, orthogonal, projections, permutations (and you can add more).
Nine questions in a three-hour closed-book exam would be typical for this course at MIT. We try to cover all the way from Ax = 0 (the null space and the special solutions) to projections, determinants, eigenvalues, and even a touch of singular values from the eigenvalues of AᵀA. That is the good matrix of linear algebra: square, symmetric, and positive definite or at least semidefinite.