This hands-on lecture corresponds to material from Lecture 02: Linear Algebra & Optimization.
Advice: Interpreting Matrix Operations (10-15 mins)
Hands-on Exercises (60 mins)
Recall the following definition of matrix-vector multiplication: if $x \in \mathbb{R}^n$ and $A \in \mathbb{R}^{m \times n}$, then the $i$th entry of $b = Ax \in \mathbb{R}^m$ is
$$ b_i = \sum_{j=1}^n a_{ij} x_j \quad (i = 1,\dots,m) $$
In other words, each entry $b_i$ is the dot product between $x$ and the $i$th row of $A$. If you think of $A$ as a data matrix whose rows are data points, it is useful to keep in mind the following interpretation: $b = Ax$ collects one dot product, i.e. one score, per data point.
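To see the row and column views side by side, here is a small NumPy sketch (the matrices are illustrative, not taken from the exercises):

```python
import numpy as np

# A small "data matrix" (m = 3 rows/data points, n = 2 columns/features).
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
x = np.array([10.0, -1.0])

b = A @ x  # built-in matrix-vector product

# Row view: b_i is the dot product of the i-th row of A with x.
b_rows = np.array([A[i] @ x for i in range(A.shape[0])])

# Column view: b is a linear combination of the columns of A,
# with coefficients x_j.
b_cols = sum(x[j] * A[:, j] for j in range(A.shape[1]))

assert np.allclose(b, b_rows) and np.allclose(b, b_cols)
```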
Recall that the range or image of $A \in \mathbb{R}^{m \times n}$ is the set of vectors $y \in \mathbb{R}^m$ that can be written as $y=Ax$ for some $x \in \mathbb{R}^n$,
$$ \operatorname{im} A = \{ y \in \mathbb{R}^m \mid \exists x \in \mathbb{R}^n,\; y = Ax \} $$
Hint: Use our "column view" of matrix-vector multiplication!
Because any $Ax$ is a linear combination of the columns of $A$, any vector $y \in \operatorname{im} A$ can be written as a linear combination of the columns $a_j$ of $A$,
$$ y = \sum_{j=1}^n x_j a_j $$
Conversely, given any such linear combination, forming a vector $x \in \mathbb{R}^n$ out of the coefficients $x_j$ gives $y = Ax$, and thus $y \in \operatorname{im} A$. Hence $\operatorname{im} A$ is exactly the span of the columns of $A$.
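As a numerical aside (not part of the original solution), membership in $\operatorname{im} A$ can be certified with a least-squares solve: if $y = Ax$ for some $x$, then $\min_c \|Ac - y\|$ has a zero-residual solution.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))
x = rng.standard_normal(3)
y = A @ x  # by construction, y is in im A

# Solve min_c ||Ac - y||; reproducing y exactly certifies that y
# lies in the span of the columns of A.
c, *_ = np.linalg.lstsq(A, y, rcond=None)
assert np.allclose(A @ c, y)
```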
Since $B=AR$, the columns of $B$ are linear combinations of the columns of $A$, with coefficients taken from the columns of $R$. Because of the triangular structure of $R$ (its $j$th column has ones in its first $j$ entries and zeros below), the $j$th column of $B$ is the sum of the first $j$ columns of $A$:
$$ b_j = A r_j = \sum_{k=1}^j a_k $$
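A quick NumPy check of this cumulative-sum structure (a sketch; it takes $R$ to be the upper-triangular matrix of all ones, matching the conclusion above):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 4, 5
A = rng.standard_normal((m, n))
R = np.triu(np.ones((n, n)))  # R[k, j] = 1 for k <= j, else 0

B = A @ R
# Column j of B is the sum of columns 0..j of A, i.e. a cumulative
# sum of A across its columns.
assert np.allclose(B, np.cumsum(A, axis=1))
```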
Recall that the transpose of $A \in \mathbb{R}^{m \times n}$ is $A^T \in \mathbb{R}^{n \times m}$ with indices "flipped",
$$ (A^T)_{ij} = A_{ji} $$
False. Let $A \in \mathbb{R}^{n \times n}$ be any square matrix and $B = I$ be the identity. Unless $A$ is symmetric, $AB = A \neq A^T = A^TB^T$. There are plenty of non-symmetric matrices! For example,
$$ A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} $$
True! Assume $A \in \mathbb{R}^{m \times p}, B \in \mathbb{R}^{p \times n}$. After verifying that the matrix dimensions match up, we can verify the identity the brute-force way,
$$ \begin{aligned} [(AB)^T]_{ij} &= [AB]_{ji} = \sum_{k=1}^p a_{jk} b_{ki} \\ [B^TA^T]_{ij} &= \sum_{k=1}^p [B^T]_{ik} [A^T]_{kj} = \sum_{k=1}^p a_{jk} b_{ki} \end{aligned} $$
Try to interpret this result using what we've learned about matrix-vector products!
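A numerical spot-check of both claims (a sketch, with arbitrary shapes):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 2))

# (AB)^T = B^T A^T holds for any conformable A and B.
assert np.allclose((A @ B).T, B.T @ A.T)

# Note A^T B^T is not even defined here (shapes (4, 3) and (2, 4)),
# which already shows (AB)^T = A^T B^T cannot hold in general.
```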
Verify the first one elementwise:
Use Problem 3.2 to solve the other two:
Solution 4: Recall that, for any invertible matrix $M$, the inverse $M^{-1}$ is the unique matrix such that $MM^{-1} = M^{-1}M = I$, the identity matrix.
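Assuming the "other two" identities are the standard ones, $(AB)^{-1} = B^{-1}A^{-1}$ and $(A^T)^{-1} = (A^{-1})^T$ (the problem statement is not reproduced here, so this is an assumption), a quick NumPy sanity check:

```python
import numpy as np

rng = np.random.default_rng(0)
# Random square matrices are invertible with probability 1.
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

assert np.allclose(np.linalg.inv(A @ B),
                   np.linalg.inv(B) @ np.linalg.inv(A))
assert np.allclose(np.linalg.inv(A.T), np.linalg.inv(A).T)
```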
...
Recall that $\| \cdot \| : \mathbb{R}^n \rightarrow \mathbb{R}$ is a norm if and only if for all $x,y \in \mathbb{R}^n$ and $\alpha \in \mathbb{R}$,