Problem 1 (7 pts)

In a finite-dimensional space all norms are equivalent. This means that for any two norms $\|\cdot\|_*$ and $\|\cdot\|_{**}$ on $\mathbb{C}^{n\times 1}$ there exist constants $c_1, c_2 > 0$ such that the inequality $$ c_1 \Vert x \Vert_* \leq \Vert x \Vert_{**} \leq c_2 \Vert x \Vert_* $$ holds for every $x\in \mathbb{C}^{n\times 1}$. In general these constants depend on the vector size $n$: $c_1 \equiv c_1(n)$, $c_2 \equiv c_2(n)$. Equivalence of norms means that if some process converges in $\Vert \cdot \Vert_{**}$, it also converges in $\Vert \cdot \Vert_{*}$ and vice versa. Note, however, that in practice convergence may look much better in one norm than in another because of the strong dependence of the constants on $n$.

Consider \begin{equation} \begin{split} c_1(n) \Vert x \Vert_\infty &\leqslant \Vert x \Vert_1 \leqslant c_2(n) \Vert x \Vert_\infty \\ c_1(n) \Vert x \Vert_\infty &\leqslant \Vert x \Vert_2 \leqslant c_2(n)\Vert x \Vert_\infty \\ c_1(n) \Vert x \Vert_2\ &\leqslant \Vert x \Vert_1 \leqslant c_2(n) \Vert x \Vert_2 \end{split} \end{equation}

  • Generate random vectors and plot the optimal constants $c_1$ and $c_2$ from the inequalities above as functions of $n$.
  • Find these optimal constants analytically and plot them together with the constants found numerically.
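
A minimal numerical sketch for the first pair of inequalities (the helper name and sampling scheme are my own choices): estimate the optimal constants in $c_1(n)\Vert x\Vert_\infty \leqslant \Vert x\Vert_1 \leqslant c_2(n)\Vert x\Vert_\infty$ by sampling random complex vectors and tracking the extreme ratios $\Vert x\Vert_1/\Vert x\Vert_\infty$; the other two pairs are handled the same way.

```python
import numpy as np

def empirical_constants(n, n_samples=1000, seed=0):
    """Estimate optimal c1(n), c2(n) in c1*||x||_inf <= ||x||_1 <= c2*||x||_inf
    as the min/max of the ratio ||x||_1 / ||x||_inf over random samples."""
    rng = np.random.default_rng(seed)
    ratios = []
    for _ in range(n_samples):
        x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
        ratios.append(np.linalg.norm(x, 1) / np.linalg.norm(x, np.inf))
    return min(ratios), max(ratios)

c1, c2 = empirical_constants(10)
# The exact optimal constants for this pair are c1(n) = 1 and c2(n) = n,
# so the empirical estimates must fall in [1, n].
```

Plotting these estimates against $n$ alongside the analytical curves then covers both bullets.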

In [ ]:

Problem 2 (7 pts)

Given $A = [a_{ij}] \in\mathbb{C}^{n\times m}$

  • prove that the operator matrix norms $\Vert \cdot \Vert_{1}$, $\Vert \cdot \Vert_{\infty}$ satisfy $$ \Vert A \Vert_{1} = \max_{1\leqslant j \leqslant m} \sum_{i=1}^n |a_{ij}|, \quad \Vert A \Vert_{\infty} = \max_{1\leqslant i \leqslant n} \sum_{j=1}^m |a_{ij}|. $$ Hint: show that $$ \Vert Ax\Vert_{1} \leqslant \left(\max_{1\leqslant j \leqslant m} \sum_{i=1}^n |a_{ij}|\right) \Vert x\Vert_1 $$ and find an $x$ for which this inequality becomes an equality (the hint for $ \Vert A \Vert_\infty$ is almost the same).
  • check that for randomly generated $x$ and the analytical expressions above for $\Vert \cdot \Vert_{1}$, $\Vert \cdot \Vert_{\infty}$ the inequality $\|A\| \geqslant \|Ax\|/\|x\|$ always holds (choose the matrix $A$ randomly)
  • prove that $ \Vert A \Vert_F = \sqrt{\text{trace}(A^{*} A)}$
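
A quick numerical check of the second and third bullets (the seed and matrix sizes are arbitrary choices): compare the analytical max-column-sum and max-row-sum expressions against the ratio $\|Ax\|/\|x\|$ for random $x$, and verify the trace formula for the Frobenius norm.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 4
A = rng.standard_normal((n, m)) + 1j * rng.standard_normal((n, m))

# Analytical expressions: max column sum for ||A||_1, max row sum for ||A||_inf.
norm1 = np.abs(A).sum(axis=0).max()
norminf = np.abs(A).sum(axis=1).max()

# By the definition of an operator norm, ||A|| >= ||Ax|| / ||x|| for every x != 0.
for _ in range(100):
    x = rng.standard_normal(m) + 1j * rng.standard_normal(m)
    assert norm1 >= np.linalg.norm(A @ x, 1) / np.linalg.norm(x, 1) - 1e-12
    assert norminf >= np.linalg.norm(A @ x, np.inf) / np.linalg.norm(x, np.inf) - 1e-12

# Frobenius norm via the trace formula ||A||_F = sqrt(trace(A* A)).
fro = np.sqrt(np.trace(A.conj().T @ A).real)
```

The built-in `np.linalg.norm(A, 1)`, `np.linalg.norm(A, np.inf)`, and `np.linalg.norm(A, 'fro')` provide an independent sanity check of all three quantities.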

In [ ]:

Problem 3 (6 pts)

  • Prove the Cauchy-Schwarz (Cauchy-Bunyakovsky) inequality $|(x, y)| \leqslant \Vert x \Vert \Vert y \Vert $, where $(\cdot, \cdot)$ is an inner product that induces the norm $ \Vert x \Vert = (x,x)^{1/2}$.
  • Show that the vector norm $\|\cdot \|_2$ is unitarily invariant: $\|Ux\|_2\equiv \|x\|_2$, where $U$ is unitary
  • Prove that the matrix norms $\|\cdot \|_2$ and $\|\cdot \|_F$ are unitarily invariant: $\|UAV\|_2 = \|A\|_2$ and $\|UAV\|_F = \|A\|_F$, where $U$ and $V$ are unitary
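
The invariance claims in the last two bullets are easy to sanity-check numerically before proving them (the `random_unitary` helper is my own construction): the QR factorization of a complex Gaussian matrix yields a unitary factor $Q$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6

def random_unitary(n, rng):
    # QR of a complex Gaussian matrix gives a unitary Q.
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
    return Q

U = random_unitary(n, rng)
V = random_unitary(n, rng)
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# Vector 2-norm is preserved by U; matrix 2- and Frobenius norms by U and V.
assert np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x))
assert np.isclose(np.linalg.norm(U @ A @ V, 2), np.linalg.norm(A, 2))
assert np.isclose(np.linalg.norm(U @ A @ V, 'fro'), np.linalg.norm(A, 'fro'))
```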

In [ ]:

Problem 4 (5 pts)

  • Download the Lenna image and import it into Python as a 2D real array
  • Compute its SVD and plot the singular values (use a logarithmic scale)
  • Plot compressed images for several accuracies (use $\verb|plt.subplots|$) and specify their compression rates
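
A sketch of the compression step, with a synthetic smooth image standing in for Lenna since the download is external (the actual task would load the file, e.g. via `plt.imread`, into a 2D real array):

```python
import numpy as np

# Synthetic stand-in for the Lenna image: a smooth 256 x 256 real array.
t = np.linspace(0, 1, 256)
img = np.sin(10 * np.outer(t, t))

U, s, Vt = np.linalg.svd(img, full_matrices=False)  # s: singular values, descending

def truncate(rank):
    """Best rank-`rank` approximation in the 2- and Frobenius norms (Eckart-Young)."""
    return (U[:, :rank] * s[:rank]) @ Vt[:rank]

rank = 20
approx = truncate(rank)
rel_err = np.linalg.norm(img - approx, 'fro') / np.linalg.norm(img, 'fro')
# Compression rate: numbers stored by the rank-k factors vs the full image.
n, m = img.shape
rate = rank * (n + m + 1) / (n * m)
```

Plotting `s` on a logarithmic scale and showing `truncate(k)` for several `k` in a `plt.subplots` grid, each annotated with its `rate`, completes the problem.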

In [ ]:

Problem 5 (bonus tasks)

  • A norm is called absolute if $\|x\|=\| \lvert x \rvert \|$ for any vector $x$, where $x=(x_1,\dots,x_n)^T$ and $\lvert x \rvert = (\lvert x_1 \rvert,\dots, \lvert x_n \rvert)^T$. Give an example of a norm which is not absolute.
  • Prove that the Frobenius norm is not an operator norm
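
A numerical illustration of the key observation for the second bullet (a hint, not the proof itself): every operator norm satisfies $\|I\| = \max_{x\neq 0}\|Ix\|/\|x\| = 1$, while the Frobenius norm of the $n\times n$ identity is $\sqrt{n} > 1$ for $n \geqslant 2$.

```python
import numpy as np

# ||I||_F = sqrt(trace(I* I)) = sqrt(n), but any operator norm gives ||I|| = 1,
# so the Frobenius norm cannot be induced by any vector norm.
n = 4
fro_identity = np.linalg.norm(np.eye(n), 'fro')  # sqrt(4) = 2.0
```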

In [ ]: