Singulärwertzerlegung (Singular Value Decomposition, SVD)

Given: a matrix $A \in M(m \times n)$

Sought: the singular value decomposition $$A = U \cdot \Sigma \cdot V^T$$ where

  • $U \in M(m \times m)$
  • $\Sigma \in M(m \times n)$
  • $V \in M(n \times n)$

In [1]:
# input matrix
A = matrix([
[2,2],
[-2,-2]
])
#

In [2]:
m, n = A.dimensions()
print("m x n = {} x {}".format(m, n))
show("A =", A)


m x n = 2 x 2

Compute the eigenvalues of $A^T \cdot A$


In [3]:
ata = A.T*A
show(ata)



In [4]:
ata.eigenvalues()


Out[4]:
[16, 0]

Singular values: $\sigma_i = \sqrt{e_i}$ for all eigenvalues $e_i \neq 0$

This yields $r$ singular values, where $r$ is the rank of $A$.


In [5]:
# singular values, sorted in descending order by convention
SV = sorted((sqrt(abs(e)) for e in ata.eigenvalues() if e != 0), reverse=True)
r = len(SV)
show(SV)
print("r =", r)


r = 1
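For comparison, the same step can be sketched in plain NumPy (a minimal sketch, not part of the Sage workflow above; `eigvalsh` returns the eigenvalues of the symmetric matrix $A^T A$ in ascending order, and a small tolerance stands in for the exact test `e != 0`):

```python
import numpy as np

A = np.array([[2.0, 2.0], [-2.0, -2.0]])
eigvals = np.linalg.eigvalsh(A.T @ A)  # eigenvalues of A^T A, ascending
# singular values: square roots of the nonzero eigenvalues, descending
SV = sorted((float(np.sqrt(e)) for e in eigvals if e > 1e-12), reverse=True)
r = len(SV)
print(SV, r)
```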

Orthonormalize the eigenvectors of $A^T A$ to obtain the matrix $V$.


In [6]:
V = []
EW = []
# eigenvectors_left() returns triples (eigenvalue, [eigenvectors], multiplicity)
for x in ata.eigenvectors_left():
    for y in x[1]:
        EW.append(x[0])
        V.append(vector(y))

In [7]:
# input your own EW/EVs if they differ in position
#EW = []
#V = []

In [8]:
for i, (e, vs) in enumerate(zip(EW, V)):
    print("EW/EV", i)
    show(e, ", ", vs)


EW/EV 0
EW/EV 1

Gram-Schmidt process for orthonormalization


In [9]:
W = []
U = []
W.append((1/(V[0].norm())) * V[0])
show(LatexExpr("v_0 ="), V[0])
show(LatexExpr("w_0 ="), W[0])
print()
for i, v in enumerate(V[1:]):
    show(LatexExpr("v_{} =".format(i+1)), v)
    s = sum([(v.dot_product(w) * w) for w in W])
    u = v - s
    show(LatexExpr("u_{} =".format(i+1)), u)
    U.append(u)
    w = (1/u.norm()) * u
    show(LatexExpr("w_{} =".format(i+1)), w)
    W.append(w)
    print()
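The same Gram-Schmidt loop can be sketched in plain NumPy (a minimal version, assuming the input vectors are linearly independent; the helper `gram_schmidt` is hypothetical, not part of NumPy):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors."""
    basis = []
    for v in vectors:
        # subtract the projections onto the already-built orthonormal basis
        u = v - sum((v @ w) * w for w in basis)
        basis.append(u / np.linalg.norm(u))
    return basis

W = gram_schmidt([np.array([1.0, 1.0]), np.array([1.0, 0.0])])
print(W)
```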





Compute the column vectors $u_i$ of the matrix $U$

$$ u_i = \frac{1}{\sigma_i} A v_i \quad i = 1, \ldots, r$$

In [10]:
U = []
for e, v in zip(EW, W):
    if e != 0:
        u = (1/sqrt(e)) * A * v
        U.append(u)

# if r < m, complete U to an orthonormal basis of R^m
if len(U) != m:
    assert len(U) == m - 1, "can only complete U by one missing vector here"
    if m == 2:
        # rotate the existing unit vector by 90 degrees
        U.append(vector([-U[0][1], U[0][0]]))
    elif m == 3:
        # the cross product of two orthonormal vectors is a unit vector orthogonal to both
        U.append(U[0].cross_product(U[1]))
    else:
        raise Exception("can't handle dimensions > 3 in this case")

In [11]:
show(*U)
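The relation $u_i = \frac{1}{\sigma_i} A v_i$ can also be checked against the factors NumPy returns (a sketch; note that `np.linalg.svd` returns $V^T$, so the columns of $V$ are the rows of `Vt`):

```python
import numpy as np

A = np.array([[2.0, 2.0], [-2.0, -2.0]])
U, s, Vt = np.linalg.svd(A)
V = Vt.T
# for every nonzero singular value, A v_i equals sigma_i u_i
for i, sigma in enumerate(s):
    if sigma > 1e-12:
        print(np.allclose(A @ V[:, i] / sigma, U[:, i]))
```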



In [12]:
U = matrix(U).T
V = matrix(W).T
Vt = V.T
D = diagonal_matrix(SV)
# Sigma: D in the upper-left block of an m x n zero matrix
S = matrix([[SV[i] if i == j and i < len(SV) else 0 for j in range(n)]
            for i in range(m)])

In [13]:
assert U.dimensions() == (m, m), "dimensions of U don't match: " + str(U.dimensions())
assert S.dimensions() == (m, n), "dimensions of S don't match: " + str(S.dimensions())
assert V.dimensions() == (n, n), "dimensions of V don't match: " + str(V.dimensions())

In [14]:
show("U = ", U, ", S = ", S, ", V = ", V)



In [15]:
show(U * S * Vt, LatexExpr(" = A :"), (U * S * Vt) == A)


Comparison with the Sage implementation of SVD (only available for matrices over RDF, the real double field).


In [16]:
Ar = matrix(RDF, A)
Ur, Sr, Vr = Ar.SVD()
show(Ur, Sr, Vr)



In [17]:
show(Ur * Sr * Vr.T)
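The same cross-check works with NumPy; unlike Sage's `SVD()`, `np.linalg.svd` returns the singular values as a vector and $V^T$ rather than $V$ (a minimal sketch):

```python
import numpy as np

A = np.array([[2.0, 2.0], [-2.0, -2.0]])
U, s, Vt = np.linalg.svd(A)
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)  # embed the singular values in an m x n matrix
print(np.allclose(A, U @ Sigma @ Vt))
```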


Moore-Penrose pseudoinverse

  • Given: a singular matrix $A$ and a vector $v$
  • Sought: $x$ with $A \cdot x = v$

The pseudoinverse can be computed from the SVD as: $$A^{\#} = V \cdot \Sigma^{\#} \cdot U^{T}$$

$$\text{with} \quad \Sigma^{\#} = \begin{pmatrix} D^{-1} & 0 \\ 0 & 0 \\ \end{pmatrix} \in M(n \times m) \quad \text{ and } \quad D^{-1} = \begin{pmatrix} \frac{1}{\sigma_1} & 0 & \ldots & 0 \\ 0 & \frac{1}{\sigma_2} & \ldots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \ldots & \frac{1}{\sigma_r} \\ \end{pmatrix} $$

The system can then be solved via $$ x = A^{\#} \cdot v$$ (for a singular $A$ this is the minimum-norm least-squares solution).
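This construction can be sketched in NumPy (assuming the factors from `np.linalg.svd`; `np.linalg.pinv` computes the pseudoinverse the same way and serves as a cross-check):

```python
import numpy as np

A = np.array([[2.0, 2.0], [-2.0, -2.0]])
U, s, Vt = np.linalg.svd(A)
# Sigma^# is n x m with 1/sigma_i for the nonzero singular values
Sp = np.zeros((A.shape[1], A.shape[0]))
for i, sigma in enumerate(s):
    if sigma > 1e-12:
        Sp[i, i] = 1.0 / sigma
Apinv = Vt.T @ Sp @ U.T
v = np.array([1.0, 2.0])
x = Apinv @ v
print(x)
```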


In [18]:
# input: 
v = vector([1,2])
#
show(v)



In [19]:
# Sigma^# is n x m: 1/sigma_i on the diagonal, zeros elsewhere
Sp = matrix(QQ, n, m)
for i, sv in enumerate(SV):
    Sp[i, i] = 1/sv
show(Sp)



In [20]:
Apinv = V * Sp * U.T
show(Apinv)



In [21]:
x = Apinv * v
show(x)


Using the numpy.linalg module


In [22]:
import numpy
MN = A.numpy()
x = matrix(numpy.linalg.pinv(MN)) * v
show(x)