In [4]:
# Configure the livereveal (RISE) slideshow: theme, transition, starting slide
from IPython.html.services.config import ConfigManager
from IPython.utils.path import locate_profile
cm = ConfigManager(profile_dir=locate_profile(get_ipython().profile))
cm.update('livereveal', {
    'theme': 'sky',
    'transition': 'zoom',
    'start_slideshow_at': 'selected',
})
A rank-$r$ matrix can be represented as
$$A = C \widehat{A}^{-1} R, $$where $C$ consists of $r$ columns of the matrix $A$, $R$ consists of $r$ rows of the matrix $A$, and $\widehat{A}$ is the (nonsingular) $r \times r$ submatrix on their intersection.
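A quick NumPy check of this identity (a minimal sketch, not from the lecture code: the row/column indices are picked at random, which gives a nonsingular $\widehat{A}$ with probability one for a generic rank-$r$ matrix):
In [ ]:
import numpy as np
np.random.seed(0)
n, r = 100, 5
A = np.random.randn(n, r) @ np.random.randn(r, n)       # exactly rank-r matrix
rows = np.random.choice(n, r, replace=False)            # r random rows
cols = np.random.choice(n, r, replace=False)            # r random columns
C = A[:, cols]                                          # selected columns
R = A[rows, :]                                          # selected rows
A_hat = A[np.ix_(rows, cols)]                           # submatrix on their intersection
A_skel = C @ np.linalg.solve(A_hat, R)
print(np.linalg.norm(A - A_skel) / np.linalg.norm(A))   # ~ 1e-14: the identity is exact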
Approximate case:
If $\widehat{A}$ is the $r \times r$ submatrix with maximal volume (where volume is the absolute value of the determinant), then the skeleton approximation is quasi-optimal in the Chebyshev norm: $\Vert A - C \widehat{A}^{-1} R \Vert_C \leq (r+1) \sigma_{r+1}(A)$ (Goreinov, Tyrtyshnikov).
Idea of the cross approximation: select the submatrix that (approximately) maximizes the volume in a greedy fashion, one pivot at a time.
The term "cross" comes from the rank-$1$ update formula
$$A_{ij} := A_{ij} - \frac{A_{i j^*} A_{i^* j}}{A_{i^* j^*}},$$where the pivots $(i^*, j^*)$ has to be selected in such a way that $|A_{i^* j^*}|$ is as big as possible.
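A minimal dense sketch of this greedy procedure with full pivoting (the helper cross_approximation is our illustration, not h2py's API): it applies the rank-$1$ update above repeatedly, collecting scaled columns and rows of the residual.
In [ ]:
import numpy as np

def cross_approximation(A, tol=1e-10, max_rank=None):
    """Greedy cross approximation with full pivoting: A ~ U @ V."""
    E = np.array(A, dtype=float)                 # residual matrix
    max_rank = max_rank or min(A.shape)
    U, V = [], []
    for _ in range(max_rank):
        i, j = np.unravel_index(np.argmax(np.abs(E)), E.shape)
        if abs(E[i, j]) < tol:
            break
        u = E[:, j] / E[i, j]                    # scaled column at the pivot
        v = E[i, :].copy()                       # row at the pivot
        E -= np.outer(u, v)                      # the "cross" rank-1 update
        U.append(u)
        V.append(v)
    return np.array(U).T, np.array(V)

x = np.linspace(0, 1, 200)
A = 1.0 / (1.0 + (x[:, None] - x[None, :])**2)   # smooth kernel, numerically low rank
U, V = cross_approximation(A, tol=1e-8)
print(U.shape[1], np.linalg.norm(A - U @ V) / np.linalg.norm(A))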
Solution: Approximate the "far zone" with few receivers ("representor set").
In [16]:
import os
# force single-threaded OpenMP/MKL
os.environ['OMP_NUM_THREADS'] = '1'
os.environ['MKL_NUM_THREADS'] = '1'
from h2py.tree.quadtree import ParticlesQuadTree as PQT
from h2py.tree.octtree import ParticlesOctTree as POT
from h2py.tree.inertial_tree import ParticlesInertialTree as PIT
from h2py.data.particles_data import ParticlesData as PD
from h2py.problem import Problem
from h2py.data.particles import log_distance
inv_distance = log_distance
import numpy as np
from time import time
import sys

N = 20000
np.random.seed(0)
particles = np.random.rand(2, N)       # N random points in the unit square
data = PD(particles)
tree = PIT(data, block_size=20)        # inertial cluster tree, at most 20 points per leaf
problem = Problem(inv_distance, tree, tree)
problem.build()
problem.gen_queue(symmetric=0)
print('H2 1e-5')
problem.factorize('h2', tau=1e-5, onfly=0, verbose=1, iters=1)
Consider the matrix $$A = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22}\end{bmatrix},$$
where the first group of variables corresponds to the unknowns in the first node and the second group corresponds to the unknowns in the second node (a binary tree is implicitly assumed).
Then,
$$A \begin{bmatrix} q_1 \\ q_2 \end{bmatrix} = \begin{bmatrix} f_1 \\ f_2 \end{bmatrix}.$$After eliminating $q_1$ we get
$$q_1 = A^{-1}_{11} f_1 - A^{-1}_{11} A_{12} q_2, \quad \underbrace{(A_{22} - A_{21} A^{-1}_{11} A_{12})}_{S, \ \mbox{the Schur complement}} q_2 = f_2 - A_{21} A^{-1}_{11} f_1.$$The core idea is recursion: if we know $A^{-1}_{11}$, then we can form the Schur complement $S$ and invert it as well.
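A dense NumPy check of these formulas (a small sketch on a random well-conditioned matrix; in the fast solvers every solve with $A_{11}$ and $S$ becomes a recursive call):
In [ ]:
import numpy as np
np.random.seed(1)
n = 6
A = np.random.randn(2 * n, 2 * n) + 2 * n * np.eye(2 * n)   # well-conditioned test matrix
f = np.random.randn(2 * n)
A11, A12 = A[:n, :n], A[:n, n:]
A21, A22 = A[n:, :n], A[n:, n:]
f1, f2 = f[:n], f[n:]
S = A22 - A21 @ np.linalg.solve(A11, A12)                   # Schur complement
q2 = np.linalg.solve(S, f2 - A21 @ np.linalg.solve(A11, f1))
q1 = np.linalg.solve(A11, f1 - A12 @ q2)
print(np.linalg.norm(A @ np.concatenate([q1, q2]) - f))     # ~ 1e-13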
The multiplication of H-matrices has $\mathcal{O}(N \log N)$ complexity, and is also (typically) implemented via recursion.
Consider a 1D partitioning and the multiplication of two matrices with H-structure (i.e., the $(1,2)$ and $(2,1)$ blocks have low rank):
$$\begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22}\end{bmatrix}\begin{bmatrix} B_{11} & B_{12} \\ B_{21} & B_{22}\end{bmatrix}.$$The $(2, 1)$ block of the result is
$$ \underbrace{A_{21} B_{11}}_{\mbox{low rank}} + \underbrace{A_{22} B_{21}}_{\mbox{low rank}}, $$while the $(1, 1)$ and $(2, 2)$ blocks are evaluated recursively, so it is a recursion inside a recursion.
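A short sketch of why this block stays cheap (the factor names Ua, Va, Ub, Vb are ours for illustration): the sum of the two low-rank products is itself low rank, so it is kept in factored form and never assembled densely.
In [ ]:
import numpy as np
rng = np.random.default_rng(0)
n, r = 100, 4
Ua, Va = rng.standard_normal((n, r)), rng.standard_normal((r, n))  # A21 = Ua @ Va
Ub, Vb = rng.standard_normal((n, r)), rng.standard_normal((r, n))  # B21 = Ub @ Vb
B11 = rng.standard_normal((n, n))    # diagonal blocks are handled recursively
A22 = rng.standard_normal((n, n))
# (2,1) block of the product: A21 @ B11 + A22 @ B21, in factored form
C = np.hstack([Ua, A22 @ Ub])        # n x 2r
D = np.vstack([Va @ B11, Vb])        # 2r x n
print(np.linalg.matrix_rank(C @ D))  # <= 2r = 8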
Sparse matrices coming from PDEs are in fact H-matrices as well!
So, we can compute the inverse by the same block factorization technique.
It works very well for the "1D" partitioning (i.e., when the off-diagonal blocks are of low rank), but does not reach optimal complexity for 2D/3D problems (though the constants can be really good).
We also have our own idea (and since it is unpublished, we will use the whiteboard magic here :)
In [4]:
from IPython.core.display import HTML
def css_styling():
    styles = open("./styles/custom.css", "r").read()
    return HTML(styles)
css_styling()