A typical discretization scheme: given a domain $\Omega$, split it into small triangles/quadrilaterals.
$$\Omega_h = T_1 \cup \ldots \cup T_N.$$ With each element one associates a basis function $u_i$ (e.g., a piecewise-linear hat function),
and approximate the solution as a linear combination:
$$u \approx u_h = \sum_{i=1}^N c_i u_i.$$ For piecewise-linear finite elements the approximation error behaves like
$$\Vert u - u_h \Vert_{H^1} = \mathcal{O}(h),$$ i.e. this is a low-order finite element method, and high accuracy requires very fine (huge) meshes.
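To make the $\mathcal{O}(h)$ behaviour concrete, here is a minimal sketch (not a full finite-element solve): it measures the $H^1$ seminorm error of piecewise-linear *interpolation* of a smooth function on successively refined uniform meshes; the test function $\sin(\pi x)$, the mesh sizes, and the use of interpolation as a proxy for the Galerkin solution are illustrative assumptions.

```python
# A minimal sketch (assumption: interpolation error as a proxy for the FEM error):
# the H^1 seminorm error of piecewise-linear interpolation of u(x) = sin(pi x)
# on uniform meshes of [0, 1] should decay roughly like O(h).
import numpy as np

def h1_seminorm_error(n_elements, n_samples=4000):
    """Approximate |u - u_h|_{H^1} for the piecewise-linear interpolant of sin(pi x)."""
    nodes = np.linspace(0.0, 1.0, n_elements + 1)
    x = np.linspace(0.0, 1.0, n_samples, endpoint=False) + 0.5 / n_samples  # midpoints
    du_exact = np.pi * np.cos(np.pi * x)                        # u'(x)
    slopes = np.diff(np.sin(np.pi * nodes)) / np.diff(nodes)    # u_h' is piecewise constant
    elem = np.clip(np.searchsorted(nodes, x, side="right") - 1, 0, n_elements - 1)
    du_h = slopes[elem]
    # Midpoint rule for the integral of (u' - u_h')^2 over [0, 1]
    return np.sqrt(np.mean((du_exact - du_h) ** 2))

for n in (8, 16, 32, 64):
    print(f"h = 1/{n:2d}:  |u - u_h|_H1 ~ {h1_seminorm_error(n):.4e}")
# Halving h should roughly halve the error, consistent with O(h).
```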
The B-splines in 1D are defined recursively.
Order-0 B-splines are piecewise-constant functions.
Higher-order B-splines can be defined from the truncated power functions
$$x^k_+ = \begin{cases} x^k, & x \geq 0, \\ 0, & \mbox{otherwise}. \end{cases}$$
B-splines in higher dimensions are obtained by taking tensor products of 1D B-splines.
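As a sketch of the recursive construction (the Cox–de Boor recursion), the snippet below builds order-$k$ B-splines from order-$(k-1)$ ones, starting from piecewise-constant indicator functions; the uniform knot vector and the evaluation points are illustrative assumptions.

```python
# A minimal sketch of the recursive (Cox-de Boor) definition: order-0 B-splines
# are indicators of knot intervals, and each higher order mixes two B-splines of
# the previous order. The uniform knot vector below is an illustrative assumption.
import numpy as np

def bspline_basis(i, k, knots, x):
    """Evaluate the i-th B-spline of order/degree k on `knots` at points x."""
    x = np.asarray(x, dtype=float)
    if k == 0:
        return np.where((knots[i] <= x) & (x < knots[i + 1]), 1.0, 0.0)
    left_den = knots[i + k] - knots[i]
    right_den = knots[i + k + 1] - knots[i + 1]
    left = 0.0 if left_den == 0.0 else \
        (x - knots[i]) / left_den * bspline_basis(i, k - 1, knots, x)
    right = 0.0 if right_den == 0.0 else \
        (knots[i + k + 1] - x) / right_den * bspline_basis(i + 1, k - 1, knots, x)
    return left + right

knots = np.arange(8.0)                              # uniform knots 0, 1, ..., 7
x = np.linspace(0.0, 7.0, 15)
print(np.round(bspline_basis(2, 3, knots, x), 3))   # one cubic B-spline "bump" supported on [2, 6]
```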
Typical story: the set of NURBS patches is given, then this surface is triangulated by some internal mesher (often of rather poor quality) to turn the model into a polygonal one.
Then we need remeshing and adaptive mesh generation.
The idea of isogeometric analysis by T. J. R. Hughes, J. A. Cottrell, and Y. Bazilevs (CMAME, 2005) is to use the NURBS patch representation directly to discretize the PDEs.
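For illustration, here is a minimal sketch of the rational B-spline (NURBS) representation that isogeometric analysis reuses as its analysis basis: a curve $C(u) = \sum_i N_i(u)\, w_i P_i \,/\, \sum_i N_i(u)\, w_i$, evaluated with `scipy.interpolate.BSpline`. The quarter-circle control points and weights are a standard textbook example, chosen here only as an assumption for the demo.

```python
# A minimal sketch (assumption: a standard quarter-circle NURBS arc as the example
# geometry) of the rational B-spline representation that IGA works with directly:
# C(u) = sum_i N_i(u) w_i P_i / sum_i N_i(u) w_i.
import numpy as np
from scipy.interpolate import BSpline

degree = 2
knots = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])          # open (clamped) knot vector
ctrl = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])     # control points P_i
w = np.array([1.0, 1.0 / np.sqrt(2.0), 1.0])              # weights w_i

# The rational curve is a ratio of two polynomial B-splines:
# numerator with coefficients w_i * P_i, denominator with coefficients w_i.
numerator = BSpline(knots, ctrl * w[:, None], degree)
denominator = BSpline(knots, w, degree)

u = np.linspace(0.0, 1.0 - 1e-12, 5)                      # stay inside the knot span
points = numerator(u) / denominator(u)[:, None]
print(np.round(points, 4))
print("radii:", np.round(np.linalg.norm(points, axis=1), 6))  # all 1: the arc lies exactly on the unit circle
```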