Requirements:
Conda is the recommended way to install the dependencies, but note that this setup is still fairly new: the packages have only been built and tested on a few platforms.
Grid generation with stomel can be driven either from the command-line program tom.py or through the programmatic approach shown here.
In [1]:
%matplotlib inline
In [2]:
# Import a bunch of libraries.
import numpy as np
import matplotlib.pyplot as plt
from stomel import paver
from stomel import field
from stomel import constrained_delaunay
In [3]:
# Define the domain: a square outer boundary and a triangular island
boundary=np.array([[0,0],[1000,0],[1000,1000],[0,1000]])
island =np.array([[200,200],[600,200],[200,600]])
rings=[boundary,island]
In [4]:
# And the scale:
scale=field.ConstantField(50)
In [5]:
p=paver.Paving(rings=rings,density=scale)
p.pave_all() # writes cryptic progress messages - takes about 30s
In [6]:
p.plot().set_color('g')
plt.axis('equal')
plt.title('Constant scale grid')
Out[6]:
Scales are set by various subclasses of Field, which are simply different ways of representing a scalar function over the $x$-$y$ plane. The simplest is ConstantField, but the most common for grid generation (and the two which are supported by the command-line tom.py script) are ConstrainedDelaunayField and ApolloniusField.
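Conceptually, a scale field is just a scalar function $f(x,y)$ giving the target edge length at each point. As a stand-alone sketch of that idea (the `constant_field` helper below is a hypothetical stand-in, not stomel's actual `ConstantField` interface):

```python
import numpy as np

# Hypothetical stand-in illustrating the Field concept; the real
# stomel classes have their own interfaces.
def constant_field(value):
    """Return a scale function that evaluates to `value` everywhere."""
    def f(xy):
        xy = np.asarray(xy, dtype=float)
        # One scale per input point, ignoring the coordinates entirely.
        return np.full(xy.shape[:-1], value)
    return f

scale = constant_field(50.0)
pts = np.array([[0.0, 0.0], [500.0, 500.0], [1000.0, 0.0]])
print(scale(pts))  # [50. 50. 50.]
```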
ApolloniusField (named for the underlying data structure, the Apollonius graph) is an automatically telescoping field. You provide points and scales at those points, and the scale is allowed to grow (telescope) away from them at a constant rate (defaulting to 10%). This is handy when some feature should be resolved at higher resolution but you want to gradually return to the background resolution. In the example below, a sinusoidal river is given a higher resolution, and resolution in the rest of the domain is based on the distance to the "river."
The Apollonius graph implementation in CGAL does not get much attention and is missing from stock Python bindings. The rustychris channel's python-cgal-bindings package includes it, compiled from a fork of python-cgal-bindings.
In [7]:
# Second example - telescoping scale
square=np.array([[0,0],[1000,0],[1000,1000],[0,1000]])
x=np.linspace(0,1000,100)
# make a meandering channel where resolution will be higher
# and let the resolution grow at the default rate (10%)
channel=np.array( [x,300+100*np.cos(x/200.)]).T
channel_scale=field.ApolloniusField(X=channel,F=20*np.ones(len(channel)))
In [8]:
p2=paver.Paving(rings=[square],density=channel_scale)
p2.pave_all() # might take a minute or two...
In [9]:
plt.figure(figsize=(6,6))
p2.plot()
plt.plot(channel[:,0],channel[:,1],'k-')
plt.legend(['"river"'])
plt.axis('equal')
plt.axis(xmin=-50,xmax=1050,ymin=-50,ymax=1050)
Out[9]:
In [10]:
plt.figure(figsize=(6,6))
plt.clf()
p2.density.to_grid(dx=20,dy=20,bounds=p2.bounds()).plot(interpolation='none')
plt.colorbar(label='Target edge length')
Out[10]:
The other way to describe variations in scale is linear interpolation. In the most basic form, a set of points is given, each with an associated scale, and standard linear interpolation (here, over a triangulation of the points) fills in the scale between them.
To gain more control over the interpolation, you can also specify polylines, with a scale at each vertex of each polyline. The triangulation used for interpolation then incorporates the polylines as edges (a constrained Delaunay triangulation), which leads to fewer surprises when the polylines are sparse.
Unlike the telescoping field, with linearly interpolated fields it is up to the user to avoid overly sharp changes in scale. Following the 10% rule of thumb (adjacent cells shouldn't vary in length scale by more than 10%), an absolute change in scale of $d$ requires a distance of $10d$. The key is that this is the absolute change in scale: whether interpolating from 1000 to 900 or from 200 to 100, you need the same $10(1000-900)=10(200-100)=1000$ units of separation.
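The separation requirement above reduces to a one-line calculation; this helper simply evaluates the rule of thumb:

```python
def min_separation(scale_a, scale_b, rate=0.1):
    """Distance needed to transition between two scales at `rate`
    (10% rule of thumb): the absolute change divided by the rate."""
    return abs(scale_a - scale_b) / rate

print(min_separation(1000, 900))  # 1000.0
print(min_separation(200, 100))   # 1000.0
```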
In [11]:
# Rectangular shoreline
rect=np.array([[0,0],[4000,0],[4000,1000],[0,1000]])
# The background, coarse scale, defined along a polyline
# surrounding the entire domain.
# A linear scale field does not support extrapolation - the shoreline
# must fall entirely within the convex hull of the scale polylines.
bg=np.array([[-10,-10],[4010,-10],[4010,1010],[2000,1010]])
bg_scale=180*np.ones(len(bg))
# and one corner with higher resolution.
banks=np.array([[-10,800],[-10,1010],[200,1010]])
banks_scale=80*np.ones(len(banks))
lin_scale=constrained_delaunay.ConstrainedXYZField.from_polylines([bg,banks],
[bg_scale,banks_scale])
p3=paver.Paving(rings=[rect],density=lin_scale)
In [12]:
# See how the interpolation actually distributed the scales
plt.figure(figsize=(15,4))
p3.density.to_grid(dx=50,dy=50).plot()
t=p3.density.tri()
plt.triplot(t,color='m')
plt.colorbar(label='Target edge length')
plt.title('Linearly interpolated scale: triangulation and result')
Out[12]:
In [13]:
p3.pave_all() # about 120s on reasonable desktop
In [14]:
plt.figure(figsize=(15,4))
coll=p3.plot()
coll.set_color('g')
plt.axis('equal')
plt.axis(ymin=-50,ymax=1050,xmin=-50,xmax=4050)
plt.title('Resulting Mesh')
Out[14]: