Start by loading some boilerplate: matplotlib, numpy, scipy, json, functools, and a convenience class.
In [1]:
%matplotlib inline
import matplotlib.pyplot as plt
import numpy as np
from scipy.interpolate import interp1d
from scipy.optimize import bisect
import json
from functools import partial
class Foo: pass  # simple namespace to hold configuration values
And some more specialized dependencies:
- Slict provides a convenient slice-able dictionary interface
- Chest is an out-of-core dictionary that we'll hook directly to a globus remote using...
- glopen is an open-like context manager for remote globus files
In [2]:
from chest import Chest
from slict import CachedSlict
from glopen import glopen, glopen_many
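To make the slice-able interface concrete before pointing it at remote data, here is a minimal local sketch. It assumes CachedSlict can wrap any tuple-keyed mapping (a plain dict here) the same way it wraps the Chest below; the keys and values are made up.
In [ ]:
# Hypothetical tuple-keyed mapping standing in for a Chest of results.
toy = {(0.0, 'frame'): 1, (0.0, 'H'): -0.10,
       (0.5, 'frame'): 2, (0.5, 'H'): -0.25}
toy_slict = CachedSlict(toy)
print(toy_slict[:, 'H'].keys())   # the times that have an 'H' entry
print(toy_slict[0.5, 'H'])        # a single value: -0.25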
Configuration for this figure.
In [3]:
config = Foo()
#config.name = "HighAspect/Nu02D02/HA_Nu02D02"
config.name = "HighAspect/HA_base/HA_base"
config.arch_end = "maxhutch#alpha-admin/~/pub/"
Open a chest located on a remote globus endpoint and load a remote json configuration file.
In [23]:
c = Chest(path="{:s}-results".format(config.name),
          open=partial(glopen, endpoint=config.arch_end),
          open_many=partial(glopen_many, endpoint=config.arch_end))
sc = CachedSlict(c)
with glopen(
        "{:s}.json".format(config.name), mode='r',
        endpoint=config.arch_end,
        ) as f:
    params = json.load(f)
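Only a few of the entries in params are used below; as a quick orientation, this hypothetical peek prints them (the key names are taken from the later cells).
In [ ]:
# Configuration entries that the analysis below relies on.
for key in ("atwood", "order", "shape_mesh", "root_mesh", "extent_mesh"):
    print(key, params.get(key))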
We want to plot the spike depth, which is stored in the 'H' field of the chest, and to look at a few full frames. Chests can prefetch lists of keys more quickly than individual ones, so we'll prefetch everything at three representative times.
In [27]:
# Pick three representative times: the first, eleventh, and last snapshots.
t_start = sc[:,'frame'].keys()[0]
t_mid = sc[:,'frame'].keys()[10]
t_end = sc[:,'frame'].keys()[-1]
c.prefetch(sc[t_start,:].full_keys())
c.prefetch(sc[t_mid,:].full_keys())
c.prefetch(sc[t_end,:].full_keys())
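If fewer round trips are wanted, the three prefetches could presumably be combined into a single call, since prefetch already takes a list of keys; a sketch, assuming full_keys() returns an iterable of keys:
In [ ]:
# One combined prefetch instead of three separate ones (untested sketch).
all_keys = (list(sc[t_start, :].full_keys())
            + list(sc[t_mid, :].full_keys())
            + list(sc[t_end, :].full_keys()))
c.prefetch(all_keys)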
First, take a look at the 't_yz' temperature slices at those three times.
In [25]:
print("The last time value is {:f}".format(t_end))
#plt.figure(figsize=(32,2))
nl = sc[t_end,'t_yz'].shape[1]
fig, axs = plt.subplots(1,3)
axs[0].imshow(sc[t_start,'t_yz'][:,nl/2-100:nl/2+100].transpose())
axs[1].imshow(sc[t_mid,'t_yz'][:,nl/2-100:nl/2+100].transpose())
axs[2].imshow(sc[t_end,'t_yz'][:,nl/2-100:nl/2+100].transpose())
Out[25]: [three t_yz slices at t_start, t_mid, and t_end]
The spike depth, 'H', is the smallest depth for which the span-wise average of the density is 99% of its asymptotic value. What do the density profiles themselves look like?
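Before looking at the profiles, here is that criterion in code form on a made-up profile. The numbers (an asymptote of 0.05 standing in for the Atwood number and a tanh-shaped profile) are purely illustrative, but the interpolate-and-bisect idea is the same one applied to the real profiles further down.
In [ ]:
# Illustrative spike-depth calculation on a synthetic profile (not real data).
atwood = 0.05                                          # stand-in asymptotic value
z = np.linspace(-1.0, 0.0, 256)                        # heights below the mid-plane
p = atwood * 0.5 * (1.0 - np.tanh((z + 0.4) / 0.05))   # fake span-wise averaged density
f = interp1d(z, p - 0.99 * atwood)                     # zero where p hits 99% of the asymptote
H_toy = bisect(f, z[0], 0.0)                           # depth where the profile crosses the threshold
print(H_toy)                                           # about -0.51 for this fake profile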
The density profiles are stored as 't_proj_z': the temperature projected onto the z axis. Let's prefetch it.
In [7]:
c.prefetch(sc[:,'t_proj_z'].full_keys())
Now let's plot it.
In [8]:
Ts = sc[:,'t_proj_z'].keys()
Zs = np.linspace(params["root_mesh"][2], params["extent_mesh"][2], sc[Ts[0], 't_proj_z'].shape[0])
for t in Ts:
    plt.plot(Zs, sc[t,'t_proj_z']/(params["shape_mesh"][0]*(params["order"]-1))**2, 'k-')
plt.xlim(xmin=params["root_mesh"][2], xmax=params["extent_mesh"][2]);
plt.xlabel("Height (m)");
plt.ylabel(r"$\Delta \rho / \bar{\rho}$");
plt.title("Density profiles");
That doesn't look great. Let's zoom in. The data is cached, so this is fast!
In [9]:
Ts = sc[:,'t_proj_z'].keys()
Zs = np.linspace(params["root_mesh"][2], params["extent_mesh"][2], sc[Ts[0], 't_proj_z'].shape[0])
for t in Ts:
    plt.plot(Zs, sc[t,'t_proj_z']/(params["shape_mesh"][0]*(params["order"]-1))**2, 'k-')
plt.xlim(xmin=-0.8, xmax=0.8);
plt.xlabel("Height (m)");
plt.ylabel(r"$\Delta \rho / \bar{\rho}$");
plt.title("Density profiles");
Now where's that 99% threshold?
In [10]:
Ts = sc[:,'t_proj_z'].keys()
Zs = np.linspace(params["root_mesh"][2], params["extent_mesh"][2], sc[Ts[0], 't_proj_z'].shape[0])
for t in Ts:
    plt.plot(Zs, sc[t,'t_proj_z']/(params["shape_mesh"][0]*(params["order"]-1))**2, 'k-')
plt.plot([-0.8, 0.], params['atwood']*np.array([.98, .98]));
plt.xlim(xmin=-0.8, xmax=0);
plt.xlabel("Height (m)");
plt.ylabel(r"$\Delta \rho / \bar{\rho}$");
plt.title("Density profiles");
plt.ylim(ymin=0.9*params['atwood'], ymax=1.1*params['atwood']);
Let's re-do the analysis with a 90% threshold to make sure nothing changes.
In [11]:
Ts = sc[:,'t_proj_z'].keys()
Zs = np.linspace(params["root_mesh"][2], params["extent_mesh"][2], sc[Ts[0], 't_proj_z'].shape[0])
my_H = []
for t in Ts:
    p = sc[t, 't_proj_z']/(params["shape_mesh"][0]*(params["order"]-1))**2
    f = interp1d(Zs, p-(.90*params['atwood']))
    my_H.append(bisect(f, params["root_mesh"][2], 0))
plt.title("Spike depth")
plt.xlabel("Time (s)")
plt.ylabel("Depth (m)")
plt.plot(sc[:,'H'].keys(), sc[:,'H'].values());
plt.plot(sc[:,'H'].keys(), my_H);
That didn't change much. What about 80%?
In [12]:
Ts = sc[:,'t_proj_z'].keys()
Zs = np.linspace(params["root_mesh"][2], params["extent_mesh"][2], sc[Ts[0], 't_proj_z'].shape[0])
my_H = []
for t in Ts:
    p = sc[t, 't_proj_z']/(params["shape_mesh"][0]*(params["order"]-1))**2
    f = interp1d(Zs, p-(.80*params['atwood']))
    my_H.append(bisect(f, params["root_mesh"][2], 0))
plt.title("Spike depth")
plt.xlabel("Time (s)")
plt.ylabel("Depth (m)")
plt.plot(sc[:,'H'].keys(), sc[:,'H'].values());
plt.plot(sc[:,'H'].keys(), my_H);
That introduces an offset, but the spike velocity (the slope) looks the same at late times. I've convinced myself that the results in the original figure are robust to the choice of threshold.
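As a rough way to quantify "the slope looks the same", one could fit the late-time portion of both curves. A sketch, reusing my_H from the cell above and assuming the 'H' time keys line up with Ts, as the plots above imply:
In [ ]:
# Compare late-time slopes (spike velocities) of the stored and recomputed H.
times = np.array(sc[:, 'H'].keys())
H_ref = np.array(sc[:, 'H'].values())
late = times > 0.5 * times[-1]                     # arbitrary "late time" cutoff
v_ref = np.polyfit(times[late], H_ref[late], 1)[0]
v_new = np.polyfit(times[late], np.array(my_H)[late], 1)[0]
print(v_ref, v_new)                                # should be close if only an offset changed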
In [13]:
%install_ext http://raw.github.com/jrjohansson/version_information/master/version_information.py
%load_ext version_information
%version_information numpy, matplotlib, slict, chest, glopen, globussh
Out[13]: [version information table]