In [16]:
from IPython.display import Image
In [40]:
Image('darksky.png', width=400)
Out[40]:
The audacious, long-term goal of our research is to map out the large-scale structure of the universe. Specifically, we are interested in resolving dark matter, a key contributor that can only be measured indirectly through its gravitational influence on nearby galaxies. We insert synthetic galaxies into a large box of space filled with dark matter halos from simulations. We then compute the likelihood of a particular dark matter halo mass allocation by measuring the observed shear of the galaxies and comparing it to the results of ray tracing. By sampling many mass allocations and weighting them by their likelihoods, we produce a posterior mass distribution for each dark matter halo.
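As a minimal sketch of that sampling loop (the array names and the toy linear "ray tracing" below are stand-ins, not the actual MassInference code), the inference amounts to importance sampling over halo mass vectors:

```python
import numpy as np

rng = np.random.RandomState(42)

n_halos, n_sources, n_samples = 3, 50, 2000
sigma = 0.25                       # total ellipticity scatter (toy value)

# Toy stand-in for ray tracing: reduced shear responds linearly to halo masses.
response = rng.normal(size=(n_sources, 2, n_halos)) * 1e-15
true_masses = 10 ** rng.uniform(13.5, 14.5, n_halos)            # M_sun
ellipticities = (np.einsum('ijk,k->ij', response, true_masses)
                 + rng.normal(scale=sigma, size=(n_sources, 2)))

# Draw candidate mass allocations from a log-uniform prior.
samples = 10 ** rng.uniform(13.0, 15.0, (n_samples, n_halos))

log_w = np.empty(n_samples)
for s in range(n_samples):
    g = np.einsum('ijk,k->ij', response, samples[s])    # predicted reduced shear
    log_w[s] = -0.5 * np.sum((ellipticities - g) ** 2) / sigma ** 2

# Likelihood weights (shifted by the max for numerical stability).
w = np.exp(log_w - log_w.max())
w /= w.sum()
posterior_mean = w.dot(samples)    # weighted posterior mean mass for each halo
```

In the real framework the linear response is replaced by ray tracing through the lightcone, and the weights use the full log-likelihood defined later in this post.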
Today we are confirming the validity of this approach with simulated data, and in the future we look forward to using our framework to resolve dark matter structures in real surveys. There is also significant potential for inferring galaxy formation and cosmology parameters that would be of high value to the broader astrophysics community.
This project builds on the work done in the original 'Pangloss' paper, 'Reconstructing the Lensing Mass in the Universe from Photometric Catalogue Data', and on Spencer Everett's thesis.
In [38]:
Image('lightcone.png', width=400)
Out[38]:
The core idea of the framework is captured in the probabilistic graphical model below.
In [37]:
Image('simple_mc_pgm.png', width=400)
Out[37]:
We have sets of foregrounds and backgrounds (which we may refer to as 'halos' and 'sources' below), along with the variables shown in the graphical model above.
The likelihood of a set of parameter values, $\theta$, given outcomes $x$, is equal to the probability of those observed outcomes given those parameter values, that is
$$\mathcal{L}(\theta | x) = P(x | \theta)$$

In this context, under the assumption that the errors are independent and Gaussian, the log-likelihood is
$$\ln\mathcal{L} = -N\ln(2\pi\sigma^2) - \frac{1}{2}\sum_{i=1}^{N}\sum_{j=1}^{2}\frac{(\epsilon_{i,j} - g_{i,j})^2}{\sigma^2}$$

where $N$ is the number of sources, $\sigma = \sqrt{\sigma_{obs}^2+\sigma_{int}^2}$ combines the observational and intrinsic ellipticity scatter, $\epsilon_{i,j}$ is the $j$th ellipticity component of the $i$th source, and $g_{i,j}$ is the corresponding reduced shear. Each iteration of our inference computes a log-likelihood and uses it to weight the sampled dark matter halo mass vector, building up a posterior for each dark matter halo.
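This expression translates directly into numpy; here is a sketch (the function name and array shapes are illustrative, not the actual MassInference API):

```python
import numpy as np

def log_likelihood(eps, g, sigma_obs, sigma_int):
    """Gaussian log-likelihood of observed ellipticities given predicted shear.

    eps, g : arrays of shape (N, 2) -- the two ellipticity components of the
             N sources, observed and predicted (reduced shear) respectively.
    """
    n = eps.shape[0]
    sigma2 = sigma_obs ** 2 + sigma_int ** 2   # sigma^2 = sigma_obs^2 + sigma_int^2
    return -n * np.log(2 * np.pi * sigma2) - 0.5 * np.sum((eps - g) ** 2) / sigma2
```

Because only differences in log-likelihood matter when weighting samples, the constant first term can be dropped in practice; keeping it simply makes the function match the formula above.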
In [44]:
Image('smhm.png', width=400)
Out[44]:
In [43]:
Image('pangloss_inference_1.png', width=1000)
Out[43]:
The rewritten framework, MassInference, is designed to be:

- Lean
- Fast
- Reproducible
- Tested
- Modular
- Robust
Below we have the call structures for Pangloss and MassInference. The MassInference call structure has been whittled down to sequences of numpy operations, and it eliminates some duplication that existed in Pangloss.
In [51]:
print('Pangloss Call Structure')
Image('/Users/user/Desktop/pang_prof.png', width=1000)
Out[51]:
In [50]:
print('MassInference Call Structure')
Image('/Users/user/Desktop/mi_prof.png', width=400)
Out[50]:
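As an aside, profiles like the ones above can be produced with Python's built-in cProfile and pstats modules; the profiled function below is just a placeholder for a full inference run:

```python
import cProfile
import pstats

def run_inference():
    # Placeholder workload standing in for one full inference run.
    return sum(i * i for i in range(10 ** 6))

cProfile.run('run_inference()', 'inference.prof')    # save the profile to disk
stats = pstats.Stats('inference.prof')
stats.sort_stats('cumulative').print_stats(10)       # top 10 by cumulative time
```

Tools such as gprof2dot or snakeviz can then render the saved profile as a call graph like the ones shown above.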
...
Thanks to Professor Wechsler, the supportive galaxy formation and cosmology group, and my great mentor Dr. Phil Marshall for making this opportunity possible. I have been impressed by how accessible Phil is across multiple communication channels (email, Slack, GitHub, Google Docs) and inspired by his interest in and appreciation for computational science.