Image Stacking Exercise

Written by Gautham Narayan (gnarayan@stsci.edu) for LSST DSFP #5

In the last couple of exercises, you determined the WCS solution for a couple of images and reprojected them onto a common grid. That process is slow, so for this exercise I'm just going to give you a lot of reprojected images to begin with. While it's not "Big Data," it's bigger than GitHub will manage, so I've stored it on Dropbox. Download this and extract it:

https://www.dropbox.com/s/7cpuegjdxv6lte7/bundle_sw.tar.gz?dl=0
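
For example, from within the notebook (a sketch, assuming you have wget; changing dl=0 to dl=1 in the URL makes Dropbox serve the file directly, and you may need to adjust the extraction path so that ../wdd7_6/warped/ ends up where the cells below expect it):


In [ ]:
!wget -O bundle_sw.tar.gz "https://www.dropbox.com/s/7cpuegjdxv6lte7/bundle_sw.tar.gz?dl=1"
!tar -xzf bundle_sw.tar.gz -C ..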

These images were reprojected using the SWarp code, and in the process we also produced a (gzipped) noise (sigma) map and a weight (mask) map for each image.


In [ ]:
!ls ../wdd7_6/warped/ | head -n 10

Both of these extra images can be used to generate a per-image weight for stacking, or when doing image subtraction. Depending on how you proceed with this exercise, you can use one or both, or ignore them altogether.

Begin by taking a look at a single image's FITS header.


In [ ]:
!imhead ../wdd7_6/warped/wdd7.040920_0452.051_6.sw.fits

You might also want to look at the data with ds9 and get a sense for it (it's ALWAYS a good idea to check). Are the images really aligned, or did I just have no idea what I was doing back when I was a wee grad student (it's ALWAYS a good idea to check...)?

For example, did I actually take out the background? Are the PSFs similar-ish? Are the exposure times the same? WCSTools' gethead is your friend.
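
For example, you could pull a few candidate keywords out of every image at once (a sketch; the keyword names are just the suggestions from below, and the glob pattern assumes all the science images end in .sw.fits):


In [ ]:
!gethead ../wdd7_6/warped/*.sw.fits EXPTIME SKYADU ZPTMAG AIRMASS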


In [ ]:
%matplotlib notebook
import numpy as np
import matplotlib.pyplot as plt
import astropy.io.fits as afits
### astropy can seamlessly handle gzipped FITS images

Then, write a function that takes a filename and loads the image data, along with any header keywords you think might be useful for weighting the images when producing a stack. (Hint: SKYADU might be useful, maybe AIRMASS, ZPTMAG, EXPTIME - I'm just tossing out suggestions; you can pick some and examine what happens with different choices.)


In [ ]:
#### You get to do this ####
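#
# A minimal sketch of one way to do this (the keyword names are just the
# suggestions above and may not all be present in every header; this also
# assumes the image lives in the primary HDU):
#
# def load_image(filename, keys=('SKYADU', 'ZPTMAG', 'EXPTIME', 'AIRMASS')):
#     with afits.open(filename) as hdul:
#         data = hdul[0].data.astype(np.float64)
#         meta = {key: hdul[0].header.get(key) for key in keys}
#     return data, meta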

Load the data into whatever structure you like - numpy (masked) array, list, dictionary - whatever you are comfortable slicing and dicing.


In [ ]:
#### You get to do this ####
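#
# One possible sketch: stack everything into a 3D numpy array, reusing the
# (hypothetical) load_image() from the previous cell; the glob pattern
# assumes all the science images end in .sw.fits:
#
# import glob
# files = sorted(glob.glob('../wdd7_6/warped/*.sw.fits'))
# images, metas = zip(*[load_image(f) for f in files])
# cube = np.array(images)  # shape (n_images, ny, nx)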

Now that the data is loaded, decide how you'd like to weight the images, and normalize the weights in some sensible way. You'll be comparing this against a median stack and an unweighted simple average.


In [ ]:
#### You get to do this ####

# Here's an example for weights
#
# weights = 10.**(-0.4*zptmag) # I'm just using the flux zeropoint to set the weights
# wsum = np.sum(weights)
# weights /= wsum

Create the stacks: median, unweighted mean, and a weighted mean using your weighting function.

If you decided to use the masks, make sure you use the corresponding functions in numpy.ma.

If you want to get fancy, you can even try astropy.stats.sigma_clip - then you'll have tried the most commonly used stacking methods available in SWarp.


In [ ]:
#### You get to do this ####
#
# from astropy.stats import sigma_clip
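#
# A minimal sketch of the three stacks, assuming `cube` is the
# (n_images, ny, nx) array from above and `weights` sums to 1:
#
# median_stack = np.median(cube, axis=0)
# mean_stack = np.mean(cube, axis=0)
# weighted_stack = np.sum(weights[:, None, None] * cube, axis=0)
#
# # and with sigma clipping (rejected pixels come back masked):
# # clipped = sigma_clip(cube, sigma=3.0, axis=0)
# # clipped_stack = np.ma.mean(clipped, axis=0)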

In [ ]:
# You'll probably want these to look at the results
from astropy.visualization import ZScaleInterval
zscaler = ZScaleInterval(nsamples=1000, contrast=0.25)

Plot up the stacks you made, plus one of the original images for comparison. I've saved my example output in the out directory.


In [ ]:
#### You get to do this ####
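#
# A sketch, assuming the stacks and `cube` from the cells above;
# zscaler.get_limits() returns (vmin, vmax) display limits:
#
# fig, axes = plt.subplots(2, 2, figsize=(8, 8))
# panels = [('single image', cube[0]), ('median', median_stack),
#           ('mean', mean_stack), ('weighted', weighted_stack)]
# for ax, (name, img) in zip(axes.flat, panels):
#     vmin, vmax = zscaler.get_limits(img)
#     ax.imshow(img, vmin=vmin, vmax=vmax, origin='lower', cmap='gray')
#     ax.set_title(name)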