CTAPIPE Tutorial

ctapipe is pre-release software and is subject to major changes over the next few months (restructuring, etc.). It is under active development, but is still in a very preliminary prototyping stage.

Setup:

Python 3.4 is required! If you don't have it, type:

conda create -n py3 python=3.4 anaconda

Then wait, and when it's done, type:

source activate py3

Next, install pyhessio via conda (for low-level data access):

conda install conda-build # if you don't have it already

git clone https://github.com/cta-observatory/pyhessio
conda build pyhessio
conda install --use-local pyhessio

Next, fetch and install ctapipe in development mode:

% git clone https://github.com/cta-observatory/ctapipe
% cd ctapipe
% make init         # this will fetch the example data
% make develop

Next, try running the test suite and a tool:

% make test

Run a demo tool:

% ctapipe-camdemo

If you don't see the animated window, or the tests fail, let me know.

Notes

tools

  • ctapipe is a framework for making tools (which are by default command-line tools)
  • there is a preliminary command-line option handling system (based on argparse) that can write all parameters used to a FITS header, for future re-use
  • tools are just scripts in the ctapipe/tools/ directory, and their main function needs to be declared in setup.py (see the sketch after this list)
  • they get automatically installed in your path
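
A minimal sketch of how a tool could be wired up, using setuptools console_scripts entry points (the tool name ctapipe-mytool and the module ctapipe/tools/mytool.py are hypothetical, and the exact mechanism in ctapipe's setup.py may differ):

# ctapipe/tools/mytool.py  (hypothetical example)
def main():
    # command-line option handling and the actual work go here
    print("hello from my tool")

# in setup.py, an entry point like this puts 'ctapipe-mytool' on your path:
# entry_points={'console_scripts': ['ctapipe-mytool = ctapipe.tools.mytool:main']}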

try:

% ctapipe-info

In [ ]:
%%sh
ctapipe-info

In [ ]:
%%sh
ctapipe-info --version
ctapipe-info --dependencies
ctapipe-info --tools

So far there are not a lot of tools... the API is more interesting to start with.

NOTE, however, that this is pre-alpha, very rough software, so not much is implemented yet. It's in the prototyping phase, and things will change rapidly.

See what the package provides so far...

Go to the documentation at:

https://cta-observatory.github.io/ctapipe/

Let's explore some fake data...


In [ ]:
%matplotlib inline
import warnings
warnings.filterwarnings("ignore")  # get rid of some annoyances in latest matplotlib

import matplotlib.pyplot as plt
import numpy as np

from ctapipe.io import CameraGeometry
from ctapipe.visualization import CameraDisplay

First, let's load up a blank image. Remember that images in Cherenkov cameras are generalized: they are represented by two things, a CameraGeometry and data (an array of pixel values).

We'll start with the HESS geometry, since it's the only one included so far that doesn't need to be extracted from a real data file


In [ ]:
geom = CameraGeometry.from_name("HESS",1)

In [ ]:
geom.pix_x

In [ ]:
geom.pix_area

In [ ]:
plt.scatter(geom.pix_x, geom.pix_y)

Nice, but we can do better via the visualization module:


In [ ]:
image = np.random.uniform(size=len(geom.pix_id))
disp = CameraDisplay(geom, image=image)

In [ ]:
# better fake data:
%matplotlib inline
from ctapipe.reco import generate_2d_shower_model, make_mock_shower_image

showermodel = generate_2d_shower_model(centroid=(0.2, 0.05),length=0.05, width=0.1,psi='40d')
print(showermodel)
print(showermodel.pdf)

In [ ]:
im, sig, bg = make_mock_shower_image(geom, showermodel.pdf,intensity=10,nsb_level_pe=100)

In [ ]:
disp = CameraDisplay(geom, image=im)

Interactive data analysis

CameraDisplays are quite interactive, but the interactivity doesn't work in inline notebook mode (so we will continue in windowed mode).

We will write a script to

  • generate a fake calibrated shower image
  • apply image cleaning
  • calculate hillas parameters

In [ ]:
# switch back to window (no more plots in the notebook)
%matplotlib

In [ ]:
disp = CameraDisplay(geom)
disp.image = im

Look for the popup window! It may be behind your browser...

Try using the pan and zoom tools


In [ ]:
disp.enable_pixel_picker()

In [ ]:
# change the image
image[20:50] = 1.0
image[400:450] = 0.5
disp.image = image

In [ ]:
# change the normalization or limits:
disp.norm = 'log'  # or 'lin', or any matplotlib.colors.Normalize instance

In [ ]:
disp.norm = 'lin'
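
The comment above suggests that norm can also be a matplotlib normalization object. A minimal sketch of that, assuming the setter accepts a matplotlib.colors.Normalize instance (an assumption, not verified here):

In [ ]:
from matplotlib.colors import Normalize
# fix the color scale to the current image range
# (assumes disp.norm accepts a Normalize instance, as the comment above suggests)
disp.norm = Normalize(vmin=0, vmax=image.max())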

In [ ]:
disp.add_colorbar()

In [ ]:
# change the colorbar and limits

disp.cmap = 'jet'  # try others (see below)

gnuplot, Set1, gnuplot2_r, Blues, Set2, gist_ncar_r, gist_ncar, BrBG, Paired, BuPu, RdGy, gist_earth_r, Spectral, gist_stern_r, YlGn_r, autumn, Greys, gist_gray_r, winter, flag, Paired_r, ocean, PuBuGn, rainbow_r, hsv_r, hot, YlGnBu_r, Blues_r, spring, Oranges, bone_r, prism_r, BuPu_r, pink, hot_r, coolwarm, gist_earth, OrRd, Purples_r, gist_yarg_r, YlOrRd_r, Accent, gist_rainbow, PRGn_r, cool_r, PRGn, flag_r, PuOr, winter_r, summer_r, brg, gray_r, afmhot_r, jet_r, cubehelix, PuRd, YlOrBr, Spectral_r, gist_stern, BrBG_r, PuRd_r, YlGn, RdYlBu_r, RdYlGn_r, autumn_r, YlGnBu, afmhot, RdPu, binary, bone, RdYlGn, nipy_spectral_r, spring_r, Set3, coolwarm_r, terrain_r, spectral, gist_yarg, RdYlBu, RdGy_r, PiYG_r, gist_heat, Pastel1, PuOr_r, PuBu, jet, gist_rainbow_r, rainbow, pink_r, Purples, nipy_spectral, Accent_r, Wistia_r, Pastel2_r, copper, bwr, seismic, Greens_r, summer, cool, YlOrRd, CMRmap_r, Dark2_r, Wistia, seismic_r, RdBu, gist_heat_r, Pastel1_r, binary_r, GnBu_r, PiYG, spectral_r, GnBu, RdPu_r, YlOrBr_r, PuBu_r, PuBuGn_r, Dark2, prism, BuGn, RdBu_r, gist_gray, ocean_r, BuGn_r, gnuplot2, Set3_r, copper_r, OrRd_r, Greens, Set2_r, gnuplot_r, Oranges_r, brg_r, cubehelix_r, Reds_r, Set1_r, gray, Reds, bwr_r, hsv, terrain, Pastel2, Greys_r, CMRmap

Exercise: Calculate Hillas Parameters (2D moments of the signal distribution):

  • step 1: Clean the image (select only the pixels that contain signal, since the noise pixels will throw off the calculation)
    • note that tailcuts_clean defines its thresholds as a function of the pedestal variance, which here we do not know (you can use a fixed value of 10 for this exercise)
  • step 2: pass the signal pixels to the hillas_parameters routine
  • step 3: overlay the ellipse with disp.overlay_moment_ellipse()

In [ ]:
%matplotlib inline
from ctapipe.reco.cleaning import tailcuts_clean, dilate
from ctapipe.reco import hillas_parameters
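
If you're unsure of the exact call signatures for the exercise, check the docstrings of the functions you just imported:

In [ ]:
help(tailcuts_clean)
help(hillas_parameters)
help(dilate)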

In [ ]:
CameraDisplay(geom, image=im)

Clean the image


In [ ]:
# cleanmask = [code here]

# im_cleaned = im.copy()
# im_cleaned =  [code here]

# Display results!

# plt.figure()
# CameraDisplay(geom, image=cleanmask, title='mask')
# plt.figure()
# CameraDisplay(geom, image=im_cleaned, title='cleaned')

Now parameterize the image using hillas_parameters


In [ ]:
# params = 

#disp = CameraDisplay(geom, image=im_cleaned)
#disp.overlay_moments(params)

Next, interactively see the effect of wider cleanings! Use the dilate(geom, cleanmask) function to "widen" the cleaning (each call widens it by one row of neighboring pixels), then redo your Hillas parameterization (you can insert the dilation into your script above, before you apply the clean mask to your image).
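
A minimal sketch of that dilation step (assuming cleanmask is the boolean mask from your cleaning code above, and assuming dilate returns the widened mask rather than modifying it in place):

In [ ]:
# widen the clean mask by two rows of neighboring pixels
# (dilate(geom, cleanmask) as described above; that it returns a new mask is an assumption)
cleanmask = dilate(geom, cleanmask)
cleanmask = dilate(geom, cleanmask)
# now re-apply the widened mask to your image and redo the Hillas parameterization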

Next is to look at some real data! For that there is a separate notebook:

In the package you downloaded, look in: ctapipe/examples/notebooks/raw_data_exploration.ipynb

(you can also view it non-interactively directly on github via:)


In [ ]: