Using 3ML's catalog and data downloading tools, it is easy to build an analysis for a single GRB or for many GRBs from start to finish.
Here, we demonstrate how to search the catalogs for trigger information and then use it to perform an analysis. Currently, Swift analysis is performed only with the generic OGIPLike or XRTLike plugin, and no data downloaders are provided; in the future, a more capable Swift plugin may be included.
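For reference, here is a minimal sketch of what such a Swift analysis might look like with the generic OGIPLike plugin, assuming you have already obtained XRT spectral products by other means (all file names below are placeholders):
In [ ]:
from threeML import OGIPLike

# arguments: plugin name, source spectrum, background spectrum, response, ancillary response
# (check the OGIPLike docstring for the exact keyword names in your 3ML version)
xrt = OGIPLike('XRT', 'xrt_src.pha', 'xrt_bkg.pha', 'xrt.rmf', 'xrt.arf')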
In [1]:
%matplotlib inline
%matplotlib notebook
from astropy.time import Time
from threeML import *
get_available_plugins()
In [2]:
gbm_catalog = FermiGBMBurstCatalog()
We can search the catalog for GRBs within some radius of a position on the sky:
In [3]:
gbm_catalog.cone_search(ra=0, dec=0, radius=10)
Out[3]:
Or perhaps we want to know the GRBs around the Crab nebula:
In [5]:
gbm_catalog.search_around_source('Crab',radius=5)
Out[5]:
In [4]:
gbm_catalog.query("2<t90<10")
Out[4]:
We can also search on MJD or UTC. The GBM catalog uses MJD to index time, so we can use astropy's time module to convert from UTC to MJD:
In [6]:
mjd = Time(['2010-01-01T00:00:00','2011-01-01T00:00:00']).mjd
gbm_catalog.query(' %f <trigger_time < %f' % tuple(mjd))
Out[6]:
In [45]:
gbm_catalog.query("pflx_band_alpha < -2./3. & pflx_band_ergflux > 1E-5")
Out[45]:
In [5]:
gbm_catalog.query_sources("GRB080916009",'GRB100204024')
Out[5]:
The result of the most recent search is stored in the catalog's result property as a pandas DataFrame, so we can pull out whichever columns we need:
In [6]:
gbm_catalog.result[['t90','t50']]
Out[6]:
To automate or repeat an analysis, it is good to know the values of source/background selections that were used in the catalog. These are extracted into a dictionary.
In [7]:
det_info = gbm_catalog.get_detector_information()
Let's examine what we have for GRB080916C.
We'll find the background selections and the source selections (both peak and fluence) stored in a format readable by 3ML.
Additionally, we have the detectors used for the catalog spectral analysis.
In [8]:
det_info['GRB080916009']
Out[8]:
In [9]:
gbm_data = download_GBM_trigger_data(trigger_name=det_info['GRB080916009']['trigger'],
                                     detectors=det_info['GRB080916009']['detectors'],
                                     destination_directory='gbm/bn080916009',
                                     compress_tte=True)
We now have the data downloaded locally from the HEASARC servers. The downloader fetches the most current version of the files and prefers RSP2 to RSP; you can of course use other methods to obtain earlier versions or different DRM types. If the data already exist in the destination directory, the downloader will skip them.
We can see where the files are stored:
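In [ ]:
# paths to the downloaded TTE and response files, keyed by detector name
gbm_data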
Now we have everything we need to make an analysis. We just need to create our plugins and set our models.
3ML's GBM catalog will build a model for you from the HEASARC values. We simply choose the model and interval we want, i.e., Band, Comptonized (Comp), power law (PL), or SBPL, and either the peak or the fluence interval.
Here we will use the best fit model (SBPL) for the fluence interval.
In [10]:
interval = 'fluence'
models = gbm_catalog.get_model(model=det_info['GRB080916009']['best fit model'][interval],
                               interval=interval)
In [11]:
models['GRB080916009']
Out[11]:
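We chose the fluence interval above, but the same call can build models from the catalog's peak-flux fits instead; a sketch, assuming the peak interval is exposed under the key 'peak' just as the fluence interval is under 'fluence':
In [ ]:
# hypothetical alternative: use the best-fit model from the peak-flux interval
peak_models = gbm_catalog.get_model(model=det_info['GRB080916009']['best fit model']['peak'],
                                    interval='peak')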
In [12]:
# build a plugin for each of the catalog's detectors, using the catalog's
# source (fluence) and background selections
det = 'n3'
nai3 = FermiGBMTTELike(name=det,
                       source_intervals=det_info['GRB080916009']['source']['fluence'],
                       background_selections=det_info['GRB080916009']['background']['full'],
                       tte_file=gbm_data[det]['tte'],
                       rsp_file=gbm_data[det]['rsp'])

det = 'n4'
nai4 = FermiGBMTTELike(name=det,
                       source_intervals=det_info['GRB080916009']['source']['fluence'],
                       background_selections=det_info['GRB080916009']['background']['full'],
                       tte_file=gbm_data[det]['tte'],
                       rsp_file=gbm_data[det]['rsp'])

det = 'b0'
bgo0 = FermiGBMTTELike(name=det,
                       source_intervals=det_info['GRB080916009']['source']['fluence'],
                       background_selections=det_info['GRB080916009']['background']['full'],
                       tte_file=gbm_data[det]['tte'],
                       rsp_file=gbm_data[det]['rsp'])

# restrict the fits to 8-900 keV for the NaI detectors and 250 keV - 42 MeV for the BGO
nai3.set_active_measurements('8-900')
nai4.set_active_measurements('8-900')
bgo0.set_active_measurements('250-42000')
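Since the three plugin constructions are identical apart from the detector name, the same setup can also be written as a loop over the catalog's detector list; a sketch reusing the det_info and gbm_data objects from above, with the same NaI/BGO energy selections:
In [ ]:
grb = det_info['GRB080916009']
plugins = []
for det in grb['detectors']:
    p = FermiGBMTTELike(name=det,
                        source_intervals=grb['source']['fluence'],
                        background_selections=grb['background']['full'],
                        tte_file=gbm_data[det]['tte'],
                        rsp_file=gbm_data[det]['rsp'])
    # BGO detectors are named 'b0'/'b1'; everything else here is a NaI detector
    p.set_active_measurements('250-42000' if det.startswith('b') else '8-900')
    plugins.append(p)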
In [13]:
threeML_config['gbm']['selection color'] = 'r'
nai3.view_lightcurve(stop=100)
In [14]:
data_list = DataList(nai3,nai4,bgo0)
jl = JointLikelihood(models['GRB080916009'], data_list=data_list)
# Now we can fit
res = jl.fit()
In [15]:
_ = display_ogip_model_counts(jl,min_rate=5,step=False)
In [26]:
lle_catalog = FermiLLEBurstCatalog()
We can perform cone, time, and trigger searches just as with the GBM catalog. Currently, no T90 searches are available.
In [21]:
lle_catalog.cone_search(0,0,30)
Out[21]:
In [27]:
lle_catalog.query_sources('GRB080916009')
Out[27]:
In [28]:
lle_catalog.result.loc['GRB080916009']['trigger_name']
Out[28]:
In [29]:
lle_data_info = download_LLE_trigger_data(lle_catalog.result.loc['GRB080916009']['trigger_name'],destination_directory='lat')
In [30]:
lle_data_info
Out[30]:
We construct the LLE plugin in the same way, reusing the source and background selections from the GBM catalog:
In [31]:
lle = FermiLATLLELike('LLE',
                      lle_file=lle_data_info['lle'],
                      rsp_file=lle_data_info['rsp'],
                      ft2_file=lle_data_info['ft2'],
                      background_selections=det_info['GRB080916009']['background']['full'],
                      source_intervals=det_info['GRB080916009']['source']['fluence'])
In [32]:
lle.view_lightcurve(stop=100)
lle.set_active_measurements('50000-100000')
In [28]:
data_list = DataList(nai3,nai4,bgo0,lle)
jl = JointLikelihood(models['GRB080916009'], data_list=data_list)
# Now we can fit
res = jl.fit()
In [29]:
_ = display_ogip_model_counts(jl,min_rate=5,step=False)
In [16]:
swift_catalog = SwiftGRBCatalog()
In [34]:
swift_catalog.cone_search(0,0,10)
Out[34]:
In [32]:
swift_catalog.query('"2005-09-22T15:02:00.257060" < trigger_time < "2005-10-22T15:02:00.257060"')
Out[32]:
In [17]:
swift_catalog.query('redshift > 7')
Out[17]:
Great! Now let's get those redshifts and learn more:
In [18]:
swift_catalog.get_redshift()
Out[18]:
When we display that column, we can simply click on the links to pull up the references:
In [19]:
swift_catalog.get_redshift()['reference']
Out[19]:
In [36]:
swift_catalog.other_observing_instruments
Out[36]:
Let's look at all those that are also seen by Integral and/or Chandra:
In [20]:
swift_catalog.query_other_observing_instruments('Integral','Chandra')
Out[20]:
We can then obtain the information from all the other observing instruments for this (or any other) search:
In [21]:
other_instruments = swift_catalog.get_other_instrument_information()
other_instruments
Out[21]:
In particular, for Fermi-GBM, we can view the trigger number; in the future, other instruments will have this feature.
It is then very simple to use the GBM catalog and downloader to obtain the proper information and data to perform an analysis with 3ML, as sketched after the next cell.
In [22]:
other_instruments['Fermi-GBM']
Out[22]:
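For example, once the GBM trigger identifier has been read off the table above, it can be fed straight back into the GBM downloader; a sketch with a placeholder trigger value:
In [ ]:
# hypothetical: trigger identifier copied from the Fermi-GBM cross-match table above
gbm_trigger = 'bn080916009'
cross_match_data = download_GBM_trigger_data(trigger_name=gbm_trigger,
                                             detectors=['n3', 'n4', 'b0'],
                                             destination_directory='gbm')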
In [41]:
swift_catalog.get_other_observation_information()
Out[41]:
Finally, we can remove the downloaded data files:
In [23]:
cleanup_downloaded_GBM_data(gbm_data)
cleanup_downloaded_LLE_data(lle_data_info)
In [ ]: