XPCS & XSVS Pipeline for a Single (GI-)SAXS Run

"This notebook corresponds to version {{ version }} of the pipeline tool: https://github.com/NSLS-II/pipelines"

This notebook begins with a raw time series of images and ends with $g_2(t)$ for a range of $q$, fit to an exponential or stretched exponential, and a two-time correlation function.
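The fitting model used later (`fit_g2_func = 'stretched'`) has the form $g_2(t) = b + \beta e^{-2(\Gamma t)^{\alpha}}$. A minimal sketch with illustrative names (the pipeline's own fit functions come from chxanalys/scikit-beam):

```python
import numpy as np

def g2_stretched(t, baseline=1.0, beta=0.2, relaxation_rate=1.0, alpha=1.0):
    """Stretched-exponential model for the one-time correlation:
    g2(t) = baseline + beta * exp(-2 * (relaxation_rate * t)**alpha).
    alpha = 1 recovers a simple exponential; alpha < 1 suggests a
    distribution of relaxation times."""
    t = np.asarray(t, dtype=float)
    return baseline + beta * np.exp(-2.0 * (relaxation_rate * t) ** alpha)

# At t = 0 the model starts at baseline + beta (the speckle contrast)
# and decays to the baseline at long times.
tau = np.logspace(-3, 2, 50)
g2 = g2_stretched(tau, baseline=1.0, beta=0.2, relaxation_rate=0.5, alpha=0.8)
```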

Overview

  • Setup: load packages/setup path
  • Load Metadata & Image Data
  • Apply Mask
  • Clean Data: shutter open/bad frames
  • Get Q-Map
  • Get 1D curve
  • Define Q-ROI (qr, qz)
  • Check beam damage
  • One-time Correlation
  • Fitting
  • Two-time Correlation

The important scientific code is imported from the chxanalys and scikit-beam projects. Refer to chxanalys and scikit-beam for additional documentation and citation information.
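For reference, the one-time correlation computed below is, per q-bin, the normalized intensity autocorrelation $g_2(q,\tau) = \langle I(t)I(t+\tau)\rangle_t / \langle I(t)\rangle_t^2$. A brute-force sketch of that definition (the pipeline itself uses scikit-beam's multi-tau algorithm averaged over pixel ROIs):

```python
import numpy as np

def g2_brute_force(intensity, max_lag):
    """Brute-force one-time correlation of a 1D intensity trace:
    g2(tau) = <I(t) I(t+tau)>_t / <I(t)>_t^2.
    The real pipeline uses logarithmically spaced lags (multi-tau)
    and averages over all pixels in a q-ROI."""
    I = np.asarray(intensity, dtype=float)
    mean_sq = I.mean() ** 2
    lags = np.arange(1, max_lag + 1)
    g2 = np.array([np.mean(I[:-lag] * I[lag:]) / mean_sq for lag in lags])
    return lags, g2

# A constant trace has no dynamics, so g2 == 1 at every lag.
lags, g2 = g2_brute_force(np.ones(100) * 5.0, max_lag=10)
```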

CHX Olog Notebook

CHX Olog (https://logbook.nsls2.bnl.gov/11-ID/)

Setup

Import packages for I/O, visualization, and analysis.


In [1]:
from chxanalys.chx_packages import *
%matplotlib notebook
plt.rcParams.update({'figure.max_open_warning': 0})
plt.rcParams.update({ 'image.origin': 'lower'   })
plt.rcParams.update({ 'image.interpolation': 'none'   })
import pickle as cpk
from chxanalys.chx_xpcs_xsvs_jupyter_V1 import *


/opt/conda_envs/analysis/lib/python3.5/site-packages/filestore/retrieve.py:15: UserWarning: Do not import filestore.retrieve, import filestore.api instead
  warnings.warn("Do not import filestore.retrieve, "
/opt/conda_envs/analysis/lib/python3.5/site-packages/IPython/html.py:14: ShimWarning: The `IPython.html` package has been deprecated since IPython 4.0. You should import from `notebook` instead. `IPython.html.widgets` has moved to `ipywidgets`.
  "`IPython.html.widgets` has moved to `ipywidgets`.", ShimWarning)

In [2]:
Javascript( '''
        var nb = IPython.notebook;
        var kernel = IPython.notebook.kernel;
        var command = "NFP = '" + nb.base_url + nb.notebook_path + "'";        
        kernel.execute(command);
        ''' )


Out[2]:

In [3]:
#print( 'The current running pipeline is: %s' %NFP)

In [4]:
#%reset -f -s dhist in out array

Control Runs Here


In [5]:
#scat_geometry = 'saxs'  #supported: 'saxs', 'gi_saxs', 'ang_saxs' (for anisotropic saxs or flow-xpcs)
#scat_geometry = 'saxs'
qphi_analysis = True
scat_geometry = 'saxs'  #supported: 'saxs', 'gi_saxs', 'ang_saxs' (for anisotropic saxs or flow-xpcs)
#scat_geometry = 'gi_waxs'  #supported: 'saxs', 'gi_saxs', 'ang_saxs' (for anisotropic saxs or flow-xpcs)
                           # gi_waxs defines a simple box-shaped ROI 
force_compress = False #True   #force to compress data 
bin_frame = False   #generally make bin_frame as False
para_compress = True    #parallel compress
run_fit_form = False    #run fit form factor 
run_waterfall =  False   #run waterfall analysis
run_t_ROI_Inten = True  #run  ROI intensity as a function of time
run_invariant_analysis = False
run_one_time =  True  #run  one-time
#run_fit_g2 = True       #run  fit one-time, the default function is "stretched exponential"
fit_g2_func = 'stretched'
run_two_time =  True #True    #run  two-time
run_four_time =  True #False   #run  four-time
run_xsvs=  False #False         #run visibility analysis
att_pdf_report = True    #attach the pdf report to CHX olog
qth_interest = 1 #the single qth of interest             
use_sqnorm = True    #if True, use sq to normalize intensity
use_imgsum_norm= True  #if True use imgsum to normalize intensity for one-time calculation
pdf_version='_%s'%get_today_date()     #for pdf report name
run_dose = True #True  #run dose_depend analysis

if scat_geometry == 'gi_saxs':run_xsvs= False;use_sqnorm=False
if scat_geometry == 'gi_waxs':use_sqnorm = False;
if scat_geometry != 'saxs':qphi_analysis = False;scat_geometry_ = scat_geometry  
else:scat_geometry_ = ['','ang_'][qphi_analysis]+ scat_geometry

In [6]:
taus=None;g2=None;tausb=None;g2b=None;g12b=None;taus4=None;g4=None;times_xsv=None;contrast_factorL=None; lag_steps = None

Make a directory for saving results


In [7]:
CYCLE= '2017_3'  #change cycle here
path = '/XF11ID/analysis/%s/masks/'%CYCLE
username =  getpass.getuser()


#username = 'hkoerner'    
data_dir0  = create_user_folder(CYCLE, username)
print( data_dir0 )


Results from this analysis will be stashed in the directory /XF11ID/analysis/2017_3/lwiegart/Results/
/XF11ID/analysis/2017_3/lwiegart/Results/

Load ROI defined by "XPCS_Setup" Pipeline


In [8]:
#uid = 'd70175' #(scan num: 6773) (Measurement: P2VP 101 Si1: XPCS T=0.036, 1s exp, 1200 frames, feedback ON )
#fp = path + 'uid='+uid+'_roi_mask.pkl'

if qphi_analysis:
    fp = path + 'roi_mask_Octo13_Coralpor_wide_phi.pkl'
else: 
    fp = path + 'roi_mask_Octo13_Coralpor.pkl'
roi_mask,qval_dict = cpk.load( open(fp, 'rb' )  )  #load the saved roi data
print('Load roi mask from file: %s.'%fp)
if scat_geometry =='gi_saxs':    
    fp = path + 'uid='+uid+'_roi_masks.pkl'
    roi_masks,qval_dicts = cpk.load( open(fp, 'rb' )  )  #load the saved roi data
    print('Load roi mask for static gisaxs analysis from file: %s.'%fp)
    fp = path + 'uid='+uid+'_qmap.pkl'
    print('Load gi-saxs q-mask from file: %s.'%fp)
    qr_map, qz_map, ticks, Qrs, Qzs,  Qr, Qz, inc_x0,refl_x0, refl_y0 = cpk.load( open(fp, 'rb' )  )


Load roi mask from file: /XF11ID/analysis/2017_3/masks/roi_mask_Octo13_Coralpor_wide_phi.pkl.

In [9]:
%run /home/yuzhang/chxanalys_link/chxanalys/chx_generic_functions.py

Load Metadata & Image Data

Change this line to give a uid


In [10]:
uid = '89ac4a'  # (scan num: 7822) (Measurement: 750Hz 2k CoralPor )

uid = 'd94a8d' #(scan num: 7833 (Measurement: test 750Hz 5k mbs_h=.05 CoralPor
uid = 'f4779f' # (scan num: 7834) (Measurement: test 750Hz 5k mbs_h=.05 Sumitomo_1 )
# corr function for rubber in flat cell
uid = '14d805' #(scan num: 7835 (Measurement: test 750Hz 200fr mbs_h=.05 part ISumitomo_1        
uid = 'c9830d' #(scan num: 7837 (Measurement: test expt=.035 T=.036 200fr mbs_h=.05 part IISumitomo_1        "
uid = '561670' #(scan num: 7838 (Measurement: test expt=.2 T=.0068 200fr mbs_h=.05 part IISumitomo_1
uid = '4234b1' #(scan num: 7839 (Measurement: test expt=5 T=.00025 200fr mbs_h=.05 part IVSumitomo_1
uid = 'e2612d' #(scan num: 7840 (Measurement: test expt=30 T=.0000467 120fr mbs_h=.05 part VSumitomo_1
# CoralPor static reference
uid = '367eca' #(scan num: 7841 (Measurement: test expt=.00134 T=1 5000 mbs_h=.05 part ICoralPor .
uid = '4596bf' #(scan num: 7842 (Measurement: expt=1 T=.00138 3600 mbs_h=.05 part IICoralPor -> feedback on after 14min...
uid = 'd5fb2a' #(scan num: 7861 (Measurement: expt=1 imnum=3600 CoralPol -> T=.00138
# CoralPor periodic: 200um step @1000um/s 2s wait
#uid = '19a10f' #(scan num: 7846 (Measurement: expt=.01 T=1 10k mbs_h=.05 periodic 200um/2s wait/1000um/s CoralPor
uid = '215cab' # (scan num: 7847) (Measurement: expt=.01 T=1 10k mbs_05 no moving CoralPor )

# Stan's samples:
#uid = '367270' #(scan num: 7850 (Measurement: expt=.1 imnum=600anPES1
#uid = '0a69a7' #(scan num: 7851 (Measurement: expt=.1 imnum=600 anPES1s
#uid = '9c9fe8' #(scan num: 7853 (Measurement: expt=.1 imnum=600 AB-AK-10704-174
#uid = '4e180d' #(scan num: 7855 (Measurement: expt=.1 imnum=600 Dynacoll_7360
#uid = 'e3df5d' #(scan num: 7857 (Measurement: expt=.1 imnum=600 A140-3
#uid = 'cf1f15' #(scan num: 7859 (Measurement: expt=.1 imnum=600 B140-4

# elastomere in tensile stage, pre-stretch
#uid = 'ea24ed' #(scan num: 7864 (Measurement: expt=.2 imnum=200 T=.0068 pre-stretchSumitomo I   
#uid = '27f842' #(scan num: 7865 (Measurement: expt=5 imnum=200 T=.00025 pre-stretchSumitomo I
# elastomere in tensile stage, periodic stretch: 100um @1mm/s 2s wait
#uid = 'a9e074' #(scan num: 7866 (Measurement: expt=.2 imnum=200 T=.0068 100um @1mm/s 2s wait Sumitomo I
# elastomere after periodic stretch, 0N
#uid = '571234' #(scan num: 7868 (Measurement: expt=5 imnum=200 T=.00025 post-stretch @0N Sumitomo I
# elastomere in tensile stage, periodic stretch: 10um @.1mm/s 2s wait
#uid = '146b1c' #(scan num: 7869 (Measurement: expt=.2 imnum=400 T=.0068 10um @0.1mm/s 2s wait Sumitomo I
# elastomere in tensile stage, periodic stretch: 50um @.1mm/s 2s wait
#uid = '628655' #(scan num: 7870 (Measurement: expt=.2 imnum=400 T=.0068 50um @0.1mm/s 2s wait Sumitomo I   
# elastomere in tensile stage, periodic stretch: 50um @1.0mm/s 2s wait
#uid = 'bb7fda' #(scan num: 7871 (Measurement: expt=.2 imnum=400 T=.0068 50um @1.0mm/s 2s wait Sumitomo I
# elastomere in tensile stage, periodic stretch: 100um @1.0mm/s 2s wait
#uid = '793b6f' #(scan num: 7872 (Measurement: expt=.2 imnum=400 T=.0068 100um @1.0mm/s 2s wait Sumitomo I

# Loctite
#uid = '0f637f' #(scan num: 7874 (Measurement: expt=.5 imnum=200 Loc_epox_5min
#uid = 'ee1e99' #(scan num: 7876 (Measurement: expt=.5 imnum=200 Loc_epox_5min  # bubbles!!
#uid = '6f8e42' #(scan num: 7880 (Measurement: expt=.5 imnum=200 Loc_epox_5min  
# Loctite cement
#uid = '9f07d5' #(scan num: 7881 (Measurement: expt=.00134 imnum=2000 Loc_concrete  
#uid = '16494f' #(scan num: 7882 (Measurement: expt=.1 imnum=200 Loc_concrete 
#uid = '391446' #(scan num: 7883 (Measurement: expt=1 imnum=200 Loc_concrete
#uid = '9d5333' #(scan num: 7884 (Measurement: expt=1 imnum=200 Loc_concrete
#uid = '290284' #(scan num: 7885 (Measurement: expt=1 imnum=200 Loc_concrete
#uid = 'f21397' #(scan num: 7886 (Measurement: expt=1 imnum=200 Loc_concrete
# stretching ANPSE1s
#pre-stretch @~0N
#uid = 'cb75dc' #(scan num: 7889 (Measurement: expt=.1 imnum=600 ANPES1s_insitu 
#uid = '541bc5' #(scan num: 7890 (Measurement: expt=.1 imnum=600 ANPES1s_insitu
#50% @1mm/s
#uid = '77a531'  # (scan num: 7891) (Measurement: expt=.05 imnum=1200 50% stretch @1mm/s 
#uid = '0372db' #(scan num: 7892 (Measurement: expt=.1 imnum=600 50% stretch @1mm/s ANPES1s_insitu
#uid = '5c83ab' #(scan num: 7893 (Measurement: expt=.1 imnum=600 50% stretch @1mm/s ANPES1s_insitu
# stretched to 100% at 1mm/s
#uid = '2b79c4' #(scan num: 7894 (Measurement: expt=.05 imnum=1200 100% stretch @1mm/s ANPES1s_insitu
#uid = '1e8d2f' #(scan num: 7895 (Measurement: expt=.1 imnum=600 post 100% stretch @1mm/s ANPES1s_insitu
# stretched to 500% at 1mm/s
#uid = 'f02ced' #(scan num: 7896 (Measurement: expt=.1 imnum=600 500% stretch @1mm/s ANPES1s_insitu
#uid = '08f67a' #(scan num: 7898 (Measurement: expt=.1 imnum=600 post 500% stretch @1mm/s ANPES1s_insitu        "
# stretched to 1000% at 1mm/s
#uid = '6992f3' #(scan num: 7899 (Measurement: expt=.1 imnum=600 1000% stretch @1mm/s ANPES1s_insitu   
#uid = 'eaa51d' #(scan num: 7901 (Measurement: expt=.1 imnum=600 post 1000% stretch @1mm/s ANPES1s_insitu
# stretched to 1500% at 1mm/s (and broke)
#uid = '5688972f-6d95-497c-9962-36a355edcb8f' #(scan num: 7902 (Measurement: expt=5 imnum=10 post 1000% stretch @1mm/s WAXS ANPES1s_insitu

# stretching filled elastomere:
#uid = 'cbdc17' #(scan num: 7904 (Measurement: .2s 200fr 50% stretch ANPES1s_insitu -> sample name wrong!!
#uid = '3740f0' #(scan num: 7905 (Measurement: .2s 200fr post 50% stretch ANPES1s_insitu -> sample name wrong
#uid = '3ae7c0' #(scan num: 7906 (Measurement: .2s 200fr post 50% stretch SumitomoI 
#uid = 'a89075' #(scan num: 7907 (Measurement: .2s 200fr  100% stretch SumitomoI -> sample broke!

# elastomere periodic stretching partII
#uid = '7781ef' #(scan num: 7908 (Measurement: .2s 200fr  pre-stretch SumitomoI 
#uid = '4cbb0c' #(scan num: 7909 (Measurement: .2s 200fr  pre-stretch SumitomoI
#uid = '98cea1' #(scan num: 7910 (Measurement: .2s 200fr  pre-stretch SumitomoI 
#uid = 'c489db' #(scan num: 7911 (Measurement: .2s 200fr  pre-stretch SumitomoI 
#uid = '7838bc' #(scan num: 7912 (Measurement: .2s 200fr  pre-stretch SumitomoI
#uid = 'c19897' #(scan num: 7913 (Measurement: .2s 200fr  100um periodic stretch @1mm/s 2s wait SumitomoI 
#uid = 'e77c5c' #(scan num: 7914 (Measurement: .2s 200fr  100um periodic stretch 2 @1mm/s 2s wait SumitomoI
#uid = 'c02ece' #(scan num: 7915 (Measurement: .2s 200fr  500um periodic stretch @1mm/s 2s wait SumitomoI
#uid = '338760' #(scan num: 7916 (Measurement: .2s 200fr  1000um periodic stretch @1mm/s 2s wait SumitomoI
# stretching:
uid = '62e14e' #(scan num: 7917 (Measurement: .2s 400fr 25% stretch @1mm/s SumitomoI
#uid = 'c06a00' #(scan num: 7918 (Measurement: .2s 400fr post 25% stretch @1mm/s SumitomoI
#uid = '8c527a' #(scan num: 7919 (Measurement: .2s 400fr post 25% stretch @1mm/s SumitomoI     
#uid = '27bdf1' #(scan num: 7920 (Measurement: .2s 400fr 50% stretch @1mm/s SumitomoI   
#uid = 'c2b646' #(scan num: 7921 (Measurement: .2s 400fr post 50% stretch @1mm/s SumitomoI
#uid = 'ffeeeb6e-c110-41dd-a7fe-da512cc35cb5' #(scan num: 7922 (Measurement: .2s 400fr post 50% stretch @1mm/s SumitomoI 
#uid = '0641d5' #(scan num: 7923 (Measurement: .2s 400fr 75% stretch @1mm/s SumitomoI   
#uid = '3af89f' #(scan num: 7924 (Measurement: .2s 400fr post 75% stretch @1mm/s SumitomoI
#uid = '765fc0' #(scan num: 7925 (Measurement: .2s 400fr post 75% stretch @1mm/s SumitomoI   
#uid = '6c2a38' #(scan num: 7926 (Measurement: .2s 400fr 100% stretch @1mm/s SumitomoI
#uid = '06c001' #(scan num: 7927 (Measurement: .2s 400fr post 100% stretch @1mm/s SumitomoI
#uid = 'a97646' #(scan num: 7928 (Measurement: .2s 400fr post 100% stretch @1mm/s SumitomoI
#uid = 'd6708c' #(scan num: 7929 (Measurement: .2s 400fr 125% stretch @1mm/s SumitomoI
#uid = '0a8365' #(scan num: 7930 (Measurement: .2s 400fr post 125% stretch @1mm/s SumitomoI  
#uid = '1e462c' #(scan num: 7931 (Measurement: .2s 400fr post 125% stretch @1mm/s SumitomoI
#uid = 'c372ae' #(scan num: 7932 (Measurement: .2s 400fr 150% stretch @1mm/s SumitomoI -> moved off sample?
#uid = 'f066e7' #(scan num: 7934 (Measurement: .2s 400fr 175% stretch @1mm/s SumitomoI 
#uid = '4c2ec8' #(scan num: 7935 (Measurement: .2s 400fr 200% stretch @1mm/s SumitomoI
#uid = '4d59c1' #(scan num: 7936 (Measurement: .2s 400fr 225% stretch @1mm/s SumitomoI 

# Loctite cement continued..
#uid = '72a751' #(scan num: 7937 (Measurement: 1s 200fr Loc_concrete
#uid = '3e1594' #(scan num: 7938) (Measurement: 5s 100fr T=.2 Loc_concrete )
#uid = '6d5527' #(scan num: 7939 (Measurement: 5s 100fr T=.2 Loc_concrete 
#uid = '621337' #(scan num: 7940 (Measurement: 5s 100fr T=.036 Loc_concrete   

# Elastomer periodic stretching
#uid = 'f163ce' #(scan num: 7941 (Measurement: .2s 200fr T=.0068 0N pre-stretch SumitomoI       
#uid = '7d801b' #(scan num: 7942 (Measurement: 5s 200fr T=.00025 0N pre-stretch SumitomoI
#uid = 'ff771e' #(scan num: 7944 (Measurement: .2s 200fr 100um periodic stretch SumitomoI
#uid = '26f7e0' #(scan num: 7945 (Measurement: .2s 200fr 50% stretch SumitomoI       
#uid = '74ae35' #(scan num: 7946 (Measurement: .2s 200fr 100% stretch SumitomoI 
#uid = '271bdf' #(scan num: 7947 (Measurement: .2s 200fr 150% stretch SumitomoI 

# stretching ANPES2:
#uid = '692457' #(scan num: 7948 (Measurement: .2s 200fr pre-stretch stretch ANPES2
#uid = 'dd7d1533' #(scan num: 7949 (Measurement: .2s 200fr pre-stretch stretch ANPES2  
# aborted: sample moved...start over
#uid = 'a001c8' #(scan num: 7956 (Measurement: .2s 200fr  pre-stretch ANPES2  
#uid = '36e3ef' #(scan num: 7957 (Measurement: WAXS 17.5 pre-stretch ANPES2
#uid = '5fbde2' #(scan num: 7958 (Measurement: .2s 200fr  50% stretch ANPES2 
#uid = '740486' #(scan num: 7959 (Measurement: WAXS 17.5 50% stretch ANPES2
#uid = '9c80b3' #(scan num: 7960 (Measurement: .2s 200fr  50% post stretch ANPES2
#uid = '7d7dea' #(scan num: 7961 (Measurement: .2s 200fr  100% stretch ANPES2
#uid = 'b8808a' #(scan num: 7962 (Measurement: WAXS 17.5 50% stretch ANPES2 -> 100% stretch
#uid = '7ba2f0' # (scan num: 7963) (Measurement: .2s 200fr 100% post stretch ANPES2 )
#uid = 'f4fe95' #(scan num: 7964 (Measurement: .2s 200fr  150% stretch ANPES2
#uid = 'fbc249' #(scan num: 7965 (Measurement: WAXS 17.5 150% stretch ANPES2 
#uid = 'a5afb0' #(scan num: 7966 (Measurement: .2s 200fr  150% post stretch ANPES2
#uid = '23394c' #(scan num: 7967 (Measurement: .2s 200fr  200%  stretch ANPES2
#uid = 'd1c9cb' #(scan num: 7968 (Measurement: WAXS 17.5 200% stretch ANPES2
#uid = '26b185' #(scan num: 7969 (Measurement: .2s 200fr  200%  post stretch ANPES2
#uid = 'dfbc99' #(scan num: 7970 (Measurement: .2s 200fr  250%  stretch ANPES2
#uid = 'c81fea' #(scan num: 7971 (Measurement: WAXS 17.5 250% stretch ANPES2  
#uid = '0806dd' #(scan num: 7972 (Measurement: .2s 200fr  250%  post stretch ANPES2 
#uid = '31488e' #(scan num: 7973 (Measurement: .2s 200fr  300%  stretch ANPES2 -> slipped out

# constant force measurements:
#uid = '000a39' #(scan num: 7975 (Measurement: .2s 200fr  stretch to 2.5N no referenceSumitomoI
#uid = '08815c8d' #(scan num: 7976 (Measurement: .2s 200fr  stretch to 2.5N no reference IISumitomoI
#uid = 'c4ce64' #(scan num: 7977 (Measurement: .2s 200fr  static reference (rubber) SumitomoI -> not very static
#uid = '47a21a' #(scan num: 7978 (Measurement: .2s 200fr  static reference (rubber) SumitomoI -> new rubber
#uid = '7c7547' #(scan num: 7979 (Measurement: .2s 200fr  static reference (rubber) SumitomoI -> getting more static...Kapton tape relaxes?
#uid = '56cef0' #(scan num: 7980 (Measurement: .2s 200fr  static reference (rubber) SumitomoI -> static enough...
#uid = '376136' #(scan num: 7981 (Measurement: .2s 200fr  pre-stretch + reference SumitomoI -> relaxation of mounted sample
#uid = '324142' #(scan num: 7982 (Measurement: .2s 200fr  pre-stretch + reference SumitomoI
#uid = '028187' #(scan num: 7983 (Measurement: .2s 200fr  100um at 10um_s + reference SumitomoI
#uid = '8ab741' #(scan num: 7984 (Measurement: .2s 200fr  100um at 1000um_s + reference SumitomoI
#uid = '4e1268' #(scan num: 7985 (Measurement: .2s 200fr  100um at 1000um_s 6s wait periodic + reference SumitomoI
#uid = '269930' #(scan num: 7986 (Measurement: .2s 200fr holding at 2.5N + reference SumitomoI 
#uid = '7f441d' #(scan num: 7987 (Measurement: .2s 200fr holding at 2.5N + reference SumitomoI 
#uid = 'e347c1' #(scan num: 7988 (Measurement: .2s 200fr holding at 2.5N + reference SumitomoI 

# periodic stretching III
#uid = 'f0641f' #(scan num: 7989 (Measurement: .2s 200fr pre-stretch, right after mounting SumitomoI
#uid = '9a1b18' #(scan num: 7990 (Measurement: .2s 200fr pre-stretch, adjusted to ~0N SumitomoI
#uid = '496e49' #(scan num: 7991 (Measurement: .2s 200fr pre-stretch SumitomoI  
#uid = '723519' #(scan num: 7992 (Measurement: .2s 400fr pre-stretch SumitomoI
#uid = 'ce3b2b' #(scan num: 7993 (Measurement: .2s 400fr stretch 250um 1mm_s periodic 2s wait SumitomoI
#uid = '0697a4' #(scan num: 7994 (Measurement: .2s 400fr 2nd stretch 250um 1mm_s periodic 2s wait SumitomoI
#uid = '37580f' #(scan num: 7995 (Measurement: .2s 400fr 3rd stretch 250um 1mm_s periodic 2s wait SumitomoI
#uid = '8a5fc2' #(scan num: 7996 (Measurement: .2s 400fr 4th stretch 250um 1mm_s periodic 2s wait SumitomoI -> fresh spot?
#uid = 'b31c6a' #(scan num: 7997 (Measurement: .2s 400fr 5th stretch 250um 1mm_s periodic 4s wait SumitomoI
#uid = '75094c' #(scan num: 7998 (Measurement: .2s 400fr post-stretch SumitomoI

In [11]:
#get_last_uids( -1 )

In [ ]:


In [12]:
sud = get_sid_filenames(db[uid])
print ('scan_id, full-uid, data path are:  %s--%s--%s'%(sud[0], sud[1], sud[2][0] ))
#start_time, stop_time = '2017-2-24  12:23:00', '2017-2-24  13:42:00' 
#sids, uids, fuids  = find_uids(start_time, stop_time)


scan_id, full-uid, data path are:  7917--62e14e37-128c-4474-9f53-92c25123daac--/XF11ID/data/2017/10/14/844b6e4e-828e-4729-8694_323

In [13]:
data_dir = os.path.join(data_dir0, '%s/'%uid)
os.makedirs(data_dir, exist_ok=True)
print('Results from this analysis will be stashed in the directory %s' % data_dir)
uidstr = 'uid=%s'%uid


Results from this analysis will be stashed in the directory /XF11ID/analysis/2017_3/lwiegart/Results/62e14e/

Don't change the lines below

  • get metadata

In [14]:
md = get_meta_data( uid )
  • get data

In [15]:
imgs = load_data( uid, md['detector'], reverse= True  )
md.update( imgs.md );Nimg = len(imgs);
#if 'number of images'  not in list(md.keys()):
md['number of images']  = Nimg
pixel_mask =  1- np.int_( np.array( imgs.md['pixel_mask'], dtype= bool)  )
print( 'The data are: %s' %imgs )

md['acquire period' ] = md['cam_acquire_period']
md['exposure time'] =  md['cam_acquire_time']


The data are: Sliced and/or processed EigerImages2. Original repr:
    <Frames>
    Length: 400 frames
    Frame Shape: 2167 x 2070
    Pixel Datatype: uint32

In [16]:
print_dict( md,  ['suid', 'number of images', 'uid', 'scan_id', 'start_time', 'stop_time', 'sample', 'Measurement',
                  'acquire period', 'exposure time',  
         'det_distance', 'beam_center_x', 'beam_center_y','transmission' ] )


suid--> 62e14e
number of images--> 400
uid--> 62e14e37-128c-4474-9f53-92c25123daac
scan_id--> 7917
start_time--> 2017-10-14 21:08:19
stop_time--> 2017-10-14 21:09:41
sample--> SumitomoI
Measurement--> .2s 400fr 25% stretch @1mm/s SumitomoI
acquire period--> 0.2
exposure time--> 0.19999000430107117
det_distance--> 10.077709615
beam_center_x--> 1121.0
beam_center_y--> 1236.0
transmission--> 0.0068388531571917325

Overwrite Some Metadata if the Input Is Wrong

Define the incident beam center (and the reflection beam center for gisaxs)


In [17]:
if scat_geometry =='gi_saxs':
    inc_x0 =  md['beam_center_x']
    inc_y0 =  imgs[0].shape[0] - md['beam_center_y'] 
    
    refl_x0 =  md['beam_center_x']
    refl_y0 =  imgs[0].shape[0] -  1666
    print( "inc_x0, inc_y0, ref_x0,ref_y0 are: %s %s %s %s."%(inc_x0, inc_y0, refl_x0, refl_y0) )
else:
    inc_x0 =  imgs[0].shape[0] - md['beam_center_y']   
    inc_y0=   md['beam_center_x']

In [18]:
print(inc_x0, inc_y0)


931.0 1121.0

In [19]:
dpix, lambda_, Ldet,  exposuretime, timeperframe, center = check_lost_metadata(
    md, Nimg, inc_x0 = inc_x0, inc_y0=   inc_y0, pixelsize = 7.5*10**(-5) )

setup_pargs=dict(uid=uidstr, dpix= dpix, Ldet=Ldet, lambda_= lambda_, exposuretime=exposuretime,
        timeperframe=timeperframe, center=center, path= data_dir)

print_dict( setup_pargs )


The metadata: beam_center_x has been changed to 1121.0.
The metadata: beam_center_y has been changed to 931.0.
Ldet--> 10077.709615
exposuretime--> 0.19999
dpix--> 0.0750000035623
path--> /XF11ID/analysis/2017_3/lwiegart/Results/62e14e/
center--> [931, 1121]
lambda_--> 1.28494
uid--> uid=62e14e
timeperframe--> 0.2

In [20]:
setup_pargs


Out[20]:
{'Ldet': 10077.709615,
 'center': [931, 1121],
 'dpix': 0.075000003562308848,
 'exposuretime': 0.19999,
 'lambda_': 1.2849435,
 'path': '/XF11ID/analysis/2017_3/lwiegart/Results/62e14e/',
 'timeperframe': 0.2,
 'uid': 'uid=62e14e'}
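With this geometry each detector pixel maps to a momentum transfer; a minimal sketch of the radius-to-q conversion behind the q-map (dpix and Ldet in mm, lambda_ in Å, so q comes out in Å⁻¹; the default values mirror setup_pargs above and the function name is illustrative):

```python
import numpy as np

def pixel_to_q(radius_pixels, dpix=0.075, Ldet=10077.7, lambda_=1.2849):
    """Convert radial distance from the beam center (in pixels) to q (A^-1).
    two_theta is the full scattering angle; q = (4*pi/lambda) * sin(two_theta/2)."""
    r_mm = np.asarray(radius_pixels, dtype=float) * dpix
    two_theta = np.arctan2(r_mm, Ldet)
    return (4.0 * np.pi / lambda_) * np.sin(two_theta / 2.0)

# e.g. a pixel 500 pixels from the beam center:
q_500 = pixel_to_q(500)
```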

Apply Mask

  • Load and plot the mask if it exists
  • Otherwise, create a mask using the Mask pipeline
  • Reverse the mask in the y-direction due to the coordinate-convention difference between python and the Eiger software
  • Reverse the images in the y-direction
  • Apply the mask
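The y-reversal and mask application amount to a flip plus an element-wise product; a toy numpy sketch (load_mask and apply_mask handle this internally):

```python
import numpy as np

# Toy 3x4 "detector" image and mask (1 = good pixel, 0 = masked).
img = np.arange(12, dtype=float).reshape(3, 4)
mask = np.ones((3, 4), dtype=int)
mask[0, 1] = 0  # a bad pixel flagged in Eiger coordinates

# Reverse both in the y-direction to match python's lower-origin
# convention, then apply the mask multiplicatively.
mask_rev = mask[::-1, :]
img_rev = img[::-1, :]
masked = img_rev * mask_rev
```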

Change the line below to give the mask filename


In [21]:
if scat_geometry == 'gi_saxs':
    mask_path = '/XF11ID/analysis/2017_3/masks/'
    #mask_name =  'Nov16_4M-GiSAXS_mask.npy'
    mask_name =  'Sept22_GISAXS.npy'
    
elif scat_geometry == 'saxs':
    mask_path = '/XF11ID/analysis/2017_3/masks/'
    mask_name = 'Octo13_SAXS.npy'

In [22]:
mask = load_mask(mask_path, mask_name, plot_ =  False, image_name = uidstr + '_mask', reverse= True ) 
mask *= pixel_mask
show_img(mask,image_name = uidstr + '_mask', save=True, path=data_dir, aspect=1, center=center[::-1])
mask_load=mask.copy()
imgsa = apply_mask( imgs, mask )


Check the average intensity of several frames


In [24]:
img_choice_N = 3
img_samp_index = random.sample( range(len(imgs)), img_choice_N) 
avg_img =  get_avg_img( imgsa, img_samp_index, plot_ = False, uid =uidstr)
if avg_img.max() == 0:
    print('There are no photons recorded for this uid: %s'%uid)
    print('The data analysis should be terminated! Please try another uid.')

In [25]:
#%run /home/yuzhang/chxanalys_link/chxanalys/chx_generic_functions.py

In [ ]:


In [26]:
#show_img( imgsa[1000],  vmin=.1, vmax= 1e1, logs=True, aspect=1,
#         image_name= uidstr + '_img_avg',  save=True, path=data_dir,  cmap = cmap_albula )

In [27]:
show_img( imgs[10],  vmin=.0, vmax= 1e1, logs=False, aspect=1, #save_format='tif',
         image_name= uidstr + '_img_avg',  save=True, path=data_dir, cmap=cmap_albula,center=center[::-1] )


Compress Data

  • Generate compressed data with a filename
  • Replace the old mask with a new mask with hot pixels removed
  • Compute the average image
  • Compute the sum of each image
  • Find bad_frame_list where the image sum is above bad_pixel_threshold
  • Check the shutter-open frame to get a good time series
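Compression pays off when most pixels are empty: only the non-zero pixels are stored. A minimal sketch of the occupation test and the sparse-encoding idea (the real .cmp format written by compress_eigerdata is more involved):

```python
import numpy as np

def photon_occupation(frame):
    """Fraction of pixels with at least one photon."""
    frame = np.asarray(frame)
    return np.count_nonzero(frame) / frame.size

def sparse_encode(frame):
    """Store only non-zero pixels as (flat index, count) pairs."""
    flat = np.asarray(frame).ravel()
    idx = np.flatnonzero(flat)
    return idx, flat[idx]

# A mostly empty 100x100 frame with two photon-counting pixels.
frame = np.zeros((100, 100), dtype=np.uint32)
frame[10, 20] = 3
frame[50, 60] = 1
occ = photon_occupation(frame)
idx, counts = sparse_encode(frame)
```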

In [30]:
compress=True
photon_occ = len( np.where(avg_img)[0] ) / ( imgsa[0].size)
#compress =  photon_occ < .4  #if the photon occupation < 0.4, do compress
print ("The non-zero photon occupation is %s."%( photon_occ))
print("Will " + ['NOT ', 'ALWAYS '][compress]  + "apply the compression process.")


The non-zero photon occupation is 0.14040069643689154.
Will ALWAYS apply the compression process.

In [31]:
good_start = 5  #5  #make the good_start at least 0

In [32]:
bin_frame =  False # True  #generally make bin_frame as False
if bin_frame:
    bin_frame_number=  5
    timeperframe = md['acquire period' ] * bin_frame_number
else:
    bin_frame_number =1

In [33]:
import time
t0= time.time()
if bin_frame_number==1:
    filename = '/XF11ID/analysis/Compressed_Data' +'/uid_%s.cmp'%md['uid']
else:
    filename = '/XF11ID/analysis/Compressed_Data' +'/uid_%s_bined--%s.cmp'%(md['uid'],bin_frame_number) 
mask, avg_img, imgsum, bad_frame_list = compress_eigerdata(imgs, mask, md, filename, 
         force_compress= force_compress,  para_compress= para_compress,  bad_pixel_threshold = 1e14,
                        bins=bin_frame_number, num_sub= 100, num_max_para_process= 500, with_pickle=True  )
min_inten = 10    
good_start = max(good_start, np.where( np.array(imgsum) > min_inten )[0][0] )    
print ('The good_start frame number is: %s '%good_start)
FD = Multifile(filename, good_start, len(imgs)//bin_frame_number)
#FD = Multifile(filename, good_start, 100)
uid_ = uidstr + '_fra_%s_%s'%(FD.beg, FD.end)
print( uid_ )
plot1D( y = imgsum[ np.array( [i for i in np.arange(good_start, len(imgsum)) if i not in bad_frame_list])],
       title =uidstr + '_imgsum', xlabel='Frame', ylabel='Total_Intensity', legend='imgsum'   )
Nimg = Nimg/bin_frame_number

run_time(t0)


Using already created compressed file with filename as :/XF11ID/analysis/Compressed_Data/uid_62e14e37-128c-4474-9f53-92c25123daac.cmp.
The good_start frame number is: 5 
uid=62e14e_fra_5_400
Total time: 0.525 sec

In [34]:
show_img( avg_img,   vmin=.0001, vmax= 5e4, logs=True, aspect=1, #save_format='tif',
         image_name= uidstr + '_img_avg',  save=True, path=data_dir,  cmap = cmap_albula, center=center[::-1] )


/opt/conda_envs/analysis/lib/python3.5/site-packages/matplotlib/colorbar.py:821: MaskedArrayFutureWarning: setting an item on a masked array which has a shared mask will not copy the mask and also change the original mask array in the future.
Check the NumPy 1.11 release notes for more information.
  y[self._inside] = norm(yi)

Get the bad frame list by a polynomial fit
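The idea behind get_bad_frame_list: fit a low-order polynomial to the total-intensity trace and flag frames whose residual exceeds scale times the residual standard deviation. A minimal sketch (the actual function also supports a linear fit, both fits together, and more options):

```python
import numpy as np

def find_bad_frames(imgsum, polyfit_order=3, scale=3.5):
    """Flag frames whose total intensity deviates from a polynomial
    trend by more than scale * std(residuals)."""
    y = np.asarray(imgsum, dtype=float)
    x = np.arange(len(y))
    coeffs = np.polyfit(x, y, polyfit_order)
    residuals = y - np.polyval(coeffs, x)
    threshold = scale * residuals.std()
    return np.where(np.abs(residuals) > threshold)[0]

# A smooth linear trend with one intensity spike at frame 50:
imgsum_toy = 1000.0 + 0.1 * np.arange(200)
imgsum_toy[50] += 500.0
bad = find_bad_frames(imgsum_toy)
```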


In [35]:
good_end= None # 2000  
if good_end is not None:
    FD = Multifile(filename, good_start, min( len(imgs)//bin_frame_number, good_end) )
    uid_ = uidstr + '_fra_%s_%s'%(FD.beg, FD.end)
    print( uid_ )

In [36]:
re_define_good_start =False
if re_define_good_start:
    good_start = 10
    good_end = 19700
    FD = Multifile(filename, good_start, good_end) 
    uid_ = uidstr + '_fra_%s_%s'%(FD.beg, FD.end)
    print( FD.beg, FD.end)

In [37]:
bad_frame_list =  get_bad_frame_list( imgsum, fit='both',  plot=True,polyfit_order = 30,                                      
                        scale= 3.5,  good_start = good_start, good_end=good_end, uid= uidstr, path=data_dir)

print( 'The bad frame list length is: %s'%len(bad_frame_list) )


/home/yuzhang/chxanalys_link/chxanalys/chx_generic_functions.py:1109: RankWarning: Polyfit may be poorly conditioned
  pol = np.polyfit(x, y, order)
The bad frame list length is: 5

In [ ]:

Create a new mask by masking the bad pixels and get a new avg_img


In [38]:
imgsum_y = imgsum[ np.array( [i for i in np.arange( len(imgsum)) if i not in bad_frame_list])]
imgsum_x = np.arange( len( imgsum_y))
save_lists(  [imgsum_x, imgsum_y], label=['Frame', 'Total_Intensity'],
           filename=uidstr + '_img_sum_t', path= data_dir  )

Plot the total intensity of each frame versus time


In [39]:
plot1D( y = imgsum_y, title = uidstr + '_img_sum_t', xlabel='Frame', c='b',
       ylabel='Total_Intensity', legend='imgsum', save=True, path=data_dir)


Static Analysis

SAXS Scattering Geometry
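get_circular_average azimuthally averages the image around the beam center; a minimal sketch of the binning idea (the real function also converts radius to q via the setup_pargs geometry and uses many more bins):

```python
import numpy as np

def circular_average(img, center, mask=None, nbins=50):
    """Azimuthally average an image: mean intensity in radial bins
    around center = (row, col). Pixels with mask == 0 are excluded;
    the single pixel at the exact maximum radius falls outside the
    last half-open bin and is dropped."""
    img = np.asarray(img, dtype=float)
    rows, cols = np.indices(img.shape)
    r = np.hypot(rows - center[0], cols - center[1])
    if mask is None:
        mask = np.ones_like(img, dtype=bool)
    r, vals = r[mask.astype(bool)], img[mask.astype(bool)]
    bins = np.linspace(0, r.max(), nbins + 1)
    which = np.digitize(r, bins) - 1
    radial = np.array([vals[which == i].mean() if np.any(which == i) else np.nan
                       for i in range(nbins)])
    bin_centers = 0.5 * (bins[:-1] + bins[1:])
    return bin_centers, radial

# A radially symmetric test image, I = 1 / (1 + r):
rows, cols = np.indices((64, 64))
r = np.hypot(rows - 32, cols - 32)
test_img = 1.0 / (1.0 + r)
q, iq = circular_average(test_img, center=(32, 32), nbins=20)
```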


In [40]:
if scat_geometry =='saxs':
    ## Get circular average| * Do plot and save q~iq
    hmask = create_hot_pixel_mask( avg_img, threshold = 1e2, center=center, center_radius= 100)
    mask = mask * hmask
    qp_saxs, iq_saxs, q_saxs = get_circular_average( avg_img, mask * hmask, pargs=setup_pargs  )
    plot_circular_average( qp_saxs, iq_saxs, q_saxs,  pargs=setup_pargs, 
                      xlim=[q_saxs.min(), q_saxs.max()*1.0], ylim = [iq_saxs.min(), iq_saxs.max()] )
#mask =np.array( mask * hmask, dtype=bool)



In [45]:
#%run ~/chxanalys_link/chxanalys/chx_compress_analysis.py

In [42]:
if scat_geometry =='saxs':    
    if run_fit_form:        
        form_res = fit_form_factor( q_saxs,iq_saxs,  guess_values={'radius': 2500, 'sigma':0.05, 
         'delta_rho':1E-10 },  fit_range=[0.0001, 0.015], fit_variables={'radius': True, 'sigma': True, 
         'delta_rho': True},  res_pargs=setup_pargs, xlim=[0.0001, 0.015])  
        
    qr = np.array( [qval_dict[k][0] for k in sorted( qval_dict.keys())] )
    print(len(qr))
    show_ROI_on_image( avg_img, roi_mask, center, label_on = False, rwidth = 840, alpha=.9,  
                 save=True, path=data_dir, uid=uidstr, vmin= 1e-3,
                 vmax= 1e3, #np.max(avg_img),
                 aspect=1,
                 show_roi_edge=True,
                 show_ang_cor = True) 
    plot_qIq_with_ROI( q_saxs, iq_saxs, np.unique(qr), logs=True, uid=uidstr, xlim=[0.001,0.1],
                  ylim = [iq_saxs.min(), iq_saxs.max()*2],  save=True, path=data_dir)


81
/opt/conda_envs/analysis/lib/python3.5/site-packages/matplotlib/colorbar.py:821: MaskedArrayFutureWarning: setting an item on a masked array which has a shared mask will not copy the mask and also change the original mask array in the future.
Check the NumPy 1.11 release notes for more information.
  y[self._inside] = norm(yi)

Time-Dependent I(q) Analysis
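This analysis picks a few evenly spaced time windows and computes a circular average in each; a sketch of the windowing role played by create_time_slice (illustrative only, not its exact signature or output):

```python
import numpy as np

def time_slices(n_frames, slice_num=3, slice_width=1):
    """Pick slice_num evenly spaced windows of slice_width frames
    across a series of n_frames; returns (start, end) index pairs."""
    starts = np.linspace(0, n_frames - slice_width, slice_num).astype(int)
    return [(s, s + slice_width) for s in starts]

# For the 395 good frames of this run, three one-frame windows:
edges = time_slices(395, slice_num=3, slice_width=1)
```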


In [48]:
if scat_geometry =='saxs':
    Nimg = FD.end - FD.beg 
    time_edge = create_time_slice( Nimg, slice_num= 3, slice_width= 1, edges = None )
    time_edge =  np.array( time_edge ) + good_start
    #print( time_edge )    
    qpt, iqst, qt = get_t_iqc( FD, time_edge, mask, pargs=setup_pargs, nx=1500, show_progress= False )
    plot_t_iqc( qt, iqst, time_edge, pargs=setup_pargs, xlim=[qt.min(), qt.max()],
           ylim = [iqst.min(), iqst.max()], save=True )


The file: uid=62e14e_q_Iqt is saved in /XF11ID/analysis/2017_3/lwiegart/Results/62e14e/

In [50]:
if run_invariant_analysis:
    if scat_geometry =='saxs':
        invariant = get_iq_invariant( qt, iqst )
        time_stamp = time_edge[:,0] * timeperframe

    if scat_geometry =='saxs':
        plot_q2_iq( qt, iqst, time_stamp,pargs=setup_pargs,ylim=[ -0.001, 0.01] , 
                   xlim=[0.007,0.2],legend_size= 6  )

    if scat_geometry =='saxs':
        plot_time_iq_invariant( time_stamp, invariant, pargs=setup_pargs,  )

    if False:
        iq_int = np.zeros( len(iqst) )
        fig, ax = plt.subplots()
        q = qt
        for i in range(iqst.shape[0]):
            yi = iqst[i] * q**2
            iq_int[i] = yi.sum()
            time_labeli = 'time_%s s'%( round(  time_edge[i][0] * timeperframe, 3) )
            plot1D( x = q, y = yi, legend= time_labeli, xlabel='Q (A-1)', ylabel='I(q)*Q^2', title='I(q)*Q^2 ~ time',
                   m=markers[i], c = colors[i], ax=ax, ylim=[ -0.001, 0.01] , xlim=[0.007,0.2],
                  legend_size=4)

        #print( iq_int )
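The invariant computed above is the integral of $q^2 I(q)$ over the measured Q range; the fallback loop approximates it with a plain sum of `iqst[i] * q**2`. A self-contained sketch with hypothetical data standing in for `qt`/`iqst` (trapezoidal integration instead of a raw sum):

```python
import numpy as np

# Hypothetical stand-ins for qt and one row of iqst from the cell above:
q = np.linspace(0.007, 0.2, 200)            # Q grid (A^-1)
iq = 1.0 / (1.0 + (q / 0.05) ** 4)          # a smooth, Porod-like I(q)

# Scattering invariant: integral of q^2 * I(q) dq, here via the
# trapezoid rule written out explicitly.
y = iq * q ** 2
invariant = float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(q)))
```

For a two-phase system the invariant is proportional to the product of the phase volume fractions times the squared contrast, so tracking it per time slice (as `plot_time_iq_invariant` does) monitors total scattering power over time.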

In [ ]:


In [ ]:

GiSAXS Scattering Geometry


In [51]:
if scat_geometry =='gi_saxs':    
    plot_qzr_map(  qr_map, qz_map, inc_x0, ticks = ticks, data= avg_img, uid= uidstr, path = data_dir   )

Static Analysis for GiSAXS


In [52]:
if scat_geometry =='gi_saxs':    
    #roi_masks, qval_dicts = get_gisaxs_roi( Qrs, Qzs, qr_map, qz_map, mask= mask )
    show_qzr_roi( avg_img, roi_masks, inc_x0, ticks[:4], alpha=0.5, save=True, path=data_dir, uid=uidstr )

In [53]:
if  scat_geometry =='gi_saxs':    
    Nimg = FD.end - FD.beg 
    time_edge = create_time_slice( N= Nimg, slice_num= 2, slice_width= 2, edges = None )
    time_edge =  np.array( time_edge ) + good_start
    print( time_edge )    
    qrt_pds = get_t_qrc( FD, time_edge, Qrs, Qzs, qr_map, qz_map, mask=mask, path=data_dir, uid = uidstr )    
    plot_qrt_pds( qrt_pds, time_edge, qz_index = 0, uid = uidstr, path =  data_dir )

Make a Profile Plot


In [54]:
if  scat_geometry =='gi_saxs':
    if run_waterfall:
        xcorners= [ 1100, 1250, 1250, 1100 ]
        ycorners= [ 850, 850, 950, 950 ]   
        waterfall_roi_size = [ xcorners[1] - xcorners[0],  ycorners[2] - ycorners[1]  ]
        waterfall_roi =  create_rectangle_mask(  avg_img, xcorners, ycorners   )
        #show_img( waterfall_roi * avg_img,  aspect=1,vmin=.001, vmax=1, logs=True, )
        wat = cal_waterfallc( FD, waterfall_roi, qindex= 1, bin_waterfall=True,
                              waterfall_roi_size = waterfall_roi_size,save =True, path=data_dir, uid=uidstr)

In [55]:
if  scat_geometry =='gi_saxs':
    if run_waterfall:
        plot_waterfallc( wat, qindex=1, aspect=None, vmin=1, vmax= np.max( wat), uid=uidstr, save =True, 
                        path=data_dir, beg= FD.beg)

Dynamic Analysis for GiSAXS


In [56]:
if scat_geometry =='gi_saxs':       
    show_qzr_roi( avg_img, roi_mask, inc_x0, ticks[:4], alpha=0.5, save=True, path=data_dir, uid=uidstr )        
    ## Get 1D Curve (Q||-intensity)
    qr_1d_pds = cal_1d_qr( avg_img, Qr, Qz, qr_map, qz_map, inc_x0= None, mask=mask, setup_pargs=setup_pargs )
    plot_qr_1d_with_ROI( qr_1d_pds, qr_center=np.unique( np.array(list( qval_dict.values() ) )[:,0] ),
                    loglog=False, save=True, uid=uidstr, path = data_dir)

GiWAXS Scattering Geometry


In [57]:
if scat_geometry =='gi_waxs':
    badpixel = np.where( avg_img[:600,:] >=300 )
    roi_mask[badpixel] = 0
    show_ROI_on_image( avg_img, roi_mask, label_on = True,  alpha=.5,
                 save=True, path=data_dir, uid=uidstr, vmin=0.1, vmax=5)
  • Extract the labeled array

In [58]:
qind, pixelist = roi.extract_label_indices(roi_mask)
noqs = len(np.unique(qind))
  • Number of pixels in each q box

In [59]:
nopr = np.bincount(qind, minlength=(noqs+1))[1:]
nopr


Out[59]:
array([  533,   533,  1241,  1041,   317,  1241,  1945,  1945,   945,
         645,  2649,  2649,  1733,  2649,  3359,  3359,  2443,  3359,
        4065,  4002,  3149,  4002,  4706,  4249,  3855,  4433,  5477,
        1911,  4361,  4651,  6155,  5854,  5241,  5853,  6720,  6565,
        5802,  6565,  7047,  7269,  6147,  7269,  7002,  7964,  6496,
        7964,  7822,  7812,  6912,  8597,  8529,  9330,  7595,  1720,
        9246, 10045,  8328,  7110,  9954, 10750,  9041,  7461, 10000,
       11452,  9754,  7809, 11373, 11628, 10455,  8168, 12069,  2059,
        7786, 12793, 13259, 13476, 14262, 10752, 14982, 15696, 16222])
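The `np.bincount(qind, minlength=noqs+1)[1:]` idiom above counts pixels per ROI: `qind` carries a 1-based label for every ROI pixel, `bincount` tallies each label, and index 0 (the unused label 0) is dropped. A tiny self-contained example with three hypothetical ROIs:

```python
import numpy as np

# qind assigns each ROI pixel a 1-based label, as returned by
# roi.extract_label_indices; a small made-up example with 3 ROIs:
qind = np.array([1, 1, 2, 3, 3, 3, 2, 1])
noqs = len(np.unique(qind))

# bincount counts occurrences of each label; [1:] drops the count for
# label 0, leaving pixels-per-ROI in label order.
nopr = np.bincount(qind, minlength=noqs + 1)[1:]
# nopr -> [3, 2, 3]: three pixels in ROI 1, two in ROI 2, three in ROI 3
```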

Check one ROI intensity


In [60]:
roi_inten = check_ROI_intensity( avg_img, roi_mask, ring_number= 5, uid =uidstr ) #roi starting from 1


Do a waterfall analysis


In [61]:
#run_waterfall = False

In [62]:
qth_interest = 5  # the fifth ring; qth_interest starts from 1
if scat_geometry =='saxs' or scat_geometry =='gi_waxs':
    if run_waterfall:    
        wat = cal_waterfallc( FD, roi_mask, qindex= qth_interest, save =True, path=data_dir, uid=uidstr)
        plot_waterfallc( wat, qth_interest, aspect= None, vmin=1e-1, vmax= wat.max(), uid=uidstr, save =True, 
                        path=data_dir, beg= FD.beg, cmap = cmap_vge )

In [63]:
ring_avg = None    
if run_t_ROI_Inten:
    times_roi, mean_int_sets = cal_each_ring_mean_intensityc(FD, roi_mask, timeperframe = None, multi_cor=True  ) 
    plot_each_ring_mean_intensityc( times_roi, mean_int_sets,  uid = uidstr, save=True, path=data_dir )
    roi_avg = np.average( mean_int_sets, axis=0)


  1%|          | 1/81 [00:00<00:08,  9.75it/s]
Starting assign the tasks...
100%|██████████| 81/81 [00:17<00:00,  4.83it/s]
  0%|          | 0/81 [00:00<?, ?it/s]
Starting running the tasks...
100%|██████████| 81/81 [00:01<00:00, 50.37it/s] 
ROI mean_intensit calculation is DONE!
The file: uid=62e14e_t_ROIs is saved in /XF11ID/analysis/2017_3/lwiegart/Results/62e14e/

One time Correlation

Note: Enter the number of buffers for the multi-tau one-time correlation; the number of buffers has to be even. More details in https://github.com/scikit-beam/scikit-beam/blob/master/skbeam/core/correlation.py
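The multi-tau scheme behind `cal_g2p` correlates all lags 0..num_buf-1 at full time resolution, then adds num_buf/2 coarser lags per level with doubled spacing, giving logarithmic coverage of delay times. A minimal sketch of that lag layout (an illustration of the scheme, not the scikit-beam `multi_tau_lags` implementation):

```python
import numpy as np

def multi_tau_lag_steps(num_levels, num_bufs):
    """Sketch of multi-tau lag generation: level 0 holds lags
    0..num_bufs-1; each further level contributes num_bufs//2 lags
    with spacing doubled relative to the previous level."""
    if num_bufs % 2 != 0:
        raise ValueError("num_bufs must be even")
    lags = list(range(num_bufs))                      # level 0, full resolution
    for lev in range(1, num_levels):
        step = 2 ** lev                               # spacing doubles per level
        start = (num_bufs // 2) * step                # continue past the last lag
        lags += [start + i * step for i in range(num_bufs // 2)]
    return np.array(lags)

# num_buf = 8 as in the cal_g2p call below; 3 levels reach lag 28
lags = multi_tau_lag_steps(num_levels=3, num_bufs=8)
```

This is why the buffer count must be even: each coarser level averages pairs of frames from the level below and keeps half as many channels.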

If needed, define another good series


In [64]:
define_good_series = False
#define_good_series = True

if define_good_series:
    good_start = 1
    FD = Multifile(filename, beg = good_start, end = 400) #end=1000)
    uid_ = uidstr + '_fra_%s_%s'%(FD.beg, FD.end)
    print( uid_ )

In [65]:
if use_sqnorm:
    norm = get_pixelist_interp_iq( qp_saxs, iq_saxs, roi_mask, center)
else:
    norm = None
if use_imgsum_norm:
    imgsum_ = imgsum
else:
    imgsum_ = None
import time

In [66]:
#show_img( FD.rdframe(10), label_array=roi_mask, aspect=1, center=center )

In [67]:
if run_one_time: 
    t0 = time.time()    
    g2, lag_steps  = cal_g2p( FD,  roi_mask, bad_frame_list,good_start, num_buf = 8, num_lev= None,
                            imgsum= imgsum_, norm=norm )
    run_time(t0)


In this g2 calculation, the buf and lev number are: 8--7--
5 Bad frames involved and will be discarded!
395 frames will be processed...
  0%|          | 0/81 [00:00<?, ?it/s]
Starting assign the tasks...
100%|██████████| 81/81 [00:18<00:00,  4.49it/s]
 80%|████████  | 65/81 [00:00<00:00, 390.42it/s]
Starting running the tasks...
100%|██████████| 81/81 [00:03<00:00, 23.61it/s] 
G2 calculation DONE!
Total time: 45.479 sec


In [68]:
lag_steps = lag_steps[:g2.shape[0]]

In [69]:
if run_one_time:
    
    taus = lag_steps * timeperframe    
    try:
        g2_pds = save_g2_general( g2, taus=taus,qr= np.array( list( qval_dict.values() ) )[:,0],
                                            qz = np.array( list( qval_dict.values() ) )[:,1],
                             uid=uid_+'_g2.csv', path= data_dir, return_res=True )
    except:
        g2_pds = save_g2_general( g2, taus=taus,qr= np.array( list( qval_dict.values() ) )[:,0],                                             
                             uid=uid_+'_g2.csv', path= data_dir, return_res=True )


The correlation function is saved in /XF11ID/analysis/2017_3/lwiegart/Results/62e14e/ with filename as uid=62e14e_fra_5_400_g2.csv

Fit g2


In [70]:
if run_one_time:
    g2_fit_result, taus_fit, g2_fit = get_g2_fit_general( g2,  taus, 
                function = fit_g2_func,  vlim=[0.95, 1.05], fit_range= None,  
            fit_variables={'baseline':True, 'beta': True, 'alpha':False,'relaxation_rate':True},                                  
            guess_values={'baseline':1.0,'beta': 0.1,'alpha':1.0,'relaxation_rate':0.0100,},
            guess_limits = dict( baseline =[1, 1.8], alpha=[0, 2],
                        beta = [0, 1], relaxation_rate= [0.00001, 5000]) ) 
    g2_fit_paras = save_g2_fit_para_tocsv(g2_fit_result,  filename= uid_  +'_g2_fit_paras.csv', path=data_dir )


The g2 fitting parameters are saved in /XF11ID/analysis/2017_3/lwiegart/Results/62e14e/uid=62e14e_fra_5_400_g2_fit_paras.csv
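The fit model above is a (stretched) exponential, $g_2(\tau) = \mathrm{baseline} + \beta\, e^{-2(\Gamma\tau)^{\alpha}}$; with `'alpha':False` and a guess of 1.0 it reduces to a simple exponential. A numpy sketch evaluating that model with the guess values from the cell (the parameter names mirror the `fit_variables` keys; this is a hand-written model function, not the chxanalys fit machinery):

```python
import numpy as np

def g2_stretched(tau, baseline=1.0, beta=0.1, relaxation_rate=0.01, alpha=1.0):
    """g2(tau) = baseline + beta * exp(-2 * (Gamma * tau)**alpha),
    with Gamma = relaxation_rate; alpha = 1 gives a plain exponential."""
    return baseline + beta * np.exp(-2.0 * (relaxation_rate * tau) ** alpha)

tau = np.logspace(-3, 3, 50)                   # delay times (s)
g2_model = g2_stretched(tau)                   # decays from ~1.1 to ~1.0
```

The fitted `beta` is the speckle contrast, `baseline` should sit near 1, and `relaxation_rate` ($\Gamma$) sets the decay time $1/\Gamma$.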

In [71]:
print(scat_geometry_)


ang_saxs

In [72]:
if run_one_time:
    plot_g2_general( g2_dict={1:g2, 2:g2_fit}, taus_dict={1:taus, 2:taus_fit}, vlim=[0.95, 1.05],
                qval_dict = qval_dict, fit_res= g2_fit_result,  geometry= scat_geometry_,filename= uid_+'_g2', 
        path= data_dir, function= fit_g2_func,  ylabel='g2', append_name=  '_fit')


The combined image is saved as: /XF11ID/analysis/2017_3/lwiegart/Results/62e14e/uid=62e14e_fra_5_400_g2_fit__joint.png

In [73]:
if run_one_time:
    try:
        if False:
            fs, fe = 0, 9
            fs,fe=0, 2
            qval_dict_ = {k:qval_dict[k] for k in list(qval_dict.keys())[fs:fe]  }
            D0, qrate_fit_res = get_q_rate_fit_general(  qval_dict_, g2_fit_paras['relaxation_rate'][fs:fe], 
                                                       geometry=  scat_geometry_ )
            plot_q_rate_fit_general( qval_dict_, g2_fit_paras['relaxation_rate'][fs:fe],  qrate_fit_res, 
                                    geometry= scat_geometry_,uid=uid_  , path= data_dir )
        else:
            D0, qrate_fit_res = get_q_rate_fit_general(  qval_dict, g2_fit_paras['relaxation_rate'],
                                        fit_range=[0, 26],   geometry= scat_geometry_ )    
            plot_q_rate_fit_general( qval_dict, g2_fit_paras['relaxation_rate'],  qrate_fit_res,   
                                geometry=  scat_geometry_,uid=uid_  ,
                                    show_fit=False, path= data_dir, plot_all_range=False)
    except:
        print('sorry, no dispersion relation fit available...')


The fitted diffusion coefficient D0 is:  8.962e+01   A^2S-1
The fitted diffusion coefficient D0 is:  4.822e+01   A^2S-1
The fitted diffusion coefficient D0 is:  1.129e+02   A^2S-1
The fitted diffusion coefficient D0 is:  4.654e+01   A^2S-1
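The dispersion fit above extracts $D_0$ from $\Gamma = D_0 q^2$, the signature of simple diffusive dynamics. A sketch of that fit through the origin on hypothetical rates (least squares of $\Gamma$ against $q^2$; the q range and $D_0$ value are made up, chosen to match the order of magnitude printed above):

```python
import numpy as np

# Hypothetical relaxation rates obeying Gamma = D0 * q^2 exactly:
q = np.linspace(0.01, 0.05, 10)     # A^-1
D0_true = 90.0                      # A^2 s^-1
gamma = D0_true * q ** 2            # s^-1

# Least-squares slope through the origin: D0 = sum(q^2*Gamma) / sum(q^4)
q2 = q ** 2
D0_fit = float(np.sum(q2 * gamma) / np.sum(q2 ** 2))
```

A log-log plot of $\Gamma$ vs $q$ with slope 2 confirms diffusive behavior; deviations from slope 2 point to sub- or super-diffusive dynamics.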

In [74]:
#plot1D( x= qr, y=g2_fit_paras['beta'], ls='-', m = 'o', c='b', ylabel=r'$\beta$', xlabel=r'$Q( \AA^{-1} ) $' )

For two-time correlation


In [75]:
define_good_series = False
#define_good_series = True
if define_good_series:
    good_start = 5
    FD = Multifile(filename, beg = good_start, end = 1000)
    uid_ = uidstr + '_fra_%s_%s'%(FD.beg, FD.end)
    print( uid_ )

In [76]:
#%run chxanalys_link/chxanalys/chx_generic_functions.py

In [77]:
data_pixel = None
if run_two_time:    
    data_pixel =   Get_Pixel_Arrayc( FD, pixelist,  norm= norm ).get_data()


100%|██████████| 395/395 [00:04<00:00, 92.35it/s] 

In [78]:
import time
t0=time.time()
g12b=None
if run_two_time:     
    g12b = auto_two_Arrayc(  data_pixel,  roi_mask, index = None   )
    if run_dose:
        np.save( data_dir + 'uid=%s_g12b'%uid, g12b)
run_time( t0 )


100%|██████████| 81/81 [00:03<00:00, 24.87it/s]
Total time: 3.693 sec

In [79]:
#%run chxanalys_link/chxanalys/Two_Time_Correlation_Function.py

In [80]:
if run_two_time:
    show_C12(g12b, q_ind=15, qlabel=qval_dict,N1= FD.beg,logs=False, N2=min( FD.end,10000), vmin= .98, vmax=1.25, 
             timeperframe=timeperframe,save=True, path= data_dir, uid = uid_ ,cmap=cmap_albula)



In [81]:
multi_tau_steps = False
if run_two_time:
    if lag_steps is None:
        num_bufs=8
        noframes = FD.end - FD.beg
        num_levels = int(np.log( noframes/(num_bufs-1))/np.log(2) +1) +1
        tot_channels, lag_steps, dict_lag = multi_tau_lags(num_levels, num_bufs)
        max_taus= lag_steps.max()
        
    #max_taus= lag_steps.max()  
    max_taus = Nimg    
    t0=time.time()
    #tausb = np.arange( g2b.shape[0])[:max_taus] *timeperframe
    if multi_tau_steps:
        lag_steps_ = lag_steps[   lag_steps <= g12b.shape[0] ]
        g2b = get_one_time_from_two_time(g12b)[lag_steps_]
        tausb = lag_steps_ *timeperframe
    else:
        tausb = (np.arange( g12b.shape[0]) *timeperframe)[:-1] #-200 
        g2b = (get_one_time_from_two_time(g12b))[:-1]   #-200
    run_time(t0)
         
    g2b_pds = save_g2_general( g2b, taus=tausb, qr= np.array( list( qval_dict.values() ) )[:,0],
                              qz=None, uid=uid_ +'_g2b.csv', path= data_dir, return_res=True )


Total time: 0.048 sec
The correlation function is saved in /XF11ID/analysis/2017_3/lwiegart/Results/62e14e/ with filename as uid=62e14e_fra_5_400_g2b.csv
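`get_one_time_from_two_time` recovers $g_2(\tau)$ by averaging the two-time matrix along its off-diagonals: each diagonal at offset $\tau$ collects all pairs separated by that delay. A self-contained sketch of that reduction on a tiny synthetic two-time matrix (an illustration of the principle, not the chxanalys implementation):

```python
import numpy as np

def one_time_from_two_time(C2):
    """Average the two-time matrix C2 (frames x frames) along its
    off-diagonals: g2(tau) = mean over t of C2[t, t + tau]."""
    n = C2.shape[0]
    return np.array([np.diagonal(C2, offset=k).mean() for k in range(n)])

# Tiny hypothetical two-time matrix depending only on |t1 - t2|:
n = 6
t = np.arange(n)
C2 = 1.0 + 0.2 * np.exp(-np.abs(t[:, None] - t[None, :]) / 2.0)
g2b_sketch = one_time_from_two_time(C2)   # decays from 1.2 toward 1.0
```

For stationary dynamics this reproduces the multi-tau $g_2$; the `multi_tau_steps` switch above merely decides whether to keep all frame lags or only the logarithmic subset.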

In [82]:
if run_two_time:    
    g2b_fit_result, tausb_fit, g2b_fit = get_g2_fit_general( g2b,  tausb, 
                function = fit_g2_func,  vlim=[0.95, 1.05], fit_range= None,  
            fit_variables={'baseline':False, 'beta': True, 'alpha':False,'relaxation_rate':True},                                  
            guess_values={'baseline':1.0,'beta': 0.15,'alpha':1.0,'relaxation_rate':1,},
            guess_limits = dict( baseline =[1, 1.8], alpha=[0, 2],
                        beta = [0, 1], relaxation_rate= [0.000001, 5000]) ) 
    g2b_fit_paras = save_g2_fit_para_tocsv(g2b_fit_result,  filename= uid_  +'_g2b_fit_paras.csv', path=data_dir )


The g2 fitting parameters are saved in /XF11ID/analysis/2017_3/lwiegart/Results/62e14e/uid=62e14e_fra_5_400_g2b_fit_paras.csv

In [83]:
#plot1D( x = tausb[1:], y =g2b[1:,0], ylim=[0.95, 1.46], xlim = [0.0001, 10], m='', c='r', ls = '-',
#       logx=True, title='one_time_corelation', xlabel = r"$\tau $ $(s)$",    )

In [ ]:


In [84]:
if run_two_time:
    plot_g2_general( g2_dict={1:g2b, 2:g2b_fit}, taus_dict={1:tausb, 2:tausb_fit}, vlim=[0.95, 1.05],
                qval_dict=qval_dict, fit_res= g2b_fit_result,  geometry=scat_geometry_,filename=uid_+'_g2', 
                    path= data_dir, function= fit_g2_func,  ylabel='g2', append_name=  '_b_fit')


The combined image is saved as: /XF11ID/analysis/2017_3/lwiegart/Results/62e14e/uid=62e14e_fra_5_400_g2_b_fit__joint.png

In [85]:
if run_two_time:    
    try:
        if False:
            fs, fe = 0,9
            fs, fe = 0,12
            qval_dict_ = {k:qval_dict[k] for k in list(qval_dict.keys())[fs:fe]  }
            D0b, qrate_fit_resb = get_q_rate_fit_general(  qval_dict_, g2b_fit_paras['relaxation_rate'][fs:fe], geometry= scat_geometry_ )
            plot_q_rate_fit_general( qval_dict_, g2b_fit_paras['relaxation_rate'][fs:fe],  qrate_fit_resb, 
                                geometry= scat_geometry_,uid=uid_ +'_two_time' , path= data_dir )
        else:
            D0b, qrate_fit_resb = get_q_rate_fit_general(  qval_dict, g2b_fit_paras['relaxation_rate'],
                                            fit_range=[1, 10],  geometry= scat_geometry_ )
            plot_q_rate_fit_general( qval_dict, g2b_fit_paras['relaxation_rate'],  qrate_fit_resb,   
                                geometry= scat_geometry_,uid=uid_ +'_two_time', show_fit=False,path= data_dir, plot_all_range= True )
    except:
        print('sorry, no fit of dispersion available...')


The fitted diffusion coefficient D0 is:  1.985e+02   A^2S-1
The fitted diffusion coefficient D0 is:  9.156e+01   A^2S-1
The fitted diffusion coefficient D0 is:  1.084e+02   A^2S-1
The fitted diffusion coefficient D0 is:  1.116e+02   A^2S-1

In [86]:
if run_two_time and run_one_time:
    plot_g2_general( g2_dict={1:g2, 2:g2b}, taus_dict={1:taus, 2:tausb},vlim=[0.99, 1.007],
                qval_dict=qval_dict, g2_labels=['from_one_time', 'from_two_time'],
            geometry=scat_geometry_,filename=uid_+'_g2_two_g2', path= data_dir, ylabel='g2', )


The combined image is saved as: /XF11ID/analysis/2017_3/lwiegart/Results/62e14e/uid=62e14e_fra_5_400_g2_two_g2__joint.png

Run Dose dependent analysis


In [87]:
if run_dose:
    get_two_time_mulit_uids( [uid], roi_mask,  norm= norm,  bin_frame_number=1, 
                        path= data_dir0, force_generate=False )


UID: 62e14e is in processing...
The two time correlation function for uid=62e14e is already calculated. Just pass...

In [88]:
try:
    print( md['transmission'] )
except:
    md['transmission'] =1


0.0068388531571917325

In [ ]:


In [89]:
if run_dose:
    N = len(imgs)
    print(N)
    #exposure_dose = md['transmission'] * exposuretime* np.int_([  N/32, N/16, N/8, N/4 ,N/2, 3*N/4, N*0.99 ])
    exposure_dose = md['transmission'] * exposuretime* np.int_([   N/8, N/4 ,N/2, 3*N/4, N*0.99 ])
    
    print( exposure_dose )


400
[ 0.06838511  0.13677023  0.27354045  0.41031068  0.5416101 ]
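The dose values above are accumulated illumination: transmission times per-frame exposure time times frame count, evaluated at a few frame-number checkpoints. A sketch of that arithmetic with hypothetical values standing in for `md['transmission']`, `exposuretime`, and `N` (the exposure time here is assumed, not taken from the run):

```python
import numpy as np

transmission = 0.0068     # stand-in for md['transmission']
exposuretime = 0.05       # s per frame (assumed value)
N = 400                   # number of frames, as printed above

# Checkpoints at fractions of the series, as in the cell above:
frame_points = np.int_([N / 8, N / 4, N / 2, 3 * N / 4, N * 0.99])

# Dose accumulated by each checkpoint:
exposure_dose = transmission * exposuretime * frame_points
```

The dose analysis then recomputes $g_2$ using only the frames up to each checkpoint, so that any dose-driven change in dynamics shows up as a dose-dependent decay.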

In [90]:
if run_dose:
    taus_uids, g2_uids = get_series_one_time_mulit_uids( [ uid ],  qval_dict, good_start=good_start,  
                    path= data_dir0, exposure_dose = exposure_dose,  num_bufs =8, save_g2= False,
                                                   dead_time = 0, trans = [ md['transmission'] ] )


UID: 62e14e is in processing...
uid: 62e14e--> fra_num_by_dose: [ 50 100 200 300 396]
/opt/conda_envs/analysis/lib/python3.5/site-packages/chxanalys-unknown-py3.5.egg/chxanalys/chx_xpcs_xsvs_jupyter_V1.py:560: UserWarning: Warning: the dose value is too large, and please check the maxium dose in this data set and give a smaller dose value. We will use the maxium dose of the data.

In [ ]:


In [91]:
if run_dose:    
    plot_dose_g2( taus_uids, g2_uids, ylim=[0.98, 1.2], vshift= 0.00,
                 qval_dict = qval_dict, fit_res= None,  geometry= scat_geometry_,
                 filename= '%s_dose_analysis'%uid_, 
                path= data_dir, function= None,  ylabel='g2_Dose', g2_labels= None, append_name=  '' )


The combined image is saved as: /XF11ID/analysis/2017_3/lwiegart/Results/62e14e/uid=62e14e_fra_5_400_dose_analysis__joint.png

In [ ]:


In [92]:
if run_dose:
    qth_interest = 2
    plot_dose_g2( taus_uids, g2_uids, qth_interest= qth_interest, ylim=[0.98, 1.25], vshift= 0.00,
                 qval_dict = qval_dict, fit_res= None,  geometry= scat_geometry_,
                 filename= '%s_dose_analysis'%uidstr, 
                path= data_dir, function= None,  ylabel='g2_Dose', g2_labels= None, append_name=  '' )


Four Time Correlation


In [93]:
import time
if run_four_time:
    t0=time.time()
    g4 = get_four_time_from_two_time(g12b, g2=g2b)[:max_taus]
    run_time(t0)


Total time: 0.126 sec

In [94]:
if run_four_time:
    taus4 = np.arange( g4.shape[0])*timeperframe        
    g4_pds = save_g2_general( g4, taus=taus4, qr=np.array( list( qval_dict.values() ) )[:,0],
                             qz=None, uid=uid_ +'_g4.csv', path= data_dir, return_res=True )


The correlation function is saved in /XF11ID/analysis/2017_3/lwiegart/Results/62e14e/ with filename as uid=62e14e_fra_5_400_g4.csv

In [95]:
if run_four_time:
    plot_g2_general( g2_dict={1:g4}, taus_dict={1:taus4},vlim=[0.95, 1.05], qval_dict=qval_dict, fit_res= None, 
                geometry=scat_geometry_,filename=uid_+'_g4',path= data_dir,   ylabel='g4')


The combined image is saved as: /XF11ID/analysis/2017_3/lwiegart/Results/62e14e/uid=62e14e_fra_5_400_g4__joint.png

Speckle Visibility


In [96]:
#run_xsvs =True

In [97]:
if run_xsvs:    
    max_cts = get_max_countc(FD, roi_mask )    
    max_cts = 15 #for eiger 500 K
    qind, pixelist = roi.extract_label_indices(   roi_mask  )
    noqs = len( np.unique(qind) )
    nopr = np.bincount(qind, minlength=(noqs+1))[1:]
    #time_steps = np.array( utils.geometric_series(2,   len(imgs)   ) )
    time_steps = [0,1]  #only run the first two levels
    num_times = len(time_steps)    
    times_xsvs = exposuretime + (2**(  np.arange( len(time_steps) ) ) -1 ) * timeperframe   
    print( 'The max counts are: %s'%max_cts )

Do histogram


In [98]:
if run_xsvs:
    if roi_avg is  None:
        times_roi, mean_int_sets = cal_each_ring_mean_intensityc(FD, roi_mask, timeperframe = None,  ) 
        roi_avg = np.average( mean_int_sets, axis=0)
    
    t0=time.time()
    spec_bins, spec_his, spec_std  =  xsvsp( FD, np.int_(roi_mask), norm=None,
                max_cts=int(max_cts+2),  bad_images=bad_frame_list, only_two_levels=True )    
    spec_kmean =  np.array(  [roi_avg * 2**j for j in  range( spec_his.shape[0] )] )
    run_time(t0)
    spec_pds =  save_bin_his_std( spec_bins, spec_his, spec_std, filename=uid_+'_spec_res.csv', path=data_dir )

Do histogram fit by negative binomial function with maximum-likelihood method


In [99]:
if run_xsvs:    
    ML_val, KL_val,K_ = get_xsvs_fit(  spec_his, spec_kmean,  spec_std, max_bins=2, varyK= False ) #True  )
    #print( 'The observed average photon counts are: %s'%np.round(K_mean,4))
    #print( 'The fitted average photon counts are: %s'%np.round(K_,4)) 
    print( 'The difference sum of average photon counts between fit and data are: %s'%np.round( 
            abs(np.sum( spec_kmean[0,:] - K_ )),4))
    print( '#'*30)
    qth=   0 
    print( 'The fitted M for Qth= %s are: %s'%(qth, ML_val[qth]) )
    print( K_[qth])
    print( '#'*30)

Plot fit results


In [ ]:


In [100]:
if run_xsvs:   
    qr = [qval_dict[k][0] for k in list(qval_dict.keys()) ]
    plot_xsvs_fit(  spec_his, ML_val, KL_val, K_mean = spec_kmean, spec_std=spec_std,
                  xlim = [0,10], vlim =[.9, 1.1],
        uid=uid_, qth= qth_interest, logy= True, times= times_xsvs, q_ring_center=qr, path=data_dir)
    
    plot_xsvs_fit(  spec_his, ML_val, KL_val, K_mean = spec_kmean, spec_std = spec_std,
                  xlim = [0,15], vlim =[.9, 1.1],
        uid=uid_, qth= None, logy= True, times= times_xsvs, q_ring_center=qr, path=data_dir )

Get contrast


In [101]:
if run_xsvs:
    contrast_factorL = get_contrast( ML_val)
    spec_km_pds = save_KM(  spec_kmean, KL_val, ML_val, qs=qr, level_time=times_xsvs, uid=uid_, path = data_dir )
    #spec_km_pds
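For a negative-binomial photon distribution with $M$ coherent modes, the speckle contrast is $1/M$, so smaller fitted $M$ means higher visibility. A sketch of that conversion on hypothetical fitted values (assuming `get_contrast` follows this convention; the `ML_val`-like mapping of q-index to per-level $M$ values is made up here):

```python
# Hypothetical fitted M values per q-index, per integration level:
ML_val_sketch = {0: [4.0, 8.0], 1: [5.0, 10.0]}

# Speckle contrast beta = 1 / M for each fitted M:
contrast_sketch = {q: [1.0 / M for M in Ms] for q, Ms in ML_val_sketch.items()}
# e.g. M = 4 -> contrast 0.25; M doubling per level halves the contrast
```

Comparing this contrast against the $g_2$ intercept $\beta$ (next cell) is the consistency check between visibility and correlation analyses.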

Plot contrast with g2 results


In [102]:
if run_xsvs:    
    plot_g2_contrast( contrast_factorL, g2b, times_xsvs, tausb, qr, 
                     vlim=[0.8,1.2], qth = qth_interest, uid=uid_,path = data_dir, legend_size=14)

    plot_g2_contrast( contrast_factorL, g2b, times_xsvs, tausb, qr, 
                     vlim=[0.8,1.2], qth = None, uid=uid_,path = data_dir, legend_size=4)

In [103]:
#from chxanalys.chx_libs import cmap_vge, cmap_albula, Javascript

Export Results to a HDF5 File


In [ ]:


In [104]:
md['mask_file']= mask_path + mask_name
md['roi_mask_file']= fp
md['mask'] = mask
md['NOTEBOOK_FULL_PATH'] =  data_dir + get_current_pipeline_fullpath(NFP).split('/')[-1]
md['good_start'] = good_start
md['bad_frame_list'] = bad_frame_list
md['avg_img'] = avg_img
md['roi_mask'] = roi_mask
md['setup_pargs'] = setup_pargs
if scat_geometry == 'gi_saxs':        
    md['Qr'] = Qr
    md['Qz'] = Qz
    md['qval_dict'] = qval_dict
    md['beam_center_x'] =  inc_x0
    md['beam_center_y']=   inc_y0
    md['beam_refl_center_x'] = refl_x0
    md['beam_refl_center_y'] = refl_y0


elif scat_geometry == 'gi_waxs':
    md['beam_center_x'] =  center[1]
    md['beam_center_y']=  center[0]
else:
    md['qr']= qr
    #md['qr_edge'] = qr_edge
    md['qval_dict'] = qval_dict
    md['beam_center_x'] =  center[1]
    md['beam_center_y']=  center[0]            

md['beg'] = FD.beg
md['end'] = FD.end
md['qth_interest'] = qth_interest
md['metadata_file'] = data_dir + 'uid=%s_md.pkl'%uid
psave_obj(  md, data_dir + 'uid=%s_md.pkl'%uid ) #save the setup parameters
save_dict_csv( md,  data_dir + 'uid=%s_md.csv'%uid, 'w')

Exdt = {} 
if scat_geometry == 'gi_saxs':  
    for k,v in zip( ['md', 'roi_mask','qval_dict','avg_img','mask','pixel_mask', 'imgsum', 'bad_frame_list', 'qr_1d_pds'], 
                [md,    roi_mask, qval_dict, avg_img,mask,pixel_mask, imgsum, bad_frame_list, qr_1d_pds] ):
        Exdt[ k ] = v
elif scat_geometry == 'saxs': 
    for k,v in zip( ['md', 'q_saxs', 'iq_saxs','iqst','qt','roi_mask','qval_dict','avg_img','mask','pixel_mask', 'imgsum', 'bad_frame_list'], 
                [md, q_saxs, iq_saxs, iqst, qt,roi_mask, qval_dict, avg_img,mask,pixel_mask, imgsum, bad_frame_list] ):
        Exdt[ k ] = v
elif scat_geometry == 'gi_waxs': 
    for k,v in zip( ['md', 'roi_mask','qval_dict','avg_img','mask','pixel_mask', 'imgsum', 'bad_frame_list'], 
                [md,       roi_mask, qval_dict, avg_img,mask,pixel_mask, imgsum, bad_frame_list] ):
        Exdt[ k ] = v
        
if run_waterfall:Exdt['wat'] =  wat
if run_t_ROI_Inten:Exdt['times_roi'] = times_roi;Exdt['mean_int_sets']=mean_int_sets
if run_one_time:
    if run_invariant_analysis:
        for k,v in zip( ['taus','g2','g2_fit_paras', 'time_stamp','invariant'], [taus,g2,g2_fit_paras,time_stamp,invariant] ):Exdt[ k ] = v
    else:
        for k,v in zip( ['taus','g2','g2_fit_paras'  ], [taus,g2,g2_fit_paras ] ):Exdt[ k ] = v
            
if run_two_time:
    for k,v in zip( ['tausb','g2b','g2b_fit_paras', 'g12b'], [tausb,g2b,g2b_fit_paras,g12b] ):Exdt[ k ] = v
    #for k,v in zip( ['tausb','g2b','g2b_fit_paras', ], [tausb,g2b,g2b_fit_paras] ):Exdt[ k ] = v    
if run_dose:
    for k,v in zip( [ 'taus_uids', 'g2_uids' ], [taus_uids, g2_uids] ):Exdt[ k ] = v
if run_four_time:
    for k,v in zip( ['taus4','g4'], [taus4,g4] ):Exdt[ k ] = v
if run_xsvs:
    for k,v in zip( ['spec_kmean','spec_pds','times_xsvs','spec_km_pds','contrast_factorL'], 
                   [ spec_kmean,spec_pds,times_xsvs,spec_km_pds,contrast_factorL] ):Exdt[ k ] = v

In [105]:
#%run chxanalys_link/chxanalys/Create_Report.py

In [106]:
if qphi_analysis:
    export_xpcs_results_to_h5( 'uid=%s_phi_Res.h5'%md['uid'], data_dir, export_dict = Exdt )
else:
    export_xpcs_results_to_h5( 'uid=%s_Res.h5'%md['uid'], data_dir, export_dict = Exdt )    
#export_xpcs_results_to_h5( 'uid=%s_Res.h5'%md['uid'], data_dir, export_dict = Exdt )
#extract_dict = extract_xpcs_results_from_h5( filename = 'uid=%s_Res.h5'%md['uid'], import_dir = data_dir )


The xpcs analysis results are exported to /XF11ID/analysis/2017_3/lwiegart/Results/62e14e/ with filename as uid=62e14e37-128c-4474-9f53-92c25123daac_phi_Res.h5

In [107]:
#extract_dict = extract_xpcs_results_from_h5( filename = 'uid=%s_Res.h5'%md['uid'], import_dir = data_dir )

Create PDF Report


In [108]:
uid


Out[108]:
'62e14e'

In [109]:
pdf_out_dir = os.path.join('/XF11ID/analysis/', CYCLE, username, 'Results/')

pdf_filename = "XPCS_Analysis_Report2_for_uid=%s%s.pdf"%(uid,pdf_version)
if run_xsvs:
    pdf_filename = "XPCS_XSVS_Analysis_Report_for_uid=%s%s.pdf"%(uid,pdf_version)

In [110]:
%run /home/yuzhang/chxanalys_link/chxanalys/Create_Report.py

In [111]:
#md['detector_distance'] = 4.8884902

In [112]:
make_pdf_report( data_dir, uid, pdf_out_dir, pdf_filename, username, 
                    run_fit_form,run_one_time, run_two_time, run_four_time, run_xsvs, run_dose,
                report_type= scat_geometry, report_invariant= run_invariant_analysis,
               md = md )


****************************************
The pdf report is created with filename as: /XF11ID/analysis/2017_3/lwiegart/Results/XPCS_Analysis_Report2_for_uid=62e14e_11-10-2017.pdf
****************************************

Attach the PDF report to Olog


In [113]:
#%run  /home/yuzhang/chxanalys_link/chxanalys/chx_olog.py

In [114]:
if att_pdf_report:     
    os.environ['HTTPS_PROXY'] = 'https://proxy:8888'
    os.environ['no_proxy'] = 'cs.nsls2.local,localhost,127.0.0.1'
    update_olog_uid_with_file( uid[:6], text='Add XPCS Analysis PDF Report', 
                              filename=pdf_out_dir + pdf_filename, append_name='_r1' )


The url=https://logbook.nsls2.bnl.gov/Olog-11-ID/Olog was successfully updated with Add XPCS Analysis PDF Report and with the attachments

The End!


In [115]:
uid


Out[115]:
'62e14e'

Save the current pipeline in Results folder


In [116]:
save_current_pipeline( NFP, data_dir)


This pipeline: XPCS_Single_2017_V7.ipynb is saved in /XF11ID/analysis/2017_3/lwiegart/Results/62e14e/.

In [117]:
get_current_pipeline_fullpath(NFP)


Out[117]:
'/XF11ID/analysis/2017_3/lwiegart/XPCS_Single_2017_V7.ipynb'
