This notebook demonstrates maximum-likelihood parameter estimation for multiple coupled neurons, where each neuron's spike rate depends on the other neurons' spike trains through a set of coupling weights.


In [1]:
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd
import random
import csv
%matplotlib inline

In [2]:
import os
import sys
sys.path.append(os.path.join(os.getcwd(),'..'))
sys.path.append(os.path.join(os.getcwd(),'..','code'))
sys.path.append(os.path.join(os.getcwd(),'..','data'))

In [3]:
import filters
import likelihood_functions as lk
import PoissonProcessClasses as PP
import auxiliary_functions as auxfun

In [4]:
import importlib
importlib.reload(filters)
importlib.reload(lk)
importlib.reload(auxfun)
importlib.reload(PP)


Out[4]:
<module 'PoissonProcessClasses' from '/Users/val/MEGAsync/GLM_PythonModules/notebooks/../code/PoissonProcessClasses.py'>

In [5]:
# Number of neurons
nofCells = 2

Reading input-output data:


In [6]:
# creating the path to the data
data_path = os.path.join(os.getcwd(),'..','data')

# reading stimulus
Stim = np.array(pd.read_csv(os.path.join(data_path,'Stim2.csv'),header = None))

# reading spike times
# (the tsp sequences have unequal lengths, so they are read row by row)
tsp_list = []
with open(os.path.join(data_path,'tsp2.csv')) as csvfile:
    tspreader = csv.reader(csvfile)
    for row in tspreader:
        tsp_list.append(row)

Converting the spike times into binary spike trains:


In [7]:
dt = 0.01
y_list = []
for tsp in tsp_list:
    tsp = np.array(tsp).astype(float)
    # map each spike time to a (1-based) time-bin index
    tsp_int = np.ceil((tsp - dt*0.001)/dt)
    tsp_int = np.reshape(tsp_int,(tsp_int.shape[0],1))
    tsp_int = tsp_int.astype(int)
    # binary spike train: 1 where the bin contains a spike, 0 elsewhere
    y_list.append(np.array([item in tsp_int for item in np.arange(Stim.shape[0]/dt)+1]).astype(int))
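
The membership test above scales poorly with the number of time bins; an equivalent vectorized construction (a sketch, assuming all spike times fall inside the stimulus window) is:


In [ ]:
# same binary spike train, built by direct indexing
nofBins = int(Stim.shape[0]/dt)
y = np.zeros(nofBins, dtype = int)
idx = np.ceil((tsp - dt*0.001)/dt).astype(int) - 1  # 0-based bin indices
y[idx] = 1  # bins containing one or more spikes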

Creating filters:


In [8]:
# create a stimulus filter
kpeaks = np.array([0,round(20/3)])
pars_k = {'neye':5,'n':5,'kpeaks':kpeaks,'b':3}
K,K_orth,kt_domain = filters.createStimulusBasis(pars_k, nkt = 20)
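
As a quick sanity check, the basis functions (the columns of K) can be plotted over the same reversed time axis used for the fitted filters below:


In [ ]:
plt.plot(-kt_domain[::-1], K)  # one curve per basis function
plt.xlabel('Time')
plt.title('Stimulus basis')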

In [9]:
# create a post-spike filter
hpeaks = np.array([0.1,2])
pars_h = {'n':5,'hpeaks':hpeaks,'b':.4}
H,H_orth,ht_domain = filters.createPostSpikeBasis(pars_h,dt)

In [10]:
# Interpolate Post Spike Filter
MSP = auxfun.makeInterpMatrix(len(ht_domain),1)
MSP[0,0] = 0
H_orth = np.dot(MSP,H_orth)



Conditional intensity (spike rate) of neuron $i$: $$\lambda_{\beta}(i) = \exp\Big(K(\beta_k)*Stim + H(\beta_h)*y_i + \sum_{j\ne i}w_j\,y_j + \mu\Big)$$

In matrix form, with $M_k$ and $M_h$ holding the basis-convolved stimulus and spike-history covariates and $Y$ the other neurons' spike trains: $$\lambda_{\beta}(i) = \exp(M_k\beta_k + M_h \beta_h + Y w + \mu)$$

Creating a matrix of covariates:


In [11]:
M_k = lk.construct_M_k(Stim,K,dt)

In [12]:
M_h_list = []
for tsp in tsp_list:
    tsp = np.array(tsp).astype(float)
    M_h_list.append(lk.construct_M_h(tsp,H_orth,dt,Stim))

In [13]:
# creating a matrix of output covariates
Y = np.array(y_list).T

In [14]:
# tsp_list = []
# for i in range(nofCells):
#    tsp_list.append(auxfun.simSpikes(np.hstack((coeff_k,coeff_h)),M,dt))

In [15]:
M_list = []
for i in range(len(M_h_list)):
    # exclude the i'th neuron's own spike train from the coupling covariates
    M_list.append(np.hstack((M_k,M_h_list[i],np.delete(Y,i,1),np.ones((M_k.shape[0],1)))))
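
Each covariate matrix thus has 17 columns: 10 stimulus-basis covariates, 5 post-spike-basis covariates, nofCells-1 coupling spike trains, and one constant column for the dc term. A quick check:


In [ ]:
# 10 (stimulus) + 5 (post-spike) + nofCells-1 (coupling) + 1 (dc term)
assert M_list[0].shape[1] == 10 + 5 + (nofCells - 1) + 1  # = 17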

Conditional intensity as a function of the full covariate matrix $M$ (which includes the constant column for $\mu$) and the stacked parameter vector $\beta$: $$ \lambda_{\beta} = \exp(M\beta) $$
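
For a process discretized with bin width $\Delta$, the corresponding negative log-likelihood has the standard Poisson-GLM form (up to an additive constant) $$-\log L(\beta) = \Delta\sum_t \exp(M\beta)_t - \sum_t y_t\,(M\beta)_t,$$ which is convex in $\beta$, so gradient-based optimization is well behaved.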

Creating a Poisson process model with this intensity:
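
A minimal sketch of this step (mirroring the constructor call used in the fitting loop below, where PP.PPModel receives the transposed covariate matrix and a finer time step for the intensity):


In [ ]:
# one Poisson-process model per neuron, each with its own covariates
models = [PP.PPModel(M.T, dt = dt/100) for M in M_list]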


Setting initial parameters:


In [16]:
# start all parameters from zero: 10 stimulus coefficients,
# 5 post-spike coefficients, nofCells-1 coupling weights
# and one dc term (17 parameters in total)
coeff_k0 = np.zeros((10,))
coeff_h0 = np.zeros((5,))
coeff_w0 = np.zeros((nofCells-1,))
mu_0 = 0

pars0 = np.hstack((coeff_k0,coeff_h0,coeff_w0,mu_0))

Fitting the model by maximizing the likelihood:


In [17]:
res_list = []
for i in range(len(y_list)):
    model = PP.PPModel(M_list[i].T,dt = dt/100)
    res_list.append(model.fit(y_list[i],start_coef = pars0,maxiter = 500, method = 'L-BFGS-B'))


(Iteration trace truncated: each fit prints the objective value at every iteration, decreasing to roughly -5250.84 for the first neuron and -5181.70 for the second. The run also emits a FutureWarning about the start_coef==None comparison in PoissonProcessClasses.py, a RuntimeWarning that L-BFGS-B does not use Hessian information, and an OptimizeWarning about the unknown solver option maxfev.)
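
Before using the estimates, it is worth checking the optimizer output (each entry of res_list is the scipy.optimize result object returned by model.fit):


In [ ]:
for i, res in enumerate(res_list):
    print('neuron %d: converged=%s, final objective=%.2f' % (i, res.success, res.fun))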

Specifying the true parameters:


In [18]:
k_coeff = np.array([0.061453, 0.284916, 0.860335, 1.256983, 0.910615, 0.488660, -0.887091, 0.097441, 0.026607, -0.090147])
h_coeff = np.array([-15.18,38.24,-67.58,-14.06,-3.36])

In [19]:
for i in range(len(res_list)):
    k_coeff_predicted = res_list[i].x[:10]
    h_coeff_predicted = res_list[i].x[10:15]
    print('Estimated dc for neuron '+str(i)+': '+str(res_list[i].x[-1]))
    fig,axs = plt.subplots(1,2,figsize = (10,5))
    fig.suptitle('Neuron%d'%(i+1))
    axs[0].set_title('Stimulus Filter')
    axs[0].plot(-kt_domain[::-1],np.dot(K,k_coeff_predicted),'r',label = 'predicted')
    axs[0].plot(-kt_domain[::-1],np.dot(K,k_coeff),'b',label = 'true')
    axs[0].plot(-kt_domain[::-1],np.dot(K,pars0[:10]),'g',label = 'initial')
    axs[0].set_xlabel('Time')
    axs[0].legend(loc = 'upper left')
    axs[1].set_title('Post-Spike Filter')
    axs[1].plot(ht_domain,np.dot(H_orth,h_coeff_predicted),'r',label = 'predicted')
    axs[1].plot(ht_domain,np.dot(H_orth,h_coeff),'b',label = 'true')
    axs[1].plot(ht_domain,np.dot(H_orth,coeff_h0[:H_orth.shape[1]]),'g',label = 'initial')
    axs[1].set_xlabel('Time')
    axs[1].legend(loc = 'upper right')


Estimated dc for neuron 0: 3.10101102304
Estimated dc for neuron 1: 3.14900338762

Extracting the weight matrix:


In [20]:
# entry (i,j) of W is the estimated influence of neuron j on neuron i;
# the diagonal is set to 0 since a neuron's own history is captured by
# its post-spike filter
W = np.array([np.hstack((res_list[i].x[-nofCells:-nofCells+i],0,res_list[i].x[-nofCells+i:-1])) for i in range(len(res_list))])
print(W)


[[ 0.         -0.03272112]
 [-0.08846552  0.        ]]

Note: the estimated stimulus and post-spike filters can differ from neuron to neuron.

Note: the coupling weights may be affected by a scale ambiguity; they should be normalized in some way before being interpreted.

