This notebook analyzes functional connectivity on simulated data, examining how shared task-evoked activation can inflate correlation estimates between voxels and how well regressing out the task effect corrects for it.


In [1]:
import os,sys
import numpy
%matplotlib inline
import matplotlib.pyplot as plt
sys.path.insert(0,'../utils')
from mkdesign import create_design_singlecondition
from nipy.modalities.fmri.hemodynamic_models import spm_hrf,compute_regressor
from make_data import make_continuous_data

In [2]:
data=make_continuous_data(N=200)
print('correlation without activation:',numpy.corrcoef(data.T)[0,1])

plt.plot(range(data.shape[0]),data[:,0],color='blue')
plt.plot(range(data.shape[0]),data[:,1],color='red')


correlation without activation: -0.652475027859
Out[2]: [figure: time series of the two simulated voxels]
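
make_continuous_data lives in ../utils and is not shown here. A minimal sketch of one way such data could be generated (two Gaussian noise time series drawn from a bivariate normal; the function name, correlation value, and seed below are illustrative assumptions, not the actual implementation):

def make_continuous_data_sketch(N=200, corr=-0.6, seed=0):
    # illustrative stand-in for make_continuous_data: two correlated
    # Gaussian noise time series with the requested correlation
    rng = numpy.random.RandomState(seed)
    cov = numpy.array([[1.0, corr], [corr, 1.0]])
    return rng.multivariate_normal(numpy.zeros(2), cov, size=N)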

Now let's add an activation signal to both voxels. Because the same task-evoked signal is added to each time series, we expect their correlation to increase.


In [3]:
# create a blocked design and convolve it with the canonical (SPM) HRF
design_ts, design = create_design_singlecondition(blockiness=1.0, offset=30,
                                                  blocklength=20, deslength=data.shape[0])
regressor, _ = compute_regressor(design, 'spm', numpy.arange(0, len(design_ts)))

# scale the task response and add the same signal to both voxels
regressor *= 50.
data_act = data + numpy.hstack((regressor, regressor))
plt.plot(range(data.shape[0]), data_act[:, 0], color='blue')
plt.plot(range(data.shape[0]), data_act[:, 1], color='red')
print('correlation with activation:', numpy.corrcoef(data_act.T)[0, 1])


correlation with activation: 0.762358680294
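
Why does adding the same signal to both voxels raise their correlation? If the added signal s is (approximately) independent of the noise, then corr(x+s, y+s) = (cov(x,y) + var(s)) / sqrt((var(x) + var(s)) * (var(y) + var(s))), which approaches 1 as var(s) grows. A quick numerical check of this approximation (a sketch; the task regressor is not exactly uncorrelated with the simulated noise, so the match is only approximate):

s = regressor[:, 0]                        # the scaled task signal added to both voxels
C = numpy.cov(numpy.vstack((data.T, s)))   # covariances of the two voxels and s
predicted = (C[0, 1] + C[2, 2]) / numpy.sqrt((C[0, 0] + C[2, 2]) * (C[1, 1] + C[2, 2]))
print('predicted correlation with activation:', predicted)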

How can we address this problem? A general solution is to first fit a general linear model (GLM) to remove the task effect, and then compute the correlation on the residuals.


In [4]:
# design matrix: the task regressor plus an intercept column
X = numpy.vstack((regressor.T, numpy.ones(data.shape[0]))).T
# ordinary least squares via the normal equations
beta_hat = numpy.linalg.inv(X.T.dot(X)).dot(X.T).dot(data_act)
y_est = X.dot(beta_hat)
resid = data_act - y_est
print('correlation of residuals:', numpy.corrcoef(resid.T)[0, 1])


correlation of residuals: -0.652759628313
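
As an aside, forming the normal equations with an explicit matrix inverse works here but can be numerically unstable for ill-conditioned designs; the same ordinary least-squares fit can be obtained with a pseudoinverse (a sketch, equivalent up to numerical precision):

beta_hat_pinv = numpy.linalg.pinv(X).dot(data_act)
resid_pinv = data_act - X.dot(beta_hat_pinv)
print('correlation of residuals (pinv):', numpy.corrcoef(resid_pinv.T)[0, 1])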

What happens if we get the hemodynamic model wrong? Let's use the temporal derivative basis to generate a response that is lagged relative to the canonical HRF; adding a weighted copy of the temporal derivative to the canonical regressor shifts the response in time, to a first-order approximation.


In [5]:
# combine the canonical HRF and its temporal derivative (weights 1 and 0.5) to shift the response in time
regressor_td, _ = compute_regressor(design, 'spm_time', numpy.arange(0, len(design_ts)))
regressor_lagged = regressor_td.dot(numpy.array([1, 0.5])) * 50

In [6]:
plt.plot(regressor_lagged)
plt.plot(regressor)


Out[6]: [figure: lagged regressor overlaid on the canonical regressor]
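
To quantify the lag visible in the plot, a quick check (a sketch) comparing when each regressor first reaches half of its maximum:

rise_canonical = numpy.argmax(regressor[:, 0] > regressor.max() / 2.)
rise_lagged = numpy.argmax(regressor_lagged > regressor_lagged.max() / 2.)
print('canonical regressor reaches half-max at t =', rise_canonical)
print('lagged regressor reaches half-max at t =', rise_lagged)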

In [8]:
# add the lagged activation to both voxels, then refit the canonical-HRF design
data_lagged = data + numpy.vstack((regressor_lagged, regressor_lagged)).T

beta_hat_lag = numpy.linalg.inv(X.T.dot(X)).dot(X.T).dot(data_lagged)
plt.subplot(211)
y_est_lag = X.dot(beta_hat_lag)
plt.plot(y_est_lag)
plt.plot(data_lagged)
resid = data_lagged - y_est_lag
print('correlation of residuals:', numpy.corrcoef(resid.T)[0, 1])
plt.subplot(212)
plt.plot(resid)


correlation of residuals: 0.870128173351
Out[8]: [figure: canonical-model fit and lagged data (top); residuals (bottom)]

Because the canonical regressor does not fully capture the lagged response, unmodeled task signal remains in both voxels' residuals and inflates their correlation. Let's see whether a more flexible basis set, such as a finite impulse response (FIR) model, can remove this task-induced correlation.


In [9]:
regressor_fir,_=compute_regressor(design,'fir',numpy.arange(0,len(design_ts)),fir_delays=range(28))

In [10]:
regressor_fir.shape


Out[10]:
(200, 28)
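
There is one column per delay: each column is a copy of the stimulus time course shifted by one additional time point, so the model can fit an arbitrary response shape over a 28-TR window. A minimal sketch of how such a basis could be built by hand (for intuition only; compute_regressor does this for us, and the onset vector below is hypothetical rather than the block design used above):

n = data.shape[0]
onsets = numpy.zeros(n)
onsets[30::50] = 1                       # hypothetical event onsets
fir_basis = numpy.zeros((n, 28))
for d in range(28):
    fir_basis[d:, d] = onsets[:n - d]    # onset vector delayed by d samples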

In [11]:
# refit using the FIR design matrix (one column per delay, plus an intercept)
X_fir = numpy.vstack((regressor_fir.T, numpy.ones(data.shape[0]))).T
beta_hat_fir = numpy.linalg.inv(X_fir.T.dot(X_fir)).dot(X_fir.T).dot(data_lagged)
plt.subplot(211)
y_est_fir = X_fir.dot(beta_hat_fir)
plt.plot(y_est_fir)
plt.plot(data_lagged)
resid = data_lagged - y_est_fir
print('correlation of residuals:', numpy.corrcoef(resid.T)[0, 1])
plt.subplot(212)
plt.plot(resid)


correlation of residuals: -0.683175104691
Out[11]: [figure: FIR-model fit and lagged data (top); residuals (bottom)]
