scona

scona is a tool to perform network analysis over correlation networks of brain regions. This tutorial will go through the basic functionality of scona, taking us from our inputs (a matrix of structural regional measures over subjects) to a report of local network measures for each brain region, and network-level comparisons against a cohort of random graphs with the same degree sequence.


In [2]:
import numpy as np
import networkx as nx
import scona as scn
import scona.datasets as datasets

Importing data

A scona analysis starts with four inputs.

  • regional_measures A pandas DataFrame with subjects as rows. The columns should include structural measures for each brain region, as well as any subject-wise covariates.
  • names A list of names of the brain regions. This is used to specify which columns of the regional_measures matrix you want to correlate over.
  • covars (optional) A list of your covariates. This is used to specify which columns of regional_measures you wish to correct for.
  • centroids A list of tuples representing the Cartesian coordinates of the brain regions, in the same order as names, so that coordinates are assigned to the correct regions. The coordinates are expected to obey the convention that the x=0 plane separates the left and right hemispheres of the brain. An illustrative sketch of these four inputs is shown below.
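
For example, if you were assembling these inputs yourself rather than using the bundled dataset, they might look something like this (a purely illustrative sketch; the column names, region names and coordinates are made up):

import pandas as pd

regional_measures = pd.DataFrame({
    'age': [14.2, 15.8, 16.1],            # a subject-wise covariate
    'lh_region_a': [2.9, 3.1, 3.0],       # one structural measure per region per column
    'rh_region_a': [2.8, 3.2, 3.1],
})
names = ['lh_region_a', 'rh_region_a']    # columns to correlate over
covars = ['age']                          # columns to correct for
centroids = [(-30.0, -20.0, 18.0),        # one (x, y, z) tuple per region,
             (30.0, -20.0, 18.0)]         # in the same order as names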

In [3]:
# Read in sample data from the NSPN WhitakerVertes PNAS 2016 paper.
df, names, covars, centroids = datasets.NSPN_WhitakerVertes_PNAS2016.import_data()

In [4]:
df.head()


Out[4]:
Unnamed: 0 nspn_id occ centre study_primary age_scan sex male age_bin mri_centre ... rh_supramarginal_part5 rh_supramarginal_part6 rh_supramarginal_part7 rh_frontalpole_part1 rh_temporalpole_part1 rh_transversetemporal_part1 rh_insula_part1 rh_insula_part2 rh_insula_part3 rh_insula_part4
0 0 10356 0 Cambridge 2K_Cohort 20.761 Female 0.0 4 WBIC ... 2.592 2.841 2.318 2.486 3.526 2.638 3.308 2.583 3.188 3.089
1 1 10702 0 Cambridge 2K_Cohort 16.055 Male 1.0 2 WBIC ... 3.448 3.283 2.740 3.225 4.044 3.040 3.867 2.943 3.478 3.609
2 2 10736 0 Cambridge 2K_Cohort 14.897 Female 0.0 1 WBIC ... 3.526 3.269 3.076 3.133 3.900 2.914 3.894 2.898 3.720 3.580
3 3 10778 0 Cambridge 2K_Cohort 20.022 Female 0.0 4 WBIC ... 2.830 2.917 2.647 2.796 3.401 3.045 3.138 2.739 2.833 3.349
4 4 10794 0 Cambridge 2K_Cohort 14.656 Female 0.0 1 WBIC ... 2.689 3.294 2.820 2.539 2.151 2.734 2.791 2.935 3.538 3.403

5 rows × 324 columns

Create a correlation matrix

We calculate residuals of the matrix df for the columns of names, correcting for the columns in covars.


In [5]:
df_res = scn.create_residuals_df(df, names, covars)
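
Conceptually, this corresponds to regressing each regional column on the covariates (plus an intercept) and keeping the residuals. The sketch below is only an illustration of that idea, not scona's exact implementation:

import numpy as np
import pandas as pd

def rough_residuals(df, names, covars):
    # build a design matrix from the covariate columns plus an intercept
    X = np.column_stack([df[covars].values, np.ones(len(df))]) if covars else np.ones((len(df), 1))
    out = {}
    for col in names:
        y = df[col].values
        coefs, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares fit
        out[col] = y - X.dot(coefs)                     # keep the residuals
    return pd.DataFrame(out, index=df.index)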

In [6]:
df_res


Out[6]:
lh_bankssts_part1 lh_bankssts_part2 lh_caudalanteriorcingulate_part1 lh_caudalmiddlefrontal_part1 lh_caudalmiddlefrontal_part2 lh_caudalmiddlefrontal_part3 lh_caudalmiddlefrontal_part4 lh_cuneus_part1 lh_cuneus_part2 lh_entorhinal_part1 ... rh_supramarginal_part5 rh_supramarginal_part6 rh_supramarginal_part7 rh_frontalpole_part1 rh_temporalpole_part1 rh_transversetemporal_part1 rh_insula_part1 rh_insula_part2 rh_insula_part3 rh_insula_part4
0 -0.016677 -0.036253 0.035892 -0.004586 0.040114 -0.005168 -0.252694 -0.170037 -0.192492 -0.119414 ... -0.421125 -0.123926 -0.37903 -0.436609 -0.143411 -0.103404 -0.321707 -0.159219 -0.19032 -0.393141
1 0.280323 0.354747 0.482892 0.697414 0.406114 0.445832 0.390306 0.326963 0.389508 0.340586 ... 0.434875 0.318074 0.04297 0.302391 0.374589 0.298596 0.237293 0.200781 0.09968 0.126859
2 0.168323 0.081747 0.365892 0.412414 0.283114 0.187832 0.576306 -0.061037 -0.062492 0.374586 ... 0.512875 0.304074 0.37897 0.210391 0.230589 0.172596 0.264293 0.155781 0.34168 0.097859
3 -0.086677 -0.249253 -0.404108 -0.362586 -0.046886 -0.154168 -0.156694 0.029963 -0.246492 -0.009414 ... -0.183125 -0.047926 -0.05003 -0.126609 -0.268411 0.303596 -0.491707 -0.003219 -0.54532 -0.133141
4 0.400323 0.136747 0.125892 -0.218586 -1.003886 -0.142168 -0.752694 -0.091037 -0.550492 -1.223414 ... -0.324125 0.329074 0.12297 -0.383609 -1.518411 -0.007404 -0.838707 0.192781 0.15968 -0.079141
5 0.010323 -0.090253 -0.248108 -0.131586 -0.080886 -0.110168 -0.142694 0.146963 -0.047492 0.155586 ... -0.097125 -0.074926 -0.12803 -0.114609 -0.241411 0.209596 0.326293 0.082781 0.26868 0.099859
6 -0.118677 0.133747 0.089892 0.047414 0.268114 0.017832 -0.037694 0.061963 0.028508 -0.214414 ... 0.080875 0.027074 0.00797 -0.067609 0.310589 -0.107404 -0.033707 -0.015219 -0.09132 0.048859
7 0.106323 0.043747 0.355892 0.174414 0.249114 0.068832 -0.017694 0.120963 0.095508 0.333586 ... 0.427875 0.134074 0.25697 -0.229609 -0.214411 0.329596 -0.355707 0.015781 -0.02732 -0.118141
8 0.433323 0.037747 0.191892 0.103414 0.206114 -0.091168 0.210306 -0.105037 0.009508 0.388586 ... 0.103875 0.182074 0.31697 -0.083609 0.176589 0.062596 0.404293 0.024781 -0.05932 0.041859
9 0.191323 0.081747 0.093892 0.100414 0.012114 0.180832 0.123306 -0.062037 -0.180492 -0.323414 ... -0.197125 -0.022926 -0.17103 0.527391 0.004589 0.123596 0.413293 -0.185219 -0.21632 0.065859
10 -1.036677 -0.759253 -0.621108 0.058414 -0.298886 0.005832 -0.419694 0.348963 0.232508 -0.997414 ... -0.463125 0.153074 0.01097 -0.540609 -0.149411 0.290596 -0.078707 0.073781 0.30068 0.333859
11 0.183323 0.155747 -0.100108 0.186414 -0.036886 0.206832 0.154306 0.020963 -0.053492 0.004586 ... 0.149875 -0.113926 -0.11203 0.015391 0.233589 0.134596 -0.206707 -0.039219 -0.35132 0.220859
12 -0.058677 -0.222253 -0.230108 0.373414 0.219114 0.234832 0.018306 0.026963 0.090508 0.301586 ... 0.102875 0.059074 -0.05203 0.061391 -0.193411 0.127596 0.292293 -0.027219 0.22968 0.481859
13 0.200323 -0.048253 0.032892 -0.106586 0.048114 -0.502168 -0.108694 0.127963 -0.104492 -0.043414 ... 0.634875 0.150074 0.30797 0.413391 -0.463411 0.388596 0.216293 0.074781 0.23568 0.192859
14 0.221323 0.374747 0.107892 -0.115586 0.152114 -0.089168 0.032306 0.180963 0.287508 -0.422414 ... -0.086125 -0.217926 -0.06003 0.198391 0.171589 0.026596 0.124293 0.050781 -0.02332 -0.113141
15 0.075323 -0.141253 -0.161108 0.162414 -0.100886 -0.026168 -0.145694 0.209963 -0.070492 -0.518414 ... 0.266875 0.019074 -0.21203 0.296391 -0.047411 -0.364404 -0.380707 0.210781 0.23068 0.030859
16 -0.047677 0.226747 -0.067108 0.019414 0.165114 -0.168168 -0.124694 -0.312037 -0.035492 0.112586 ... 0.056875 -0.091926 -0.03103 0.301391 0.392589 0.115596 0.027293 0.234781 0.11168 0.111859
17 0.055323 -0.055253 0.164892 -0.322586 -0.029886 -0.139168 -0.103694 -0.004037 0.076508 0.069586 ... -0.157125 -0.117926 0.03697 -0.286609 0.243589 0.162596 0.045293 0.148781 0.22168 0.050859
18 -0.327677 0.128747 -0.260108 0.251414 -0.100886 0.145832 -0.091694 0.002963 0.038508 0.231586 ... 0.261875 0.341074 0.20797 0.031391 0.052589 -0.204404 0.224293 -0.111219 -0.08532 0.079859
19 -0.012677 -0.090253 0.499892 -0.071586 0.111114 0.023832 0.103306 -0.278037 0.053508 0.236586 ... -0.003125 0.033074 -0.08003 -0.083609 0.069589 0.085596 0.416293 -0.049219 0.27268 0.344859
20 0.052323 -0.096253 0.879892 -0.218586 -0.087886 -0.296168 -0.231694 -0.080037 -0.212492 -0.596414 ... 0.255875 -0.167926 -0.18103 0.095391 -0.579411 -0.360404 0.013293 -0.129219 -0.25632 0.101859
21 -0.028677 -0.236253 -0.091108 -0.280586 -0.182886 -0.320168 0.036306 0.103963 0.198508 -0.637414 ... -0.242125 -0.060926 -0.35703 -0.273609 -0.645411 -0.026404 -0.240707 0.046781 0.16268 0.029859
22 0.235323 0.102747 0.235892 0.103414 0.131114 -0.083168 0.164306 0.311963 0.240508 -0.716414 ... 0.374875 0.073074 -0.16303 0.303391 0.011589 0.207596 0.238293 0.056781 -0.27732 0.288859
23 0.120323 0.066747 -0.144108 0.195414 0.097114 0.122832 0.139306 0.166963 0.033508 0.176586 ... 0.321875 0.009074 -0.23003 0.320391 0.749589 0.300596 0.069293 0.103781 0.02568 -0.048141
24 0.084323 0.094747 -0.203108 -0.025586 -0.042886 0.127832 0.332306 0.005963 0.178508 -0.394414 ... 0.356875 0.079074 0.10597 0.071391 -1.264411 0.001596 -0.204707 -0.084219 -0.05932 -0.320141
25 0.098323 0.030747 0.304892 0.160414 0.197114 0.275832 0.115306 0.185963 0.222508 0.053586 ... -0.141125 0.200074 -0.05003 0.162391 0.155589 -0.153404 0.403293 0.026781 0.33068 0.282859
26 0.024323 -0.112253 0.335892 0.332414 0.285114 0.348832 0.333306 0.214963 0.120508 0.118586 ... 0.263875 0.127074 0.17397 0.191391 0.249589 0.214596 0.029293 0.233781 0.14168 0.342859
27 0.215323 -0.008253 -0.041108 0.083414 0.159114 -0.089168 0.300306 0.156963 0.088508 0.425586 ... -0.098125 0.280074 0.28197 0.000391 0.808589 0.006596 -0.036707 0.331781 0.16468 -0.137141
28 -0.170677 -0.324253 -0.035108 -0.098586 -0.178886 0.071832 -0.251694 -0.172037 -0.109492 0.553586 ... -0.119125 0.094074 0.13297 -0.610609 -0.063411 0.001596 0.141293 0.106781 0.47968 0.154859
29 -0.102677 -0.035253 -0.117108 0.318414 0.114114 0.035832 0.131306 -0.108037 0.029508 0.241586 ... 0.008875 -0.023926 -0.34803 -0.158609 0.240589 -0.392404 0.222293 -0.063219 -0.24132 -0.042141
... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ...
267 -0.604677 -0.283253 -0.478108 -0.215586 -0.244886 -0.302168 -0.239694 -0.234037 -0.143492 -0.014414 ... -0.118125 -0.104926 -0.22403 -0.305609 -0.048411 -0.260404 -0.130707 -0.320219 -0.17832 -0.153141
268 -0.090677 -0.174253 -0.005108 -0.201586 -0.215886 -0.309168 -0.170694 -0.136037 -0.316492 -0.344414 ... 0.110875 -0.069926 -0.21103 -0.029609 -0.027411 -0.401404 -0.151707 0.014781 0.04168 -0.192141
269 -0.357677 -0.119253 0.010892 -0.054586 0.006114 0.097832 -0.164694 -0.214037 -0.363492 0.059586 ... -0.299125 -0.240926 -0.48403 -0.392609 -0.081411 -0.217404 -0.334707 -0.190219 0.18768 0.153859
270 0.166323 0.225747 0.357892 0.109414 -0.017886 0.109832 0.233306 -0.187037 -0.090492 -0.414414 ... 0.089875 -0.149926 0.09397 0.096391 -0.332411 -0.022404 0.040293 0.066781 0.14768 0.005859
271 -0.056677 0.032747 0.074892 -0.072586 0.140114 0.023832 0.040306 -0.314037 -0.106492 -0.014414 ... -0.155125 -0.173926 -0.16003 0.053391 0.091589 0.107596 0.312293 0.226781 -0.02432 -0.068141
272 -0.159677 -0.130253 -0.410108 -0.154586 -0.204886 -0.074168 -0.299694 -0.113037 -0.417492 0.117586 ... -0.390125 -0.220926 -0.15603 -0.479609 -0.070411 -0.258404 -0.035707 -0.082219 -0.24732 -0.106141
273 -0.044677 -0.066253 -0.054108 0.289414 0.055114 0.143832 0.033306 -0.118037 -0.057492 -0.447414 ... -0.160125 -0.059926 -0.21903 -0.042609 -0.973411 -0.063404 -0.117707 0.005781 0.16668 -0.132141
274 0.591323 0.204747 0.964892 0.083414 0.247114 0.060832 0.243306 0.269963 0.017508 -0.211414 ... 0.078875 -0.030926 -0.10203 -0.002609 0.377589 0.035596 -0.297707 0.018781 -0.00032 0.275859
275 0.290323 0.104747 0.129892 -0.112586 0.008114 0.015832 0.017306 0.164963 0.092508 0.136586 ... 0.151875 -0.044926 0.03697 -0.131609 0.205589 0.024596 0.012293 -0.011219 -0.04732 0.044859
276 -0.197677 -0.123253 -0.140108 -0.063586 -0.268886 -0.137168 -0.176694 -0.029037 -0.158492 -1.103414 ... -0.183125 -0.317926 -0.23103 -0.011609 -0.146411 -0.106404 -0.307707 -0.149219 -0.46132 -0.449141
277 -0.115677 -0.187253 -0.163108 0.037414 -0.177886 0.092832 -0.158694 0.042963 -0.052492 0.311586 ... -0.040125 -0.144926 0.04497 -0.074609 0.241589 -0.098404 0.155293 -0.269219 -0.11532 0.224859
278 -0.020677 0.037747 0.184892 -0.231586 0.115114 -0.216168 0.078306 -0.119037 0.026508 0.500586 ... 0.208875 -0.269926 -0.07503 0.109391 0.752589 -0.303404 -0.075707 -0.149219 -0.27532 -0.155141
279 -0.197677 -0.128253 -0.123108 0.221414 0.141114 -0.052168 0.001306 -0.256037 -0.088492 -0.018414 ... -0.179125 0.056074 -0.25603 -0.312609 0.540589 -0.281404 0.288293 -0.092219 -0.17332 0.202859
280 0.040323 0.115747 -0.322108 -0.163586 0.034114 -0.075168 -0.051694 -0.072037 0.161508 -0.070414 ... 0.193875 -0.043926 -0.00603 -0.025609 0.129589 0.148596 -0.181707 -0.166219 0.11968 0.239859
281 -0.081677 -0.254253 -0.144108 -0.035586 -0.221886 0.103832 -0.140694 -0.201037 -0.285492 -0.566414 ... 0.021875 -0.259926 -0.06303 -0.563609 -0.486411 -0.403404 0.208293 -0.200219 -0.38132 -0.004141
282 0.304323 0.275747 0.050892 0.406414 -0.260886 0.197832 -0.060694 0.071963 0.272508 0.086586 ... -0.065125 -0.018926 -0.05403 -0.079609 -0.345411 0.003596 0.190293 0.157781 -0.07332 -0.016141
283 -0.305677 -0.238253 -0.178108 -0.360586 -0.077886 -0.201168 -0.061694 -0.185037 -0.128492 -0.393414 ... 0.182875 0.455074 0.07497 -0.132609 -0.289411 -0.127404 0.108293 -0.408219 -0.02232 -0.117141
284 -0.027677 -0.158253 0.325892 -0.156586 -0.180886 -0.258168 -0.495694 -0.088037 -0.003492 -0.279414 ... -0.426125 -0.618926 -0.58403 -0.095609 0.234589 0.208596 -0.040707 -0.310219 0.08468 -0.363141
285 -0.216677 -0.194253 -0.189108 -0.320586 -0.349886 -0.364168 -0.040694 -0.165037 -0.418492 -0.879414 ... 0.286875 0.135074 -0.15703 -0.095609 -0.683411 0.179596 -0.602707 -0.162219 -0.05432 -0.057141
286 -0.073677 -0.038253 0.199892 0.169414 0.149114 -0.143168 -0.025694 -0.127037 0.317508 -0.432414 ... 0.125875 -0.206926 -0.10503 -0.109609 -0.489411 0.110596 0.186293 -0.040219 -0.09332 0.113859
287 0.136323 -0.043253 -0.317108 -0.010586 -0.026886 -0.001168 0.083306 -0.012037 -0.007492 -0.286414 ... 0.049875 0.157074 -0.14303 0.171391 0.224589 -0.221404 0.475293 -0.159219 -0.05332 -0.012141
288 -0.147677 -0.059253 0.028892 -0.108586 -0.090886 0.235832 -0.101694 0.177963 0.338508 0.024586 ... -0.163125 0.239074 -0.00603 0.187391 0.066589 0.212596 -0.525707 0.003781 0.26568 -0.061141
289 -0.209677 0.044747 0.043892 -0.130586 -0.178886 -0.134168 -0.152694 -0.034037 -0.011492 0.174586 ... 0.039875 -0.199926 -0.02903 -0.080609 0.426589 -0.293404 -0.201707 -0.118219 -0.17732 -0.079141
290 -0.403677 0.044747 -0.360108 -0.476586 -0.218886 -0.099168 0.142306 -0.432037 -0.274492 0.156586 ... -0.027125 -0.110926 -0.17803 0.067391 0.109589 -0.089404 -0.283707 -0.212219 -0.07232 0.050859
291 -0.053677 -0.079253 0.122892 0.079414 -0.123886 0.224832 0.018306 0.144963 0.000508 0.409586 ... -0.114125 0.290074 -0.06003 -0.331609 0.194589 0.309596 -0.296707 0.559781 0.36268 -0.069141
292 -0.065677 -0.112253 -0.013108 -0.233586 0.178114 -0.049168 -0.103694 0.024963 0.088508 -0.169414 ... -0.417125 -0.198926 0.01897 0.430391 0.005589 0.126596 0.327293 0.250781 0.18468 0.247859
293 0.014323 0.237747 -0.093108 0.100414 0.068114 -0.174168 -0.113694 0.013963 0.121508 -0.030414 ... 0.227875 -0.383926 -0.02703 0.099391 -0.410411 -0.010404 -0.075707 0.181781 0.29768 0.126859
294 0.066323 0.071747 -0.301108 -0.021586 0.082114 -0.131168 -0.339694 -0.174037 -0.332492 0.478586 ... 0.329875 0.022074 0.23197 -0.335609 0.068589 -0.007404 0.165293 0.018781 -0.20932 0.309859
295 0.053323 -0.269253 0.174892 -0.041586 -0.318886 -0.093168 -0.199694 0.185963 -0.272492 0.304586 ... -0.215125 -0.466926 0.03697 0.109391 -0.486411 -0.272404 -0.329707 -0.204219 0.12468 -0.165141
296 -0.206677 0.044747 0.205892 -0.159586 0.087114 -0.107168 -0.191694 0.220963 -0.106492 -0.066414 ... -0.022125 0.280074 0.14397 0.225391 0.312589 0.366596 0.028293 0.178781 0.47468 -0.205141

297 rows × 308 columns

Now we create a correlation matrix over the columns of df_res


In [7]:
M = scn.create_corrmat(df_res, method='pearson')
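
Because df_res contains only the residualised regional columns, the result is broadly comparable to correlating the columns directly with pandas (a rough equivalent for intuition, not necessarily identical to scona's output):

# rough pandas analogue of scn.create_corrmat(df_res, method='pearson')
M_check = df_res.corr(method='pearson')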

Create a weighted graph

A short sidenote on the BrainNetwork class: this is a very lightweight subclass of the networkx.Graph class. This means that any methods you can use on a networkx.Graph object can also be used on a BrainNetwork object, although the reverse is not true. We have added various methods which keep track of measures that have already been calculated; this saves a lot of time later on, when we are dealing with 10^3 random graphs.
All scona measures are implemented in such a way that they can be used on a regular networkx.Graph object. For example, instead of G.threshold(10) you can use scn.threshold_graph(G, 10).
You can also create a BrainNetwork from a networkx.Graph G using scn.BrainNetwork(network=G).
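
For instance, wrapping an arbitrary networkx graph (a minimal, purely illustrative sketch):

# hypothetical example: any existing networkx.Graph can be wrapped as a BrainNetwork
K = nx.erdos_renyi_graph(10, 0.5)
B = scn.BrainNetwork(network=K)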

Initialise a weighted graph G from the correlation matrix M. The parcellation and centroids arguments are used to label nodes with names and coordinates respectively.


In [8]:
G = scn.BrainNetwork(network=M, parcellation=names, centroids=centroids)

Threshold to create a binary graph

We threshold G at cost 10 to create a binary graph with 10% as many edges as the complete graph G. Ordinarily when thresholding one takes the 10% of edges with the highest weight. In our case, because we want the resulting graph to be connected, we calculate a minimum spanning tree first. If you want to omit this step, pass the argument mst=False to threshold. Note that the threshold method does not edit objects in place.


In [9]:
H = G.threshold(10)
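
As noted above, the same thresholding is available as a module-level function, and the minimum spanning tree step can be skipped; a short sketch of both options:

# module-level equivalent of G.threshold(10), usable on any networkx graph
H_alt = scn.threshold_graph(G, 10)
# threshold without first building a minimum spanning tree (connectedness not guaranteed)
H_no_mst = G.threshold(10, mst=False)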

Calculate a nodal summary

calculate_nodal_measures will compute and record the following nodal measures

  • average_dist (if centroids available)
  • total_dist (if centroids available)
  • betweenness
  • closeness
  • clustering coefficient
  • degree
  • interhem (if centroids are available)
  • interhem_proportion (if centroids are available)
  • nodal partition
  • participation coefficient under the partition calculated above
  • shortest_path_length

report_nodal_measures returns the nodal attributes in a DataFrame. Let's try it now.


In [11]:
H.report_nodal_measures().head()


Out[11]:
centroids name x y z
0 [-27.965157, -19.013702, 17.919528] lh_bankssts_part1 -27.9652 -19.0137 17.9195
1 [-14.455663, -13.693461, 13.713674] lh_bankssts_part2 -14.4557 -13.6935 13.7137
2 [-33.906934, -22.284672, -15.821168] lh_caudalanteriorcingulate_part1 -33.9069 -22.2847 -15.8212
3 [-17.305373, -53.431573, -36.017154] lh_caudalmiddlefrontal_part1 -17.3054 -53.4316 -36.0172
4 [-22.265823, -64.366296, -37.674831] lh_caudalmiddlefrontal_part2 -22.2658 -64.3663 -37.6748

Use calculate_nodal_measures to fill in a bunch of nodal measures


In [12]:
H.calculate_nodal_measures()


        Calculating participation coefficient - may take a little while

In [14]:
H.report_nodal_measures().head()


Out[14]:
betweenness centroids closeness clustering degree module name participation_coefficient shortest_path_length x y z
0 0.00824713 [-27.965157, -19.013702, 17.919528] 0.495961 0.3358 47 0 lh_bankssts_part1 0.717067 0.00824713 -27.9652 -19.0137 17.9195
1 0.0124798 [-14.455663, -13.693461, 13.713674] 0.507438 0.278788 55 0 lh_bankssts_part2 0.809587 0.0124798 -14.4557 -13.6935 13.7137
2 0 [-33.906934, -22.284672, -15.821168] 0.336254 1 2 1 lh_caudalanteriorcingulate_part1 0.75 0 -33.9069 -22.2847 -15.8212
3 0.0120765 [-17.305373, -53.431573, -36.017154] 0.525685 0.383485 83 2 lh_caudalmiddlefrontal_part1 0.459864 0.0120765 -17.3054 -53.4316 -36.0172
4 0.0292617 [-22.265823, -64.366296, -37.674831] 0.549195 0.293617 95 2 lh_caudalmiddlefrontal_part2 0.688753 0.0292617 -22.2658 -64.3663 -37.6748
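
The nodal measures are stored as ordinary networkx node attributes (the DataFrame above is built from them), so you can also read them straight off the graph; a small sketch, assuming the attribute names match the report columns:

deg_0 = H.nodes[0]['degree']                     # one node's degree
degrees = nx.get_node_attributes(H, 'degree')    # {node: degree} for every node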

We can also add measures as one might normally add nodal attributes to a networkx graph


In [15]:
nx.set_node_attributes(H, name="hat", values={x: x**2 for x in H.nodes})

These show up in our DataFrame too


In [17]:
H.report_nodal_measures(columns=['name', 'degree', 'hat']).head()


Out[17]:
degree hat name
0 47 0 lh_bankssts_part1
1 55 1 lh_bankssts_part2
2 2 4 lh_caudalanteriorcingulate_part1
3 83 9 lh_caudalmiddlefrontal_part1
4 95 16 lh_caudalmiddlefrontal_part2

Calculate global measures


In [18]:
H.calculate_global_measures()


Out[18]:
{'assortativity': 0.09076922258276784,
 'average_clustering': 0.4498887255891581,
 'average_shortest_path_length': 2.376242649858285,
 'efficiency': 0.47983958611582617,
 'modularity': 0.3828553111606414}

The rich_club method calculates the rich club coefficient of H at each degree (the trailing semicolon just suppresses the lengthy output).


In [20]:
H.rich_club();

Create a GraphBundle

The GraphBundle object is the scona way to handle comparisons across networks. What is it? Essentially it's a Python dictionary with BrainNetwork objects as values.


In [21]:
brain_bundle = scn.GraphBundle([H], ['NSPN_cost=10'])

This creates a dictionary-like object with BrainNetwork H keyed by 'NSPN_cost=10'


In [22]:
brain_bundle


Out[22]:
{'NSPN_cost=10': <scona.classes.BrainNetwork at 0x7f3f247f6fd0>}
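
Because the bundle behaves like a dictionary, individual networks can be retrieved by key (a small sketch):

H_again = brain_bundle['NSPN_cost=10']   # standard dictionary lookup
list(brain_bundle.keys())                # names of all graphs in the bundle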

Now add a series of random graphs created by edge-swap randomisation of H (the graph keyed by 'NSPN_cost=10')


In [23]:
# Note that 10 is not usually a sufficient number of random graphs to do meaningful analysis,
# it is used here for time considerations
brain_bundle.create_random_graphs('NSPN_cost=10', 10)


        Creating 10 random graphs - may take a little while

In [24]:
brain_bundle


Out[24]:
{'NSPN_cost=10': <scona.classes.BrainNetwork at 0x7f3f247f6fd0>,
 'NSPN_cost=10_R1': <scona.classes.BrainNetwork at 0x7f3f138b7f28>,
 'NSPN_cost=10_R10': <scona.classes.BrainNetwork at 0x7f3f134b8828>,
 'NSPN_cost=10_R2': <scona.classes.BrainNetwork at 0x7f3f138b7e80>,
 'NSPN_cost=10_R3': <scona.classes.BrainNetwork at 0x7f3f138b7780>,
 'NSPN_cost=10_R4': <scona.classes.BrainNetwork at 0x7f3f138b7da0>,
 'NSPN_cost=10_R5': <scona.classes.BrainNetwork at 0x7f3f1388ae10>,
 'NSPN_cost=10_R6': <scona.classes.BrainNetwork at 0x7f3f134b8a58>,
 'NSPN_cost=10_R7': <scona.classes.BrainNetwork at 0x7f3f134b84a8>,
 'NSPN_cost=10_R8': <scona.classes.BrainNetwork at 0x7f3f134b8390>,
 'NSPN_cost=10_R9': <scona.classes.BrainNetwork at 0x7f3f134b83c8>}

Report on a GraphBundle

The following method will calculate global measures (if they have not already been calculated) for all of the graphs in brain_bundle and report the results in a DataFrame. We do the same for the rich club coefficients below.


In [25]:
brain_bundle.report_global_measures()


Out[25]:
assortativity average_clustering average_shortest_path_length efficiency modularity
NSPN_cost=10 0.090769 0.449889 2.376243 0.479840 0.382855
NSPN_cost=10_R1 -0.083993 0.222571 2.086721 0.519147 0.000000
NSPN_cost=10_R10 -0.087652 0.222462 2.087821 0.518918 0.000000
NSPN_cost=10_R2 -0.080897 0.230384 2.090423 0.518615 0.000000
NSPN_cost=10_R3 -0.102626 0.221894 2.082745 0.519803 0.000000
NSPN_cost=10_R4 -0.072122 0.221924 2.087842 0.518902 0.000000
NSPN_cost=10_R5 -0.086952 0.232706 2.084754 0.519357 0.000000
NSPN_cost=10_R6 -0.084438 0.223727 2.085325 0.519440 0.000000
NSPN_cost=10_R7 -0.080285 0.224930 2.087525 0.519003 0.000000
NSPN_cost=10_R8 -0.070332 0.225730 2.085135 0.519281 0.000000
NSPN_cost=10_R9 -0.086510 0.227159 2.090782 0.518527 0.000000
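
The report is an ordinary pandas DataFrame, so you can, for instance, compare the observed network with the average over the random graphs (a hypothetical sketch):

report = brain_bundle.report_global_measures()
observed = report.loc['NSPN_cost=10']
random_mean = report.drop('NSPN_cost=10').mean()
print(observed - random_mean)            # difference from the random-graph average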

In [26]:
brain_bundle.report_rich_club()


Out[26]:
0 1 2 3 4 5 6 7 8 9 ... 96 97 98 99 100 101 102 103 104 105
NSPN_cost=10 0.100004 0.103228 0.107244 0.112039 0.117842 0.122398 0.127975 0.131899 0.136820 0.141069 ... 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0
NSPN_cost=10_R1 0.100004 0.103228 0.107175 0.111920 0.117564 0.121950 0.127226 0.131092 0.135825 0.139877 ... 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0
NSPN_cost=10_R10 0.100004 0.103228 0.107175 0.111920 0.117564 0.121950 0.127226 0.131150 0.135885 0.139908 ... 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
NSPN_cost=10_R2 0.100004 0.103228 0.107175 0.111920 0.117564 0.121950 0.127226 0.131092 0.135855 0.139971 ... 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0
NSPN_cost=10_R3 0.100004 0.103228 0.107175 0.111920 0.117589 0.122003 0.127282 0.131150 0.135885 0.139940 ... 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0
NSPN_cost=10_R4 0.100004 0.103228 0.107175 0.111920 0.117564 0.121976 0.127282 0.131150 0.135885 0.139940 ... 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0
NSPN_cost=10_R5 0.100004 0.103228 0.107175 0.111920 0.117564 0.121950 0.127254 0.131150 0.135915 0.139940 ... 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0
NSPN_cost=10_R6 0.100004 0.103228 0.107175 0.111920 0.117564 0.121950 0.127226 0.131092 0.135885 0.139908 ... 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0
NSPN_cost=10_R7 0.100004 0.103228 0.107175 0.111920 0.117564 0.121976 0.127254 0.131150 0.135915 0.139971 ... 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0
NSPN_cost=10_R8 0.100004 0.103228 0.107175 0.111920 0.117564 0.121950 0.127226 0.131121 0.135915 0.139940 ... 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0
NSPN_cost=10_R9 0.100004 0.103228 0.107175 0.111920 0.117564 0.121976 0.127254 0.131150 0.135915 0.140034 ... 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0

11 rows × 106 columns