Graph Recovery Performance in High Dimensions

Assess how well GFGL can recover both the precision-matrix structure and the change-points as a function of problem size. The idea is to fix the number of data points across a set of experiments performed with different numbers of variables. The number of jumps/changepoints is fixed, but the size of these jumps can be varied by constructing the middle-segment precision matrix as a convex combination of the outer two:

$$\Theta^{(2)} = \alpha\,\Theta^{(1)} + (1-\alpha)\,\Theta^{(3)}$$
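
A convex combination of positive-definite matrices is itself positive definite, so the interpolated middle segment is always a valid precision matrix. A minimal NumPy check of this (Theta_1 and Theta_3 are hypothetical stand-ins for the first- and last-segment precision matrices, not graphtime objects):

In [ ]:
import numpy as np

rng = np.random.default_rng(0)
p = 5

# Two random, diagonally loaded (hence positive-definite) precision matrices
A = rng.standard_normal((p, p))
B = rng.standard_normal((p, p))
Theta_1 = A @ A.T + p * np.eye(p)
Theta_3 = B @ B.T + p * np.eye(p)

# Interpolated middle-segment precision matrix
alpha = 0.25
Theta_2 = alpha * Theta_1 + (1 - alpha) * Theta_3

# Smallest eigenvalue is positive, so Theta_2 is a valid precision matrix
print(np.linalg.eigvalsh(Theta_2).min() > 0)  # True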

For each experiment I fix:

  • T=180 // Number of data points
  • cps = [60,120] // Changepoint positions
  • s = 0.2*p // Number of active edges

and vary (the resulting experiment grid is enumerated in the sketch after this list):

  • p = [5,10,20] // Dimension of problem
  • alpha = [0.15, 0.25, 0.35, 0.5] // Size of jump
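
Together these settings define a 3 × 4 grid of twelve experiments. A small sketch of how the grid could be enumerated (the variable names ps and alphas are illustrative, not part of graphtime):

In [ ]:
from itertools import product

T = 180                            # number of data points
cps = [60, 120]                    # changepoint positions
ps = [5, 10, 20]                   # problem dimensions
alphas = [0.15, 0.25, 0.35, 0.5]   # jump sizes

for p, alpha in product(ps, alphas):
    s = int(0.2 * p)               # active edges scale with dimension
    print(f"p={p:2d}  s={s}  alpha={alpha}")
# 12 configurations in total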

In [4]:
import networkx as nx
import matplotlib.pyplot as plt
import sys
sys.path.append('../')

from graphtime import GroupFusedGraphLasso
from graphtime.simulate import DynamicGraphicalModel, ErdosRenyiPrecisionGraph
from graphtime.utils import get_change_points, plot_data_with_cps

In [6]:
for p in [5, 10, 20]:
    s = int(0.2 * p)  # Number of active edges for this problem size
    for alpha in [0.15, 0.25, 0.35, 0.5]:
        # Simulate an Erdos-Renyi precision graph for each of the three segments
        DGS = DynamicGraphicalModel(p, seed=2)
        DGS.graphs = [ErdosRenyiPrecisionGraph(DGS.n_vertices, s)
                      for k in range(3)]

In [ ]:
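
The empty cell above would presumably form the interpolated middle-segment precision matrix and draw the T = 180 observations segment by segment before fitting GFGL. A minimal NumPy sketch of those two steps, assuming Theta_1 and Theta_3 hold the precision matrices of the first and last segments (sample_piecewise and its arguments are hypothetical names, not the graphtime API):

In [ ]:
import numpy as np

def sample_piecewise(Theta_1, Theta_3, alpha, T=180, cps=(60, 120), seed=2):
    """Draw T observations from three Gaussian regimes with changepoints at cps.

    The middle-segment precision matrix is the convex combination
    alpha * Theta_1 + (1 - alpha) * Theta_3 described above.
    """
    rng = np.random.default_rng(seed)
    Theta_2 = alpha * Theta_1 + (1 - alpha) * Theta_3
    Thetas = [Theta_1, Theta_2, Theta_3]
    bounds = [0, *cps, T]  # segment boundaries: [0, 60, 120, 180]
    p = Theta_1.shape[0]
    segments = [
        rng.multivariate_normal(np.zeros(p), np.linalg.inv(Theta), size=hi - lo)
        for Theta, lo, hi in zip(Thetas, bounds[:-1], bounds[1:])
    ]
    return np.vstack(segments)  # shape (T, p)

The resulting data matrix could then be passed to GroupFusedGraphLasso, with the estimated changepoints extracted via the get_change_points helper imported above and compared against cps = [60, 120].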