In [8]:
from IPython.display import Image

CNTK 201B: Hands-On Lab: Image Recognition

This hands-on lab shows how to implement an image recognition task using a convolutional network with the CNTK v2 Python API. You will start with a basic feedforward CNN architecture to classify the CIFAR dataset, then keep adding advanced features to your network. Finally, you will implement VGG-style and residual networks similar to the ones that won the ImageNet competition, but smaller in size.

Introduction

In this hands-on lab, you will practice the following:

  • Understand the subset of the CNTK Python API needed for image classification tasks.
  • Write a custom convolutional network to classify the CIFAR dataset.
  • Modify the network structure by adding:
    • a dropout layer.
    • a batch normalization layer.
  • Implement a VGG-style network.
  • Get an introduction to residual networks (ResNet).
  • Implement and train a ResNet network.

Prerequisites

The CNTK 201A hands-on lab, in which you download and prepare the CIFAR dataset, is a prerequisite for this lab. This tutorial depends on CNTK v2, so before starting this lab you will need to install CNTK v2. Furthermore, all the tutorials in this lab are done in Python, so you will need a basic knowledge of Python.

The CNTK 102 lab is recommended but not a prerequisite for this tutorial. However, a basic understanding of deep learning is needed, and familiarity with basic convolution operations is highly desirable (refer to CNTK tutorial 103D).

Dataset

You will use the CIFAR-10 dataset, from https://www.cs.toronto.edu/~kriz/cifar.html, during this tutorial. The dataset contains 50000 training images and 10000 test images; all images are 32 x 32 x 3. Each image belongs to one of 10 classes, as shown below:


In [9]:
# Figure 1
Image(url="https://cntk.ai/jup/201/cifar-10.png", width=500, height=500)


Out[9]:

The above image is from: https://www.cs.toronto.edu/~kriz/cifar.html

Convolution Neural Network (CNN)

We recommend completing the CNTK 103D tutorial before proceeding. Here is a brief recap of convolutional neural networks (CNNs). A CNN is a feedforward network composed of a series of layers in which the output of one layer is fed to the next layer (there are more complex architectures that skip layers; we will discuss one of those at the end of this lab). Usually, a CNN starts by alternating between convolution and pooling (downsampling) layers, and ends with fully connected layers for the classification part.

Convolution layer

A convolution layer consists of multiple 2D convolution kernels applied to the input image or the previous layer; each convolution kernel outputs a feature map.


In [10]:
# Figure 2
Image(url="https://cntk.ai/jup/201/Conv2D.png")


Out[10]:

The stack of output feature maps is the input to the next layer.


In [11]:
# Figure 3
Image(url="https://cntk.ai/jup/201/Conv2DFeatures.png")


Out[11]:

Y. LeCun, L. Bottou, Y. Bengio, and P. Haffner, "Gradient-Based Learning Applied to Document Recognition," Proceedings of the IEEE, 86(11):2278-2324, November 1998.

In CNTK:

Here is the convolution layer in Python:

def Convolution(filter_shape,        # e.g. (3,3)
                num_filters,         # e.g. 64
                activation,          # relu or None...etc.
                init,                # Random initialization
                pad,                 # True or False
                strides)             # strides e.g. (1,1)
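
To make the operation concrete, here is a minimal NumPy sketch of what a single convolution kernel does as it slides over a 2D input (cross-correlation, no padding, stride 1). This is only an illustration, not CNTK's implementation; `conv2d_valid` is a hypothetical helper name:

```python
import numpy as np

def conv2d_valid(x, kernel):
    # hypothetical helper: one kernel, cross-correlation, no padding, stride 1
    kh, kw = kernel.shape
    h, w = x.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (x[i:i + kh, j:j + kw] * kernel).sum()
    return out

x = np.ones((5, 5))
k = np.ones((3, 3))
y = conv2d_valid(x, k)
# y has shape (3, 3) and every entry is 9.0 (sum over a 3x3 window of ones)
```

Each of the `num_filters` kernels in the layer above produces one such feature map; with `pad=True` the output keeps the input's spatial size.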

Pooling layer

In most CNN vision architectures, each convolution layer is followed by a pooling layer, and the two keep alternating until the fully connected layers.

The purpose of the pooling layer is as follows:

  • Reduce the dimensionality of the previous layer, which speeds up the network.
  • Provide limited translation invariance.

Here is an example of max pooling with a stride of 2:


In [12]:
# Figure 4
Image(url="https://cntk.ai/jup/201/MaxPooling.png", width=400, height=400)


Out[12]:

In CNTK:

Here are the pooling layers in Python:

# Max pooling
def MaxPooling(filter_shape,  # e.g. (3,3)
               strides,       # (2,2)
               pad)           # True or False

# Average pooling
def AveragePooling(filter_shape,  # e.g. (3,3)
                   strides,       # (2,2)
                   pad)           # True or False
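
A minimal NumPy sketch of max pooling (here with 2x2 windows and stride 2, as an illustration only; CNTK's layer generalizes this to arbitrary window shapes and strides):

```python
import numpy as np

def max_pool(x, k=2, s=2):
    # slide a k x k window with stride s and keep the maximum of each window
    h, w = x.shape
    out = np.empty(((h - k) // s + 1, (w - k) // s + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = x[i * s:i * s + k, j * s:j * s + k].max()
    return out

x = np.arange(16, dtype=float).reshape(4, 4)
p = max_pool(x)
# p == [[ 5.,  7.],
#       [13., 15.]]
```

Average pooling is the same operation with `.mean()` in place of `.max()`.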

Dropout layer

The dropout layer takes a probability value as an input; this value is called the dropout rate. Say the dropout rate is 0.5: during training, this layer picks 50% of the nodes from the previous layer at random and drops them out of the network. This behavior helps regularize the network.

N. Srivastava, G. Hinton, A. Krizhevsky, I. Sutskever, and R. Salakhutdinov, "Dropout: A Simple Way to Prevent Neural Networks from Overfitting."

In CNTK:

Dropout layer in Python:

# Dropout
def Dropout(prob)    # dropout rate e.g. 0.5
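
As a sketch of the idea (not CNTK's implementation), here is "inverted" dropout in NumPy: a `rate` fraction of units is zeroed at random, and the survivors are scaled by 1/(1 - rate) so the expected activation is unchanged:

```python
import numpy as np

def dropout(x, rate, rng):
    # "inverted" dropout: zero a `rate` fraction of units at random and
    # scale the survivors by 1/(1 - rate) so the expected value is unchanged
    mask = rng.uniform(size=x.shape) >= rate
    return x * mask / (1.0 - rate)

rng = np.random.RandomState(0)
x = np.ones(10000)
y = dropout(x, 0.5, rng)
# roughly half of y is zero; the mean stays close to 1
```

At evaluation time dropout is disabled and the layer passes its input through unchanged.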

Batch normalization (BN)

Batch normalization is a way to make the input to each layer have zero mean and unit variance. BN helps the network converge faster and keeps the input of each layer centered around zero. BN has two learnable parameters, called gamma and beta, which let the network decide for itself whether the normalized input or the raw input works best.

S. Ioffe and C. Szegedy, "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift."

In CNTK:

Batch normalization layer in Python:

# Batch normalization
def BatchNormalization(map_rank)  # For image map_rank=1
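
A minimal NumPy sketch of the batch-normalization forward pass (training-time batch statistics only; gamma and beta shown as scalars for simplicity, whereas the real layer learns one per feature and also tracks running statistics for evaluation):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # normalize each feature over the minibatch, then rescale and shift
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

x = np.random.RandomState(1).randn(64, 8) * 3.0 + 5.0   # mean ~5, std ~3
y = batch_norm(x, gamma=1.0, beta=0.0)
# per-feature mean of y is ~0 and per-feature std is ~1
```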

Microsoft Cognitive Toolkit (CNTK)

CNTK supports highly flexible computation graphs: each node takes tensors as inputs and produces tensors as the result of its computation. Each node is exposed in the Python API, which gives you the flexibility to create any custom graph; you can also define your own nodes in Python or C++ running on the CPU, GPU, or both.

For deep learning, you can use the low-level API directly, or you can use the CNTK layers API. We will start with the low-level API, then switch to the layers API later in this lab.

So let's first import the needed modules for this lab.


In [82]:
from __future__ import print_function # Use a function definition from future version (say 3.x from 2.7 interpreter)

import matplotlib.pyplot as plt
import math
import numpy as np
import os
import PIL
import sys
import scipy
import scipy.cluster.hierarchy as hier
import scipy.linalg as la              # used later for orthogonal_procrustes
try:
    from urllib.request import urlopen
except ImportError:
    from urllib import urlopen

import cntk as C

In the block below, we check if we are running this notebook in the CNTK internal test machines by looking for environment variables defined there. We then select the right target device (GPU vs CPU) to test this notebook. In other cases, we use CNTK's default policy to use the best available device (GPU, if available, else CPU).


In [14]:
if 'TEST_DEVICE' in os.environ:
    if os.environ['TEST_DEVICE'] == 'cpu':
        C.device.try_set_default_device(C.device.cpu())
    else:
        C.device.try_set_default_device(C.device.gpu(0))

In [15]:
# Figure 5
Image(url="https://cntk.ai/jup/201/CNN.png")


Out[15]:

Now that we have imported the needed modules, let's implement our first CNN, shown in Figure 5 above, using the CNTK layers API:


In [59]:
def create_basic_model(input, out_dims):
    with C.layers.default_options(init=C.glorot_uniform(), activation=C.relu):
        net = C.layers.Convolution((5,5), 32, pad=True, name='filters0')(input)
        net = C.layers.MaxPooling((3,3), strides=(2,2))(net)

        net = C.layers.Convolution((5,5), 32, pad=True, name='filters1')(net)
        net = C.layers.MaxPooling((3,3), strides=(2,2))(net)

        net = C.layers.Convolution((5,5), 64, pad=True, name='filters2')(net)
        net = C.layers.MaxPooling((3,3), strides=(2,2))(net)
    
        net = C.layers.Dense(64)(net)
        net = C.layers.Dense(out_dims, activation=None)(net)
    
    return net
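
A quick sanity check of the shapes in this model: the convolutions use `pad=True`, so they preserve spatial size, while each (3,3)-window, stride-2 pooling (assuming CNTK's pooling default of no padding) maps size n to floor((n - 3) / 2) + 1. The 32x32 input therefore shrinks to 15, then 7, then 3 before the dense layers:

```python
def pool_out(n, kernel=3, stride=2):
    # unpadded pooling: output size = floor((n - kernel) / stride) + 1
    return (n - kernel) // stride + 1

size = 32
for _ in range(3):          # the three MaxPooling layers above
    size = pool_out(size)
# size goes 32 -> 15 -> 7 -> 3, so the last conv output is 3 x 3 x 64
```

This matches the parameter count reported below: the first Dense layer sees 3 * 3 * 64 = 576 inputs.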

To train the above model we need two things:

  • Read the training images and their corresponding labels.
  • Define a cost function, compute the cost for each mini-batch and update the model weights according to the cost value.

To read the data in CNTK, we will use CNTK readers, which handle data augmentation and can fetch data in parallel.

Example of a map text file:

S:\data\CIFAR-10\train\00001.png    9
S:\data\CIFAR-10\train\00002.png    9
S:\data\CIFAR-10\train\00003.png    4
S:\data\CIFAR-10\train\00004.png    1
S:\data\CIFAR-10\train\00005.png    1
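
Each line of the map file is just an image path and an integer class label separated by whitespace (a tab in this sketch). A tiny illustration of building such a file's contents, using hypothetical relative paths (real map files use the paths produced by tutorial 201A):

```python
# hypothetical relative paths; each row is (image path, class label)
rows = [("train/00001.png", 9), ("train/00002.png", 9), ("train/00003.png", 4)]
map_text = "".join("{}\t{}\n".format(path, label) for path, label in rows)
# map_text holds one "<path>\t<label>" line per image
```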

In [17]:
# Determine the data path for testing
# Check for an environment variable defined in CNTK's test infrastructure
envvar = 'CNTK_EXTERNAL_TESTDATA_SOURCE_DIRECTORY'
def is_test(): return envvar in os.environ

if is_test():
    data_path = os.path.join(os.environ[envvar],'Image','CIFAR','v0','tutorial201')
    data_path = os.path.normpath(data_path)
else:
    data_path = os.path.join('data', 'CIFAR-10')

# model dimensions
image_height = 32
image_width  = 32
num_channels = 3
num_classes  = 10

import cntk.io.transforms as xforms 
#
# Define the reader for both training and evaluation action.
#
def create_reader(map_file, mean_file, train):
    print("Reading map file:", map_file)
    print("Reading mean file:", mean_file)
    
    if not os.path.exists(map_file) or not os.path.exists(mean_file):
        raise RuntimeError("This tutorial depends on tutorial 201A; please run 201A first.")

    # transformation pipeline for the features has jitter/crop only when training
    transforms = []
    # train uses data augmentation (translation only)
    if train:
        transforms += [
            xforms.crop(crop_type='randomside', side_ratio=0.8) 
        ]
    transforms += [
        xforms.scale(width=image_width, height=image_height, channels=num_channels, interpolations='linear'),
        xforms.mean(mean_file)
    ]
    # deserializer
    return C.io.MinibatchSource(C.io.ImageDeserializer(map_file, C.io.StreamDefs(
        features = C.io.StreamDef(field='image', transforms=transforms), # first column in map file is referred to as 'image'
        labels   = C.io.StreamDef(field='label', shape=num_classes)      # and second as 'label'
    )))

In [18]:
# Create the train and test readers
reader_train = create_reader(os.path.join(data_path, 'train_map.txt'), 
                             os.path.join(data_path, 'CIFAR-10_mean.xml'), True)
reader_test  = create_reader(os.path.join(data_path, 'test_map.txt'), 
                             os.path.join(data_path, 'CIFAR-10_mean.xml'), False)


Reading map file: data\CIFAR-10\train_map.txt
Reading mean file: data\CIFAR-10\CIFAR-10_mean.xml
Reading map file: data\CIFAR-10\test_map.txt
Reading mean file: data\CIFAR-10\CIFAR-10_mean.xml

Now let us write the training and validation loop.


In [19]:
#
# Train and evaluate the network.
#
def train_and_evaluate(reader_train, reader_test, max_epochs, model_func):
    # Input variables denoting the features and label data
    input_var = C.input_variable((num_channels, image_height, image_width))
    label_var = C.input_variable((num_classes))

    # Normalize the input
    feature_scale = 1.0 / 256.0
    input_var_norm = C.element_times(feature_scale, input_var)
    
    # apply model to input
    z = model_func(input_var_norm, out_dims=10)

    #
    # Training action
    #

    # loss and metric
    ce = C.cross_entropy_with_softmax(z, label_var)
    pe = C.classification_error(z, label_var)

    # training config
    epoch_size     = 50000
    minibatch_size = 64

    # Set training parameters
    lr_per_minibatch       = C.learning_rate_schedule([0.01]*10 + [0.003]*10 + [0.001], 
                                                      C.UnitType.minibatch, epoch_size)
    momentum_time_constant = C.momentum_as_time_constant_schedule(-minibatch_size/np.log(0.9))
    l2_reg_weight          = 0.001
    
    # trainer object
    learner = C.momentum_sgd(z.parameters, 
                             lr = lr_per_minibatch, 
                             momentum = momentum_time_constant, 
                             l2_regularization_weight=l2_reg_weight)
    progress_printer = C.logging.ProgressPrinter(tag='Training', num_epochs=max_epochs)
    trainer = C.Trainer(z, (ce, pe), [learner], [progress_printer])

    # define mapping from reader streams to network inputs
    input_map = {
        input_var: reader_train.streams.features,
        label_var: reader_train.streams.labels
    }

    C.logging.log_number_of_parameters(z) ; print()

    # perform model training
    batch_index = 0
    plot_data = {'batchindex':[], 'loss':[], 'error':[]}
    for epoch in range(max_epochs):       # loop over epochs
        sample_count = 0
        while sample_count < epoch_size:  # loop over minibatches in the epoch
            data = reader_train.next_minibatch(min(minibatch_size, epoch_size - sample_count), 
                                               input_map=input_map) # fetch minibatch.
            trainer.train_minibatch(data)                                   # update model with it

            sample_count += data[label_var].num_samples                     # count samples processed so far
            
            # For visualization...            
            plot_data['batchindex'].append(batch_index)
            plot_data['loss'].append(trainer.previous_minibatch_loss_average)
            plot_data['error'].append(trainer.previous_minibatch_evaluation_average)
            
            batch_index += 1
        trainer.summarize_training_progress()
        
    #
    # Evaluation action
    #
    epoch_size     = 10000
    minibatch_size = 16

    # process minibatches and evaluate the model
    metric_numer    = 0
    metric_denom    = 0
    sample_count    = 0
    minibatch_index = 0

    while sample_count < epoch_size:
        current_minibatch = min(minibatch_size, epoch_size - sample_count)

        # Fetch the next test minibatch.
        data = reader_test.next_minibatch(current_minibatch, input_map=input_map)

        # Evaluate the minibatch and accumulate the weighted error metric.
        metric_numer += trainer.test_minibatch(data) * current_minibatch
        metric_denom += current_minibatch

        # Keep track of the number of samples processed so far.
        sample_count += data[label_var].num_samples
        minibatch_index += 1

    print("")
    print("Final Results: Minibatch[1-{}]: errs = {:0.1f}% * {}".format(minibatch_index+1, (metric_numer*100.0)/metric_denom, metric_denom))
    print("")
    
    # Visualize training result:
    window_width            = 32
    loss_cumsum             = np.cumsum(np.insert(plot_data['loss'], 0, 0)) 
    error_cumsum            = np.cumsum(np.insert(plot_data['error'], 0, 0)) 

    # Moving average.
    plot_data['batchindex'] = np.insert(plot_data['batchindex'], 0, 0)[window_width:]
    plot_data['avg_loss']   = (loss_cumsum[window_width:] - loss_cumsum[:-window_width]) / window_width
    plot_data['avg_error']  = (error_cumsum[window_width:] - error_cumsum[:-window_width]) / window_width
    
    plt.figure(1)
    plt.subplot(211)
    plt.plot(plot_data["batchindex"], plot_data["avg_loss"], 'b--')
    plt.xlabel('Minibatch number')
    plt.ylabel('Loss')
    plt.title('Minibatch run vs. Training loss ')

    plt.show()

    plt.subplot(212)
    plt.plot(plot_data["batchindex"], plot_data["avg_error"], 'r--')
    plt.xlabel('Minibatch number')
    plt.ylabel('Label Prediction Error')
    plt.title('Minibatch run vs. Label Prediction Error ')
    plt.show()
    
    return C.softmax(z)
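
The visualization at the end of `train_and_evaluate` smooths the noisy per-minibatch loss with a cumulative-sum moving average. The same trick in isolation: prepend a zero, take the cumulative sum, and difference it at `window` spacing, so each output is the mean of `window` consecutive inputs in O(n) total work:

```python
import numpy as np

def moving_average(values, window):
    # prepend a zero, take the cumulative sum, and difference it at
    # `window` spacing: each output is the mean of `window` inputs
    csum = np.cumsum(np.insert(np.asarray(values, dtype=float), 0, 0.0))
    return (csum[window:] - csum[:-window]) / window

m = moving_average([1, 2, 3, 4, 5, 6], 3)
# m == [2., 3., 4., 5.]
```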

In [29]:
pred = train_and_evaluate(reader_train, 
                          reader_test, 
                          max_epochs=5, 
                          model_func=create_basic_model)


Training 116906 parameters in 10 parameter tensors.

Learning rate per minibatch: 0.01
Momentum per sample: 0.9983550962823424
Finished Epoch[1 of 5]: [Training] loss = 2.080008 * 50000, metric = 75.98% * 50000 45.011s (1110.8 samples/s);
Finished Epoch[2 of 5]: [Training] loss = 1.707727 * 50000, metric = 62.81% * 50000 9.172s (5451.4 samples/s);
Finished Epoch[3 of 5]: [Training] loss = 1.566306 * 50000, metric = 57.63% * 50000 10.066s (4967.2 samples/s);
Finished Epoch[4 of 5]: [Training] loss = 1.476099 * 50000, metric = 53.62% * 50000 10.162s (4920.3 samples/s);
Finished Epoch[5 of 5]: [Training] loss = 1.397499 * 50000, metric = 50.43% * 50000 10.094s (4953.4 samples/s);

Final Results: Minibatch[1-626]: errs = 45.8% * 10000
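
The "Momentum per sample" line in the log follows from the time-constant schedule set above: CNTK interprets a momentum time constant tau as a per-sample momentum of exp(-1 / tau). With tau = -minibatch_size / ln(0.9) this works out to 0.9 ** (1 / 64), a quick check of the printed value:

```python
import math

minibatch_size = 64
time_constant = -minibatch_size / math.log(0.9)        # the schedule used above
momentum_per_sample = math.exp(-1.0 / time_constant)   # equals 0.9 ** (1/64)
# momentum_per_sample matches the logged value 0.99835509...
```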

Although this model is very simple, it still has too much code; we can do better. Here is the same model in a more terse format:


In [79]:
def create_basic_model_terse(input, out_dims):

    with C.layers.default_options(init=C.glorot_uniform(), activation=C.relu):
        model = C.layers.Sequential([
            C.layers.For(range(3), lambda i: [
                C.layers.Convolution((5,5), [1024,32,64][i], pad=True, name='filters'+str(i)),
                C.layers.MaxPooling((3,3), strides=(2,2))
                ]),
            C.layers.Dense(64),
            C.layers.Dense(out_dims, activation=None)
        ])

    return model(input)
pred_basic_model = train_and_evaluate(reader_train, 
                                      reader_test, 
                                      max_epochs=10, 
                                      model_func=create_basic_model_terse)


Training 985898 parameters in 10 parameter tensors.

Learning rate per minibatch: 0.01
Momentum per sample: 0.9983550962823424
Finished Epoch[1 of 10]: [Training] loss = 2.078778 * 50000, metric = 76.40% * 50000 107.172s (466.5 samples/s);
Finished Epoch[2 of 10]: [Training] loss = 1.724720 * 50000, metric = 64.12% * 50000 103.509s (483.0 samples/s);
Finished Epoch[3 of 10]: [Training] loss = 1.578155 * 50000, metric = 58.13% * 50000 104.081s (480.4 samples/s);
Finished Epoch[4 of 10]: [Training] loss = 1.464906 * 50000, metric = 53.26% * 50000 104.524s (478.4 samples/s);
Finished Epoch[5 of 10]: [Training] loss = 1.378042 * 50000, metric = 49.59% * 50000 104.150s (480.1 samples/s);
Finished Epoch[6 of 10]: [Training] loss = 1.303519 * 50000, metric = 46.87% * 50000 104.961s (476.4 samples/s);
Finished Epoch[7 of 10]: [Training] loss = 1.222369 * 50000, metric = 43.55% * 50000 104.190s (479.9 samples/s);
Finished Epoch[8 of 10]: [Training] loss = 1.164991 * 50000, metric = 41.33% * 50000 106.019s (471.6 samples/s);
Finished Epoch[9 of 10]: [Training] loss = 1.123755 * 50000, metric = 39.64% * 50000 103.791s (481.7 samples/s);
Finished Epoch[10 of 10]: [Training] loss = 1.071548 * 50000, metric = 37.71% * 50000 105.150s (475.5 samples/s);

Final Results: Minibatch[1-626]: errs = 35.7% * 10000


In [81]:
def ShapeFilter(f):
    nf = np.ndarray(shape=(f.shape[1],f.shape[2],f.shape[0]))
    for k in range(f.shape[0]):
        for i in range(f.shape[1]):
            for j in range(f.shape[2]):
                nf[i,j,k]=f[k,i,j]
    return nf

fig = plt.figure()
layer = pred_basic_model.parameters[-2].value
nrows = int(math.sqrt(layer.shape[0]))
ncols = int(layer.shape[0]/nrows)+1

for i in range(layer.shape[0]):
    fig.add_subplot(nrows,ncols,i+1)
    plt.imshow(ShapeFilter(layer[i,:,:,:]))
    plt.axis('off')
#plt.title('Layer '+str(k))
plt.show()



In [187]:
D = np.ones((1024,1024))
for i in range(1024):
    im0 = np.mean(ShapeFilter(layer[i,:,:,:]), axis=2)   # hoisted out of the inner loop
    for j in range(1024):
        im1 = np.mean(ShapeFilter(layer[j,:,:,:]), axis=2)
        R = la.orthogonal_procrustes(im0, im1)
        D[i,j] = la.norm(np.subtract(np.matmul(im0, R[0]), im1), ord='fro')
print('done computing pairwise distances')


done computing pairwise distances
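
The loop above aligns every pair of channel-averaged kernels with an orthogonal Procrustes rotation (`scipy.linalg.orthogonal_procrustes`) before measuring their Frobenius distance, so two kernels that are rotations or reflections of each other count as similar. A standalone sketch of a single such comparison, on synthetic matrices:

```python
import numpy as np
import scipy.linalg as la

rng = np.random.RandomState(0)
A = rng.randn(5, 5)
Q, _ = np.linalg.qr(rng.randn(5, 5))   # a random orthogonal matrix
B = A @ Q                              # B is an exact rotation of A

R, scale = la.orthogonal_procrustes(A, B)   # R minimizes ||A @ R - B||_F
dist = la.norm(A @ R - B, ord='fro')
# dist is ~0 because B really is a rotation of A
```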

In [188]:
import scipy.cluster.hierarchy as hier
Z = hier.linkage(D, "average")
DG=hier.dendrogram(Z, orientation='right')
index = DG['leaves']
sD = D[index,:]
sD = sD[:,index]
fig = plt.figure()
plt.imshow(sD)
plt.axis('off')
plt.show()
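
In the cell above, `hier.linkage` builds the bottom-up merge tree from the pairwise distances, and `hier.dendrogram` / `hier.fcluster` read an ordering and flat clusters off it. A toy example on clearly separated 1-D points (here `linkage` computes Euclidean distances from the observations itself):

```python
import numpy as np
import scipy.cluster.hierarchy as hier

# two well-separated groups of 1-D points
pts = np.array([[0.0], [0.1], [0.2], [5.0], [5.1], [5.2]])
Z = hier.linkage(pts, "average")                    # bottom-up merge tree
clusts = hier.fcluster(Z, 2, criterion="maxclust")  # cut into at most 2 clusters
# the first three points share one cluster id, the last three the other
```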



In [189]:
clusts=hier.fcluster(Z, 10, criterion="maxclust")
sizes=[len([idx for idx,c in enumerate(clusts) if c == i]) for i in range(1,11)]

samples_per_clust=8
tclusts=[cid+1 for cid,csize in enumerate(sizes) if csize>samples_per_clust]

In [210]:
fig = plt.figure()
tcid=-1
for cid in tclusts:
    tcid+=1
    samples = [idx for idx,c in enumerate(clusts) if c == cid]
    for sampleid in range(samples_per_clust):
        imid0=samples[0]
        imid=samples[sampleid+1]
        im0=np.mean(ShapeFilter(layer[imid0,:,:,:]), axis=2)
        im1=np.mean(ShapeFilter(layer[imid ,:,:,:]), axis=2)
        R = la.orthogonal_procrustes(im1,im0)
        im1_r=np.matmul(im1, R[0])
        fig.add_subplot(len(tclusts),samples_per_clust,tcid*samples_per_clust + sampleid+1)
        #plt.title(str(imid))
        plt.imshow(im1_r)
        plt.axis('off')
plt.show()



In [214]:
#most_similar=np.unravel_index(np.argmin(D+np.eye(D.shape[0],D.shape[1])), D.shape)
#idx0=most_similar[0]
#idx1=most_similar[1]
idx0=151
idx1=333
im0=np.mean(ShapeFilter(layer[idx0,:,:,:]),axis=2)
im1=np.mean(ShapeFilter(layer[idx1,:,:,:]),axis=2)
R = la.orthogonal_procrustes(im0,im1)

fig = plt.figure()
fig.add_subplot(1,4,1)
plt.imshow(im0)
plt.title(str(idx0))
plt.axis('off')
fig.add_subplot(1,4,2)
plt.imshow(im1)
plt.title(str(idx1))
plt.axis('off')
fig.add_subplot(1,4,3)
plt.imshow(np.matmul(im0, R[0]))
plt.title(str(idx0) + ' x R')
plt.axis('off')
fig.add_subplot(1,4,4)
plt.imshow(np.subtract(np.matmul(im0, R[0]), im1))
plt.title(str(idx0) + ' x R - '+str(idx1))
plt.axis('off')
plt.show()



In [258]:
url='https://img0.etsystatic.com/045/0/8715327/il_570xN.670240336_jft4.jpg'
from PIL import Image
import requests
from io import BytesIO
response = requests.get(url)
img = Image.open(BytesIO(response.content))
im0 = np.mean(np.array(img.rotate(45)),axis=2)   # rotated copy
im1 = np.mean(np.array(img),axis=2)              # original
im0 -= np.mean(im0)
im1 -= np.mean(im1)
R = la.orthogonal_procrustes(im0,im1)
fig = plt.figure()
fig.add_subplot(2,2,1)
plt.imshow(im0)
plt.title('rotated')
plt.axis('off')
fig.add_subplot(2,2,2)
plt.imshow(im1)
plt.title('original')
plt.axis('off')
fig.add_subplot(2,2,3)
plt.imshow(np.dot(im0, R[0]))
plt.title('procrustes d=' +str(la.norm(np.subtract(np.dot(im0, R[0]),im1),ord='fro')))
plt.axis('off')
fig.add_subplot(2,2,4)
plt.imshow(R[0])
plt.title('R '+str(R[1]))
plt.axis('off')
plt.show()



In [248]:
np.mean(im0)


Out[248]:
1.8203823918108943e-14

In [26]:
kernels = dict((p.uid,p.value) for p in pred_basic_model.parameters if len(p.shape) == 4)

kernels


Out[26]:
{'Parameter555': array([[[[  2.39417292e-02,   5.05634695e-02,  -5.23355938e-02,
             6.89778198e-03,  -7.22344592e-02],
          [  2.65773633e-05,  -6.62465394e-02,  -1.07634421e-02,
            -8.64742920e-02,  -5.97098507e-02],
          [ -6.94595575e-02,   7.09185600e-02,   1.46833379e-02,
            -3.51637341e-02,   5.56818917e-02],
          [  6.09987229e-02,   1.11844040e-01,   9.36384872e-02,
             5.13478070e-02,   1.00434877e-01],
          [  2.03820392e-02,   2.37810835e-02,   4.86655124e-02,
            -3.91579047e-02,  -3.05855013e-02]],
 
         [[ -4.03480604e-02,  -8.23336318e-02,  -6.58527911e-02,
            -8.72577727e-02,   3.35219665e-03],
          [ -9.05144736e-02,  -1.00671545e-01,  -1.23820230e-01,
            -4.02628407e-02,  -1.27983376e-01],
          [  2.78664823e-03,   2.87091546e-02,   1.49904098e-02,
             1.00046597e-01,  -1.74462534e-02],
          [  1.79712046e-02,   1.19955793e-01,   2.03858048e-01,
             1.08805262e-01,   6.72579184e-02],
          [  2.04981351e-03,   3.34056473e-04,  -1.03836749e-02,
             3.53232250e-02,  -6.60670502e-03]],
 
         [[ -8.13013390e-02,  -6.96191341e-02,  -6.98311329e-02,
            -7.18423128e-02,  -4.51983474e-02],
          [ -1.42335027e-01,  -9.49004889e-02,  -1.59501001e-01,
            -1.24792039e-01,  -1.00295782e-01],
          [ -8.56789649e-02,  -1.25950947e-02,   7.88330436e-02,
             6.12792782e-02,  -5.86663187e-02],
          [  9.08789411e-02,   1.40578270e-01,   5.12466393e-02,
             1.56870365e-01,   1.05763398e-01],
          [  3.67039591e-02,   9.33990479e-02,   1.03616707e-01,
             5.18797375e-02,   1.05988979e-01]]],
 
 
        [[[  5.19131124e-02,   1.06950916e-01,  -1.09213386e-02,
             9.30566713e-02,   5.13367588e-04],
          [ -5.71145825e-02,   4.54437286e-02,   1.20316178e-01,
             5.10729887e-02,   1.37380287e-02],
          [  1.26860673e-02,   1.34841651e-02,   1.69211254e-02,
             8.84703100e-02,   8.81238952e-02],
          [  3.09815463e-02,   5.83900549e-02,   4.47907634e-02,
            -2.64098551e-02,   7.91737214e-02],
          [  1.07560873e-01,   2.21013427e-02,  -2.50055012e-03,
            -4.40059900e-02,  -3.35853100e-02]],
 
         [[ -6.74084807e-03,   7.00620189e-02,   2.39605345e-02,
             1.74904019e-02,  -6.30839467e-02],
          [ -7.46596828e-02,  -1.66574772e-02,   8.67273808e-02,
             6.40640929e-02,  -7.08343685e-02],
          [  2.25516949e-02,   4.69916984e-02,   2.79015899e-02,
            -3.12586501e-02,  -4.75644954e-02],
          [  2.90368404e-02,  -4.38312963e-02,  -2.80206539e-02,
             3.60693596e-02,  -2.28910688e-02],
          [  4.90575247e-02,   6.07180260e-02,  -6.14089817e-02,
             3.34257483e-02,   4.30078879e-02]],
 
         [[  1.54452631e-02,  -3.75602059e-02,   2.85464991e-02,
             6.70355037e-02,   5.71999475e-02],
          [ -3.08116004e-02,   5.65390810e-02,   6.29886687e-02,
             1.33332303e-02,  -4.01487425e-02],
          [ -4.55219820e-02,  -3.76678258e-02,   8.75836052e-03,
             4.20487635e-02,  -2.03946698e-02],
          [  2.35294122e-02,  -6.57689869e-02,   2.62954854e-03,
             4.08790037e-02,  -4.82362472e-02],
          [ -3.83315198e-02,   2.25980505e-02,  -9.33587700e-02,
             3.89685184e-02,   4.57348973e-02]]],
 
 
        [[[  3.16267125e-02,   1.13841377e-01,   4.71930243e-02,
             7.71730989e-02,   9.41501260e-02],
          [ -2.97825746e-02,   1.13404490e-01,  -2.54468862e-02,
             8.68704021e-02,   1.04954265e-01],
          [  8.38642269e-02,   1.40835673e-01,   1.04295745e-01,
             1.34645045e-01,   1.50923386e-01],
          [ -1.01644695e-01,  -8.18444565e-02,  -6.29327372e-02,
            -7.63183907e-02,  -2.54128147e-02],
          [ -4.15135697e-02,  -1.69778019e-02,  -1.29466623e-01,
            -1.03751272e-01,  -1.96219850e-02]],
 
         [[ -8.14391300e-03,   2.74161696e-02,  -1.57196932e-02,
            -3.60382497e-02,   7.58428499e-02],
          [ -7.30544552e-02,  -1.25711486e-02,   3.60125303e-02,
             4.79427539e-02,  -3.04219369e-02],
          [  1.08639449e-01,   8.83289203e-02,   6.99874684e-02,
            -1.39548117e-02,   6.43827617e-02],
          [ -8.50860178e-02,   6.82531595e-02,  -2.44427286e-02,
            -7.02482164e-02,   6.42310977e-02],
          [ -9.93458778e-02,   3.57595794e-02,  -3.14667374e-02,
             3.37774903e-02,   3.33667547e-03]],
 
         [[ -1.20527744e-01,   6.51093898e-03,  -1.60096496e-01,
            -2.16558903e-01,  -1.61721483e-01],
          [ -1.85454980e-01,  -1.25368476e-01,  -1.50430545e-01,
            -2.15749189e-01,  -1.32024512e-01],
          [ -3.46570672e-03,   9.90582407e-02,   4.18475531e-02,
            -3.32260877e-02,   5.79663366e-02],
          [ -4.04331945e-02,   4.02559713e-02,  -3.58018130e-02,
             4.18113619e-02,  -4.37282994e-02],
          [  9.33911428e-02,   1.48429275e-01,   8.49009082e-02,
             1.18525960e-02,   5.01652323e-02]]],
 
 
        ..., 
        [[[  5.16531952e-02,  -4.89028767e-02,   5.19915437e-03,
            -2.38775499e-02,  -8.65380466e-02],
          [  2.86134556e-02,   1.08818233e-01,  -3.24034109e-03,
            -5.51048759e-03,  -1.39478981e-01],
          [  1.27926156e-01,   4.37495708e-02,  -2.44585630e-02,
            -9.82186850e-03,  -7.79161006e-02],
          [  2.24813819e-03,  -4.30012774e-03,  -4.81556989e-02,
            -3.71572152e-02,   4.47960719e-02],
          [  3.54841240e-02,   1.74022522e-02,   7.51104951e-02,
             4.26523983e-02,  -1.93827003e-02]],
 
         [[  4.92911115e-02,  -6.76971823e-02,   9.73991235e-04,
             4.92247939e-02,  -7.69270286e-02],
          [  1.07506700e-02,   1.79449618e-02,   8.74574184e-02,
            -7.14915991e-02,  -6.87939003e-02],
          [  9.81062055e-02,  -3.51235941e-02,  -2.76042428e-02,
             1.03361858e-02,  -5.45755401e-02],
          [  2.46073995e-02,   4.37391065e-02,  -9.91799310e-02,
            -5.25450781e-02,  -2.90352423e-02],
          [  7.13427365e-02,  -1.03240922e-01,  -1.93291958e-02,
            -5.92804737e-02,  -4.50741164e-02]],
 
         [[  3.04703712e-02,   6.82561547e-02,   1.11323066e-01,
             7.25107342e-02,   3.05239856e-02],
          [ -6.30273297e-03,   9.94382501e-02,   9.01191384e-02,
            -2.96305623e-02,  -8.87583271e-02],
          [  7.88958594e-02,   9.46443714e-03,  -6.26614913e-02,
            -7.78641179e-02,  -1.13162270e-04],
          [  1.17380857e-01,  -2.07912773e-02,  -1.18990779e-01,
            -9.28832591e-02,   9.38402936e-02],
          [  6.67313710e-02,  -9.05550420e-02,  -4.16623242e-02,
             3.56750339e-02,  -2.22419426e-02]]],
 
 
        [[[  4.96327132e-02,   2.64044739e-02,   1.90128591e-02,
             7.32635409e-02,   8.37885886e-02],
          [  8.38768855e-02,   7.74045289e-02,  -2.88109686e-02,
            -4.48563136e-02,  -2.30719373e-02],
          [  3.31765995e-03,   1.20477140e-01,   9.79323685e-02,
             1.27675965e-01,   8.02948549e-02],
          [  1.13310196e-01,   4.30567637e-02,   1.06509373e-01,
             7.16154277e-02,  -7.41059035e-02],
          [  1.02895707e-01,   3.73673253e-02,  -5.03431223e-02,
            -4.64538112e-02,   1.35547658e-02]],
 
         [[ -9.04119536e-02,  -1.05043665e-01,  -1.11644017e-02,
             6.07574247e-02,   9.98438708e-03],
          [ -2.57783551e-02,   8.30239709e-03,   2.13283626e-03,
[Output truncated: learned 5 x 5 convolution filter weights for network parameters such as 'Parameter585' and 'Parameter615' (float32 arrays); the full numeric dump is omitted for brevity.]
            -1.18316077e-02,  -1.48076704e-03],
          [ -8.64615384e-03,   1.08018890e-02,  -2.17992289e-04,
             3.10070477e-02,  -4.39481698e-02],
          [  2.72475723e-02,   2.27792338e-02,   4.15811241e-02,
            -2.84911562e-02,   2.38854177e-02]],
 
         [[ -2.90441383e-02,   2.23925635e-02,   4.47683781e-02,
            -3.26743699e-03,  -1.58690549e-02],
          [  3.38959917e-02,  -3.51892598e-02,  -4.73812446e-02,
            -1.83819495e-02,   1.33810360e-02],
          [ -1.93231162e-02,  -1.72001473e-03,   1.15290917e-02,
            -4.22220416e-02,   1.24930497e-03],
          [  3.03168222e-03,  -5.44629851e-03,   4.48600613e-02,
             3.15327160e-02,   2.32434049e-02],
          [  3.08482852e-02,   3.87123153e-02,  -6.17486425e-03,
            -3.07166036e-02,   3.73177347e-03]]],
 
 
        [[[ -3.40808555e-02,  -1.37885530e-02,  -2.10704841e-03,
             9.50859394e-03,   4.11884934e-02],
          [ -3.52071002e-02,  -1.31946488e-03,  -1.84314661e-02,
            -1.54846795e-02,  -2.47971863e-02],
          [ -2.41222586e-02,  -2.21809428e-02,  -1.99081246e-02,
             3.62378657e-02,  -2.18687132e-02],
          [ -3.45570892e-02,   5.39079402e-03,  -4.13385481e-02,
            -2.81175710e-02,   3.52675989e-02],
          [  1.19968578e-02,  -3.09448014e-03,  -4.34242859e-02,
            -2.32777763e-02,  -4.23420593e-03]],
 
         [[  4.71551567e-02,   5.77844605e-02,   6.21429943e-02,
             9.95716453e-03,   4.84545110e-03],
          [ -2.08402928e-02,   4.12282497e-02,   7.47888023e-03,
            -8.56286660e-03,   5.66082029e-03],
          [ -2.47535333e-02,  -2.12732423e-02,   1.44094033e-02,
             1.90943014e-03,  -1.73553117e-02],
          [ -2.86970604e-02,   3.56313176e-02,   1.65181067e-02,
             3.58986668e-02,   3.33019234e-02],
          [ -4.18830849e-02,   1.38506321e-02,   8.20888858e-03,
             3.85776535e-02,  -1.41819625e-03]],
 
         [[  6.69564009e-02,   3.74809094e-02,  -3.09529658e-02,
            -1.21080354e-02,   5.33509105e-02],
          [  2.90403254e-02,   8.85483343e-03,  -4.42717671e-02,
            -3.65841463e-02,   3.76971811e-02],
          [  1.74579322e-02,   3.55958343e-02,  -4.50023077e-02,
            -4.47673611e-02,   3.63969579e-02],
          [  3.49867940e-02,  -1.11877611e-02,   3.74066457e-02,
             6.69744192e-03,   4.73330058e-02],
          [  2.36902814e-02,   4.29216884e-02,   3.01262010e-02,
             1.09803472e-02,   9.59042180e-03]],
 
         ..., 
         [[  1.51605522e-02,   2.03844309e-02,  -4.50660922e-02,
             2.27556806e-02,  -1.10168837e-03],
          [  4.70663868e-02,   4.31258716e-02,  -3.41217481e-02,
             5.46882628e-04,   3.22731696e-02],
          [  2.92688236e-03,   3.09585035e-02,   2.08077822e-02,
             2.44692415e-02,  -4.10658605e-02],
          [  5.25552221e-02,  -2.99819605e-03,   4.26228940e-02,
            -4.04957756e-02,   3.93721536e-02],
          [  2.46630311e-02,  -1.53811509e-02,   1.84941795e-02,
            -5.90119045e-03,   2.60044690e-02]],
 
         [[  1.38683626e-02,   3.00641786e-02,   2.74401680e-02,
            -1.73933432e-02,   4.99585234e-02],
          [  3.05503840e-03,  -3.77240106e-02,  -1.93631127e-02,
            -1.41403824e-03,  -2.92018969e-02],
          [  9.14635789e-03,  -3.76304425e-02,   9.02922451e-03,
            -1.28169043e-03,  -4.82666828e-02],
          [ -9.81695368e-04,   3.61474958e-04,   2.47827284e-02,
             1.51489992e-02,  -4.20549884e-02],
          [  1.18351849e-02,   2.95238029e-02,   1.25791132e-02,
            -9.81780607e-03,   9.91621148e-03]],
 
         [[  3.36881243e-02,  -4.96640652e-02,  -1.91412936e-03,
            -2.47210283e-02,   1.63412560e-02],
          [  1.54450871e-02,  -3.95383164e-02,  -5.41457273e-02,
            -5.39578795e-02,  -1.22070068e-03],
          [  4.13836678e-03,  -2.62715239e-02,   4.78568533e-03,
            -1.27495900e-02,  -4.70435135e-02],
          [ -8.34664050e-03,   9.02886689e-03,   7.28862919e-03,
            -3.00387735e-03,  -3.44927609e-02],
          [  3.11151184e-02,  -9.90553526e-04,   6.42812997e-02,
             5.31308092e-02,   4.77436148e-02]]],
 
 
        [[[ -1.35852741e-02,   1.92803554e-02,   3.36967260e-02,
             4.12851805e-03,  -2.09331065e-02],
          [  1.78341214e-02,   4.51170579e-02,  -4.22462374e-02,
             1.62392214e-03,   7.56686274e-03],
          [  4.23299223e-02,   2.40345262e-02,  -3.99637856e-02,
            -1.78907476e-02,  -1.58307217e-02],
          [  2.77182311e-02,  -3.66699249e-02,   1.61457267e-02,
            -7.48775108e-03,   5.55657223e-03],
          [ -6.54357485e-04,   2.32897550e-02,  -2.45133806e-02,
             2.11887285e-02,   4.16302308e-02]],
 
         [[ -1.56371165e-02,   1.55828912e-02,  -2.96724848e-02,
            -4.18352820e-02,  -5.71566373e-02],
          [  4.37126262e-03,   6.29374236e-02,   1.50827542e-02,
            -1.75144672e-02,  -2.57559791e-02],
          [  3.20635214e-02,  -6.04216103e-03,   1.69487335e-02,
             4.72728051e-02,   4.18112315e-02],
          [  5.07829376e-02,   4.58664224e-02,   6.16957806e-03,
             1.73259154e-02,   5.40269306e-03],
          [ -7.07200961e-03,  -1.36331720e-02,  -5.17363362e-02,
            -1.85355023e-02,  -2.75058802e-02]],
 
         [[  6.79572001e-02,   5.18042929e-02,   4.50137258e-02,
             3.33317416e-03,  -1.56053444e-02],
          [ -4.79641184e-03,   8.83362442e-02,   6.13135844e-03,
             6.01177812e-02,  -4.26228978e-02],
          [ -5.02927639e-02,   6.99487375e-03,   7.02879438e-03,
            -2.88196262e-02,  -6.22647367e-02],
          [  1.77180842e-02,   6.02844171e-02,   3.74324881e-02,
            -2.43388787e-02,  -4.58062254e-03],
          [  1.00221924e-01,   9.36251581e-02,   4.66342233e-02,
             9.25178304e-02,   1.00114308e-01]],
 
         ..., 
         [[ -1.19752884e-02,  -4.31532711e-02,   2.05772296e-02,
            -1.16644725e-02,  -1.57088973e-02],
          [ -2.30575074e-02,  -1.48001770e-02,  -1.87482573e-02,
             3.12710628e-02,   4.65160683e-02],
          [ -1.57763697e-02,  -7.43350089e-02,   3.56088914e-02,
             6.01700470e-02,   2.56659519e-02],
          [  1.65393353e-02,  -1.42158810e-02,  -1.96618009e-02,
             4.45595384e-02,   5.85878082e-02],
          [  5.91577962e-02,   8.85588527e-02,   5.07040359e-02,
             3.16770673e-02,   2.36601643e-02]],
 
         [[  1.55642186e-03,  -9.21114348e-03,  -5.73143512e-02,
             1.51848169e-02,   3.84251168e-03],
          [ -2.72485130e-02,  -4.49715927e-02,  -1.88090801e-02,
            -1.36818178e-02,  -4.67057675e-02],
          [ -2.91384314e-03,  -2.08584475e-04,  -2.28177011e-03,
            -3.32609564e-02,   4.41877032e-03],
          [  4.80165184e-02,  -2.46220361e-02,   4.69004922e-02,
             4.04179394e-02,  -3.58410850e-02],
          [  3.57319266e-02,  -1.32177426e-02,   2.46017822e-04,
            -1.75823011e-02,   9.00582038e-03]],
 
         [[ -1.18888635e-02,   5.67092448e-02,   3.70904319e-02,
             2.54618470e-02,   3.17210630e-02],
          [ -2.94606537e-02,   1.28054395e-02,   7.39561170e-02,
             5.75166121e-02,   2.07695197e-02],
          [ -3.29422131e-02,   3.84107046e-02,   3.99365230e-03,
             2.94291079e-02,   2.75424719e-02],
          [  4.51373085e-02,   3.11819781e-02,   6.39068112e-02,
             3.90927717e-02,   1.20201157e-02],
          [  1.01469061e-03,   5.73004782e-02,   4.52989303e-02,
            -3.12077603e-03,   2.50330567e-02]]]], dtype=float32),
 'Parameter951': array([[[[  4.96195257e-02,   7.71919563e-02,  -6.86200634e-02, ...,
             5.13539128e-02,   5.53193269e-03,  -3.51799242e-02],
          [ -4.84216362e-02,  -6.62607178e-02,   3.02521251e-02, ...,
            -8.04864336e-03,   6.49993569e-02,   6.96987137e-02],
          [ -3.70954610e-02,  -1.11515708e-02,  -6.58963472e-02, ...,
             6.79462701e-02,  -5.84772117e-02,  -3.71027589e-02]],
 
         [[  5.81069663e-02,   6.84583467e-03,   1.69201028e-02, ...,
            -8.98769721e-02,   4.87098796e-03,  -5.41992262e-02],
          [ -1.96968224e-02,   7.00225011e-02,   1.77204721e-02, ...,
            -2.84568071e-02,  -6.00223057e-02,  -7.65364617e-02],
          [ -1.00188646e-02,  -1.81076583e-02,   1.27311675e-02, ...,
             5.45494445e-02,   1.52093293e-02,   7.48332515e-02]],
 
         [[ -5.48895821e-02,   4.29048538e-02,   6.17466792e-02, ...,
             1.87889580e-02,  -3.36908251e-02,  -7.05532879e-02],
          [ -2.54339445e-02,   4.74370643e-02,   7.67133897e-03, ...,
            -2.24966630e-02,  -2.69668251e-02,  -5.39017618e-02],
          [ -2.64273845e-02,   8.97130072e-02,   3.15383337e-02, ...,
            -1.67586021e-02,  -4.42506373e-02,  -3.40169221e-02]]],
 
 
        [[[ -7.22535998e-02,  -6.68399781e-02,   8.56712908e-02, ...,
            -4.42113988e-02,   2.21314635e-02,   2.95006745e-02],
          [  6.49067312e-02,   2.74787825e-02,   5.38378209e-02, ...,
            -1.05891032e-02,   3.96598689e-02,  -5.68677811e-03],
          [ -4.15510423e-02,   3.53757255e-02,   1.18736746e-02, ...,
             8.24608132e-02,  -5.35841063e-02,  -6.61786180e-03]],
 
         [[ -7.15298206e-02,   3.48002613e-02,  -8.36518854e-02, ...,
             7.32915699e-02,  -6.90423176e-02,  -1.10895587e-02],
          [ -1.57115683e-02,  -3.06216360e-04,   4.05607261e-02, ...,
            -9.00185201e-03,  -6.77847862e-02,   4.84601781e-02],
          [  1.22178849e-02,  -4.16579731e-02,   7.72470534e-02, ...,
             3.59043926e-02,  -2.73712520e-02,  -7.21237215e-04]],
 
         [[ -1.29280668e-02,   6.19209707e-02,   5.55542298e-02, ...,
             7.01477304e-02,  -7.84665793e-02,  -2.61522494e-02],
          [  9.06975195e-03,   1.31275719e-02,  -1.45645877e-02, ...,
            -1.71396264e-03,   1.09468112e-02,  -3.01749464e-02],
          [ -4.82725091e-02,   7.18035623e-02,  -1.80017743e-02, ...,
            -3.65053117e-02,   3.84981893e-02,   2.25751735e-02]]],
 
 
        [[[  2.13409867e-02,   6.49040341e-02,   2.31917500e-02, ...,
             5.47198541e-02,  -5.79147711e-02,  -4.92521841e-03],
          [  7.08006918e-02,   8.91990513e-02,  -8.62441286e-02, ...,
            -1.08316690e-02,   2.48841681e-02,  -7.19963610e-02],
          [  7.54527152e-02,   5.94296772e-03,  -8.66265744e-02, ...,
            -3.43071553e-03,  -5.11880703e-02,  -2.16364525e-02]],
 
         [[ -5.36551289e-02,  -3.51386108e-02,   6.51103780e-02, ...,
             6.55367225e-02,  -4.97609191e-02,  -8.47155675e-02],
          [ -3.51024373e-03,  -3.60842608e-02,  -4.46575396e-02, ...,
            -3.48894820e-02,   7.56100491e-02,  -7.47119710e-02],
          [ -8.04226026e-02,   5.78160340e-04,   2.13479456e-02, ...,
             3.62732783e-02,  -9.13994573e-03,   5.74724339e-02]],
 
         [[ -5.03626503e-02,  -2.51163878e-02,   7.08387792e-02, ...,
            -1.26006305e-02,   4.66650426e-02,  -5.44465706e-02],
          [ -5.47596663e-02,   6.51044352e-03,  -2.35627927e-02, ...,
             2.18931977e-02,   1.53395915e-04,   6.29917160e-02],
          [  8.76750126e-02,  -6.88705444e-02,  -6.07457459e-02, ...,
             7.26778209e-02,  -6.18547536e-02,  -6.09887904e-03]]],
 
 
        ..., 
        [[[  7.28446767e-02,  -6.34200424e-02,   4.96948883e-02, ...,
             3.89456339e-02,  -5.54866195e-02,   3.09384316e-02],
          [ -3.34300138e-02,   7.20057935e-02,   2.46157907e-02, ...,
            -4.65599522e-02,   8.32826421e-02,   3.65534648e-02],
          [ -5.99190651e-04,   5.54965110e-03,  -6.33999407e-02, ...,
            -8.20965390e-04,  -2.96563767e-02,   8.70519876e-02]],
 
         [[ -5.29961586e-02,   1.11419156e-01,  -5.83809018e-02, ...,
            -2.75943778e-03,   8.74899141e-03,   4.66432832e-02],
          [  7.94593617e-02,   6.03051856e-03,  -5.92014194e-02, ...,
            -6.08223714e-02,   3.43570448e-02,  -6.26147836e-02],
          [ -4.47003171e-02,   1.01672843e-01,  -3.01800668e-02, ...,
             6.96191739e-04,  -5.86512089e-02,   1.54629769e-02]],
 
         [[  3.70902792e-02,  -3.42140719e-02,   2.63185911e-02, ...,
            -1.90551616e-02,   2.27979068e-02,  -7.16872001e-03],
          [ -6.96889088e-02,   2.36020219e-02,  -1.93506442e-02, ...,
            -6.94757104e-02,  -5.97725436e-02,   1.47284344e-02],
          [ -2.65508182e-02,  -4.70147282e-02,   1.24008348e-02, ...,
            -5.63070066e-02,   3.36741917e-02,   1.60369687e-02]]],
 
 
        [[[  1.25168841e-02,  -5.04423529e-02,   1.18575487e-02, ...,
            -9.65028033e-02,   3.01161166e-02,  -5.69514446e-02],
          [ -9.80417430e-03,   3.80505919e-02,  -8.43491871e-03, ...,
            -5.19497804e-02,   5.60872555e-02,   1.21489633e-02],
          [ -4.40368541e-02,  -5.24025236e-04,  -2.61773672e-02, ...,
             4.30783555e-02,   6.89634606e-02,   3.67673077e-02]],
 
         [[ -6.25792593e-02,  -3.16730961e-02,   5.51902875e-02, ...,
             5.91918873e-03,  -4.28057760e-02,  -3.77526991e-02],
          [ -5.46699353e-02,   3.84779871e-02,   9.68395267e-03, ...,
            -4.26855013e-02,   8.51915218e-03,   7.51215145e-02],
          [  1.06406406e-01,   3.49410460e-03,  -8.68172944e-02, ...,
             1.74479745e-02,  -7.72447959e-02,   3.40375938e-02]],
 
         [[ -5.96767105e-02,   3.60388532e-02,  -4.61565666e-02, ...,
             5.76188788e-02,   3.79464068e-02,  -4.87192459e-02],
          [  9.38159153e-02,   5.62736019e-02,  -5.52171469e-02, ...,
            -7.93621764e-02,  -1.00642428e-04,  -2.42809430e-02],
          [  3.66457105e-02,  -5.35357855e-02,  -8.61385604e-04, ...,
            -9.10467505e-02,   1.64176654e-02,   6.10154606e-02]]],
 
 
        [[[ -6.00645021e-02,  -1.25436597e-02,   3.75506356e-02, ...,
             3.43879648e-02,  -3.84529047e-02,   5.68652563e-02],
          [  1.33747351e-04,   4.26138230e-02,   6.59435391e-02, ...,
             5.61789516e-03,  -3.79180014e-02,  -5.48358597e-02],
          [ -5.48139848e-02,   9.06965956e-02,   5.97436503e-02, ...,
             7.60354698e-02,   7.73083866e-02,  -9.49495211e-02]],
 
         [[  6.39767572e-02,   1.15230292e-01,  -8.91046822e-02, ...,
            -9.47685242e-02,   8.33794996e-02,  -5.92719158e-03],
          [ -6.69576526e-02,   8.16219002e-02,  -1.90071184e-02, ...,
            -1.38591854e-02,  -6.60328716e-02,   6.42533824e-02],
          [ -4.78185974e-02,   1.18660085e-01,   5.04106656e-03, ...,
             4.44183350e-02,   6.56832084e-02,   8.31779987e-02]],
 
         [[  3.00058350e-02,   9.15818103e-03,   1.06234420e-02, ...,
            -1.25810578e-02,  -4.91635092e-02,  -7.09611252e-02],
          [  6.90244138e-02,  -5.64587228e-02,  -7.43527487e-02, ...,
            -7.77745247e-02,  -7.84526616e-02,   8.95617753e-02],
          [  1.07233331e-01,   6.34525791e-02,   7.17737377e-02, ...,
            -5.57110645e-02,  -8.51562843e-02,  -2.33576223e-02]]]], dtype=float32)}

In [80]:
def create_alexnet(input, out_dims):
    def LocalResponseNormalization(k, n, alpha, beta, name=''):
        x = C.placeholder(name='lrn_arg')
        x2 = C.square(x)
        # reshape to insert a fake singleton reduction dimension after the 3rd (channel) axis.
        # Note that the Python and BrainScript axis orders are reversed.
        x2s = C.reshape(x2, (1, C.InferredDimension), 0, 1)
        W = C.constant(alpha/(2*n+1), (1,2*n+1,1,1), name='W')
        # 3D convolution with a filter that is larger than 1 only in the 3rd axis;
        # it does not reduce, since the reduction dimension is a fake singleton of size 1
        y = C.convolution(W, x2s)
        # reshape back to remove the fake singleton reduction dimension
        b = C.reshape(y, C.InferredDimension, 0, 2)
        den = C.exp(beta * C.log(k + b))
        apply_x = C.element_divide(x, den)
        return apply_x
    
    with C.layers.default_options(activation=None, pad=True, bias=True):
        model = C.layers.Sequential([
            # we separate Convolution and ReLU so that we can name the pre-activation
            # output for feature extraction (features are usually taken before the ReLU)
            C.layers.Convolution2D((11,11), 96, init=C.initializer.normal(0.01), pad=False, strides=(4,4), name='conv1'),
            C.layers.Activation(activation=C.relu, name='relu1'),
            LocalResponseNormalization(1.0, 2, 0.0001, 0.75, name='norm1'),
            C.layers.MaxPooling((3,3), (2,2), name='pool1'),

            C.layers.Convolution2D((5,5), 192, init=C.initializer.normal(0.01), init_bias=0.1, name='conv2'),
            C.layers.Activation(activation=C.relu, name='relu2'),
            LocalResponseNormalization(1.0, 2, 0.0001, 0.75, name='norm2'),
            C.layers.MaxPooling((3,3), (2,2), name='pool2'),

            C.layers.Convolution2D((3,3), 384, init=C.initializer.normal(0.01), name='conv3'),
            C.layers.Activation(activation=C.relu, name='relu3'),
            C.layers.Convolution2D((3,3), 384, init=C.initializer.normal(0.01), init_bias=0.1, name='conv4'),
            C.layers.Activation(activation=C.relu, name='relu4'),
            C.layers.Convolution2D((3,3), 256, init=C.initializer.normal(0.01), init_bias=0.1, name='conv5'),
            C.layers.Activation(activation=C.relu, name='relu5'),
            C.layers.MaxPooling((3,3), (2,2), name='pool5'),

            C.layers.Dense(4096, init=C.initializer.normal(0.005), init_bias=0.1, name='fc6'),
            C.layers.Activation(activation=C.relu, name='relu6'),
            C.layers.Dropout(0.5, name='drop6'),
            C.layers.Dense(4096, init=C.initializer.normal(0.005), init_bias=0.1, name='fc7'),
            C.layers.Activation(activation=C.relu, name='relu7'),
            C.layers.Dropout(0.5, name='drop7'),
            C.layers.Dense(out_dims, init=C.initializer.normal(0.01), name='fc8')
            ])
    return model(input)

pred_alexnet = train_and_evaluate(reader_train, 
                                      reader_test, 
                                      max_epochs=10, 
                                      model_func=create_alexnet)


---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-80-271f4d4cfbd0> in <module>()
     51                                       reader_test,
     52                                       max_epochs=10,
---> 53                                       model_func=create_alexnet)

<ipython-input-19-c9e0b256eb55> in train_and_evaluate(reader_train, reader_test, max_epochs, model_func)
     12 
     13     # apply model to input
---> 14     z = model_func(input_var_norm, out_dims=10)
     15 
     16     #

<ipython-input-80-271f4d4cfbd0> in create_alexnet(input, out_dims)
     46             C.layers.Dense(out_dims, init=C.initializer.normal(0.01), name='fc8')
     47             ])
---> 48     return model(input)
     49 
     50 pred_alexnet = train_and_evaluate(reader_train, 

C:\Anaconda\envs\cntk-py34\lib\site-packages\cntk\ops\functions.py in __call__(self, *args, **kwargs)
    321         # applying the function means to inline its piece of graph
    322         if is_symbolic:
--> 323             return self.clone(CloneMethod.share, arg_map)
    324 
    325         # numeric: evaluate

C:\Anaconda\envs\cntk-py34\lib\site-packages\cntk\internal\swig_helper.py in wrapper(*args, **kwds)
     67     @wraps(f)
     68     def wrapper(*args, **kwds):
---> 69         result = f(*args, **kwds)
     70         map_if_possible(result)
     71         return result

C:\Anaconda\envs\cntk-py34\lib\site-packages\cntk\ops\functions.py in clone(self, method, substitutions)
    472         if not isinstance(substitutions, dict):
    473             raise TypeError("Variable substitution map must be a dictionary")
--> 474         return super(Function, self).clone(method, substitutions)
    475 
    476     @property

C:\Anaconda\envs\cntk-py34\lib\site-packages\cntk\cntk_py.py in clone(self, *args)
   1495 
   1496     def clone(self, *args):
-> 1497         return _cntk_py.Function_clone(self, *args)
   1498     if _newclass:
   1499         deserialize = staticmethod(_cntk_py.Function_deserialize)

ValueError: Convolution operation requires that kernel dim 5 <= input dim 3.

[CALL STACK]
    > Microsoft::MSR::CNTK::DataTransferer::  operator=
    - CNTK::NDMask::  MaskedCount
    - CNTK::NDMask::  MaskedCount
    - CNTK::  Unpooling
    - CNTK::  Unpooling
    - CNTK::Function::  ReplacePlaceholders
    - CNTK::  Hardmax
    - CNTK::  Unpooling
    - CNTK::  Unpooling
    - CNTK::  Unpooling
    - CNTK::  Unpooling
    - CNTK::  Unpooling
    - CNTK::  Unpooling
    - CNTK::  Unpooling
    - CNTK::  Unpooling
    - CNTK::  Unpooling

Now that we have a trained model (pred_basic_model from the earlier cells), let us classify the following image of a truck. We use PIL to read the image.


In [16]:
# Figure 6
Image(url="https://cntk.ai/jup/201/00014.png", width=64, height=64)


Out[16]:

In [17]:
# Download a sample image 
# (this is 00014.png from test dataset)
# Any image of size 32,32 can be evaluated

url = "https://cntk.ai/jup/201/00014.png"
myimg = np.array(PIL.Image.open(urlopen(url)), dtype=np.float32)

During training we subtracted the mean from the input images. Here we take an approximate value of the mean and subtract it from the image.


In [18]:
def eval(pred_op, image_data):
    label_lookup = ["airplane", "automobile", "bird", "cat", "deer", "dog", "frog", "horse", "ship", "truck"]
    image_mean = 133.0
    image_data -= image_mean
    image_data = np.ascontiguousarray(np.transpose(image_data, (2, 0, 1)))
    
    result = np.squeeze(pred_op.eval({pred_op.arguments[0]:[image_data]}))
    
    # Return top 3 results:
    top_count = 3
    result_indices = (-np.array(result)).argsort()[:top_count]

    print("Top 3 predictions:")
    for i in range(top_count):
        print("\tLabel: {:10s}, confidence: {:.2f}%".format(label_lookup[result_indices[i]], result[result_indices[i]] * 100))
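Note that the last Dense layer in these models is created with activation=None, so the values the network emits may be unnormalized scores rather than probabilities. To report proper probabilities, one could pass the scores through a softmax first; a minimal sketch (the softmax helper here is illustrative, not part of the tutorial code):

```python
import numpy as np

# Illustrative helper: numerically stable softmax over a vector of raw scores.
def softmax(scores):
    e = np.exp(scores - np.max(scores))  # shift by the max for numerical stability
    return e / e.sum()

probs = softmax(np.array([3.2, 0.5, -1.0]))
print(np.round(probs.sum(), 6))  # → 1.0
```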

In [19]:
# Run the evaluation on the downloaded image
eval(pred_basic_model, myimg)


Top 3 predictions:
	Label: truck     , confidence: 96.59%
	Label: ship      , confidence: 2.31%
	Label: cat       , confidence: 0.43%

Let's add a dropout layer, with a drop rate of 0.25, before the last dense layer:


In [20]:
def create_basic_model_with_dropout(input, out_dims):

    with C.layers.default_options(activation=C.relu, init=C.glorot_uniform()):
        model = C.layers.Sequential([
            C.layers.For(range(3), lambda i: [
                C.layers.Convolution((5,5), [32,32,64][i], pad=True),
                C.layers.MaxPooling((3,3), strides=(2,2))
            ]),
            C.layers.Dense(64),
            C.layers.Dropout(0.25),
            C.layers.Dense(out_dims, activation=None)
        ])

    return model(input)
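For intuition, here is what (inverted) dropout does at training time, sketched in numpy with the same 0.25 drop rate; CNTK's Dropout layer handles the random mask and the train/test switch for you:

```python
import numpy as np

# Inverted dropout sketch: zero each activation with probability 0.25 and
# scale the survivors by 1/(1 - 0.25), so the expected activation stays
# unchanged and no rescaling is needed at test time.
rng = np.random.RandomState(0)
x = np.ones((4, 8), dtype=np.float32)        # a batch of activations
drop_rate = 0.25
mask = (rng.uniform(size=x.shape) >= drop_rate)
y = x * mask / (1.0 - drop_rate)             # surviving units become 1/0.75
```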

In [21]:
pred_basic_model_dropout = train_and_evaluate(reader_train, 
                                              reader_test, 
                                              max_epochs=5, 
                                              model_func=create_basic_model_with_dropout)


Training 116906 parameters in 10 parameter tensors.

Learning rate per minibatch: 0.01
Momentum per sample: 0.9983550962823424
Finished Epoch[1 of 5]: [Training] loss = 2.107245 * 50000, metric = 79.08% * 50000 12.977s (3853.0 samples/s);
Finished Epoch[2 of 5]: [Training] loss = 1.795581 * 50000, metric = 67.10% * 50000 12.244s (4083.6 samples/s);
Finished Epoch[3 of 5]: [Training] loss = 1.657041 * 50000, metric = 61.52% * 50000 12.265s (4076.6 samples/s);
Finished Epoch[4 of 5]: [Training] loss = 1.567592 * 50000, metric = 57.72% * 50000 12.251s (4081.3 samples/s);
Finished Epoch[5 of 5]: [Training] loss = 1.500142 * 50000, metric = 54.97% * 50000 12.228s (4089.0 samples/s);

Final Results: Minibatch[1-626]: errs = 47.1% * 10000

Next, add a batch normalization layer after each convolution layer and before the last dense layer:


In [22]:
def create_basic_model_with_batch_normalization(input, out_dims):

    with C.layers.default_options(activation=C.relu, init=C.glorot_uniform()):
        model = C.layers.Sequential([
            C.layers.For(range(3), lambda i: [
                C.layers.Convolution((5,5), [32,32,64][i], pad=True),
                C.layers.BatchNormalization(map_rank=1),
                C.layers.MaxPooling((3,3), strides=(2,2))
            ]),
            C.layers.Dense(64),
            C.layers.BatchNormalization(map_rank=1),
            C.layers.Dense(out_dims, activation=None)
        ])

    return model(input)
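A numpy sketch of what batch normalization computes per feature during training (CNTK's BatchNormalization layer additionally tracks running statistics for use at inference time, which is omitted here):

```python
import numpy as np

# Normalize each feature over the batch to zero mean / unit variance,
# then apply the learned scale (gamma) and shift (beta).
def batch_norm(x, gamma, beta, eps=1e-5):
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

x = np.random.RandomState(1).randn(32, 16).astype(np.float32)
y = batch_norm(x, gamma=np.ones(16, np.float32), beta=np.zeros(16, np.float32))
print(np.allclose(y.mean(axis=0), 0.0, atol=1e-4))  # → True
```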

In [23]:
pred_basic_model_bn = train_and_evaluate(reader_train, 
                                         reader_test, 
                                         max_epochs=5, 
                                         model_func=create_basic_model_with_batch_normalization)


Training 117290 parameters in 18 parameter tensors.

Learning rate per minibatch: 0.01
Momentum per sample: 0.9983550962823424
Finished Epoch[1 of 5]: [Training] loss = 1.536584 * 50000, metric = 55.22% * 50000 12.978s (3852.7 samples/s);
Finished Epoch[2 of 5]: [Training] loss = 1.215455 * 50000, metric = 43.35% * 50000 12.196s (4099.7 samples/s);
Finished Epoch[3 of 5]: [Training] loss = 1.092067 * 50000, metric = 38.66% * 50000 12.260s (4078.3 samples/s);
Finished Epoch[4 of 5]: [Training] loss = 1.011021 * 50000, metric = 35.57% * 50000 12.330s (4055.2 samples/s);
Finished Epoch[5 of 5]: [Training] loss = 0.952613 * 50000, metric = 33.38% * 50000 12.286s (4069.7 samples/s);

Final Results: Minibatch[1-626]: errs = 30.1% * 10000

Let's implement a VGG-inspired network using the layers API. Here is the architecture:

VGG9
conv3-64
conv3-64
max3
conv3-96
conv3-96
max3
conv3-128
conv3-128
max3
FC-1024
FC-1024
FC-10

In [24]:
def create_vgg9_model(input, out_dims):
    with C.layers.default_options(activation=C.relu, init=C.glorot_uniform()):
        model = C.layers.Sequential([
            C.layers.For(range(3), lambda i: [
                C.layers.Convolution((3,3), [64,96,128][i], pad=True),
                C.layers.Convolution((3,3), [64,96,128][i], pad=True),
                C.layers.MaxPooling((3,3), strides=(2,2))
            ]),
            C.layers.For(range(2), lambda: [
                C.layers.Dense(1024)
            ]),
            C.layers.Dense(out_dims, activation=None)
        ])
        
    return model(input)
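As a sanity check on the architecture, the parameter count that training reports (2,675,978) can be reproduced by hand. A small sketch of the bookkeeping, assuming the 32 x 32 x 3 CIFAR input and the unpadded 3x3/stride-2 pooling used above:

```python
# Reproduce the VGG9 parameter count for a 32x32x3 CIFAR input.
# Conv layer: k*k*in_ch*out_ch weights + out_ch biases.
# MaxPooling((3,3), strides=(2,2)) without padding: s -> (s - 3)//2 + 1.
def conv_params(k, cin, cout):
    return k * k * cin * cout + cout

total, s, cin = 0, 32, 3
for cout in [64, 96, 128]:
    total += conv_params(3, cin, cout) + conv_params(3, cout, cout)
    cin = cout
    s = (s - 3) // 2 + 1                 # 32 -> 15 -> 7 -> 3
total += s * s * 128 * 1024 + 1024       # FC-1024 (flattened 3x3x128 input)
total += 1024 * 1024 + 1024              # FC-1024
total += 1024 * 10 + 10                  # FC-10
print(total)  # → 2675978
```

Note how the first fully connected layer dominates the count; this is typical of VGG-style networks.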

In [25]:
pred_vgg = train_and_evaluate(reader_train, 
                              reader_test, 
                              max_epochs=5, 
                              model_func=create_vgg9_model)


Training 2675978 parameters in 18 parameter tensors.

Learning rate per minibatch: 0.01
Momentum per sample: 0.9983550962823424
Finished Epoch[1 of 5]: [Training] loss = 2.267064 * 50000, metric = 84.67% * 50000 18.672s (2677.8 samples/s);
Finished Epoch[2 of 5]: [Training] loss = 1.877782 * 50000, metric = 69.81% * 50000 12.578s (3975.2 samples/s);
Finished Epoch[3 of 5]: [Training] loss = 1.689757 * 50000, metric = 63.07% * 50000 12.729s (3928.0 samples/s);
Finished Epoch[4 of 5]: [Training] loss = 1.564912 * 50000, metric = 57.57% * 50000 12.536s (3988.5 samples/s);
Finished Epoch[5 of 5]: [Training] loss = 1.475126 * 50000, metric = 53.79% * 50000 13.171s (3796.2 samples/s);

Final Results: Minibatch[1-626]: errs = 50.1% * 10000

Residual Network (ResNet)

One of the main problems of deep neural networks is propagating the error signal all the way back to the first layers. In a deep network, the gradients keep getting smaller until they have almost no effect on the network weights. ResNet was designed to overcome this problem by defining a block with an identity path, as shown below:


In [26]:
# Figure 7
Image(url="https://cntk.ai/jup/201/ResNetBlock2.png")


Out[26]:

The idea of the above block is twofold:

  • During back propagation the gradients have a path that does not diminish their magnitude.
  • The network only needs to learn the residual mapping (the delta to x).
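In other words, if the desired mapping is H(x), the block only has to learn the residual F(x) = H(x) - x and outputs relu(F(x) + x). A toy numpy illustration (the block function here is a stand-in for intuition, not the CNTK implementation):

```python
import numpy as np

# Toy residual block: output = relu(F(x) + x). If the optimal mapping is
# near the identity, F only needs to produce a small delta (weights near
# zero), which is easier to learn than the full mapping H(x).
def residual_block(x, F):
    return np.maximum(F(x) + x, 0.0)

x = np.array([1.0, 2.0, 3.0])
y = residual_block(x, lambda v: np.zeros_like(v))  # F == 0 -> identity
print(y)  # → [1. 2. 3.]
```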

So let's implement the ResNet blocks using CNTK:

        ResNetNode                   ResNetNodeInc
            |                              |
     +------+------+             +---------+----------+
     |             |             |                    |
     V             |             V                    V
+----------+       |      +--------------+   +----------------+
| Conv, BN |       |      | Conv x 2, BN |   | SubSample, BN  |
+----------+       |      +--------------+   +----------------+
     |             |             |                    |
     V             |             V                    |
 +-------+         |         +-------+                |
 | ReLU  |         |         | ReLU  |                |
 +-------+         |         +-------+                |
     |             |             |                    |
     V             |             V                    |
+----------+       |        +----------+              |
| Conv, BN |       |        | Conv, BN |              |
+----------+       |        +----------+              |
     |             |             |                    |
     |    +---+    |             |       +---+        |
     +--->| + |<---+             +------>| + |<-------+
          +---+                          +---+
            |                              |
            V                              V
        +-------+                      +-------+
        | ReLU  |                      | ReLU  |
        +-------+                      +-------+
            |                              |
            V                              V

In [27]:
def convolution_bn(input, filter_size, num_filters, strides=(1,1), init=C.he_normal(), activation=C.relu):
    if activation is None:
        activation = lambda x: x  # identity: skip the nonlinearity
        
    # Convolution without bias, since BatchNormalization supplies the shift
    r = C.layers.Convolution(filter_size, 
                             num_filters, 
                             strides=strides, 
                             init=init, 
                             activation=None, 
                             pad=True, bias=False)(input)
    r = C.layers.BatchNormalization(map_rank=1)(r)
    r = activation(r)
    
    return r

def resnet_basic(input, num_filters):
    # Two 3x3 convolutions; the second omits the activation so that
    # ReLU is applied only after adding the skip connection.
    c1 = convolution_bn(input, (3,3), num_filters)
    c2 = convolution_bn(c1, (3,3), num_filters, activation=None)
    p  = c2 + input   # identity skip connection
    return C.relu(p)

def resnet_basic_inc(input, num_filters):
    # Downsampling block: the first convolution halves the spatial
    # resolution with strides=(2,2).
    c1 = convolution_bn(input, (3,3), num_filters, strides=(2,2))
    c2 = convolution_bn(c1, (3,3), num_filters, activation=None)

    # 1x1 projection ("SubSample" in the figure above) so the skip
    # connection matches the new spatial size and filter count
    s = convolution_bn(input, (1,1), num_filters, strides=(2,2), activation=None)
    
    p = c2 + s
    return C.relu(p)

def resnet_basic_stack(input, num_filters, num_stack):
    # Chain num_stack identical resnet_basic blocks
    assert (num_stack > 0)
    
    r = input
    for _ in range(num_stack):
        r = resnet_basic(r, num_filters)
    return r

Let's write the full model:


In [28]:
def create_resnet_model(input, out_dims):
    conv = convolution_bn(input, (3,3), 16)
    r1_1 = resnet_basic_stack(conv, 16, 3)      # 16 filters, 32 x 32

    r2_1 = resnet_basic_inc(r1_1, 32)           # downsample to 16 x 16
    r2_2 = resnet_basic_stack(r2_1, 32, 2)

    r3_1 = resnet_basic_inc(r2_2, 64)           # downsample to 8 x 8
    r3_2 = resnet_basic_stack(r3_1, 64, 2)

    # Global average pooling over the remaining 8 x 8 feature maps
    pool = C.layers.AveragePooling(filter_shape=(8,8), strides=(1,1))(r3_2)    
    net = C.layers.Dense(out_dims, init=C.he_normal(), activation=None)(pool)
    
    return net
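
As a sanity check on the (8,8) average-pooling window: CIFAR images are 32 x 32, and each of the two resnet_basic_inc blocks halves the spatial resolution. A quick pure-Python sketch of that arithmetic (the helper name is ours):

```python
def after_stride(size, stride):
    # With 'same' padding (pad=True), the output size is ceil(size / stride)
    return -(-size // stride)

size = 32                     # CIFAR-10 spatial resolution
size = after_stride(size, 2)  # first resnet_basic_inc  -> 16
size = after_stride(size, 2)  # second resnet_basic_inc -> 8
print(size)                   # -> 8, matching AveragePooling((8,8))
```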

In [29]:
pred_resnet = train_and_evaluate(reader_train, reader_test, max_epochs=5, model_func=create_resnet_model)


Training 272474 parameters in 65 parameter tensors.

Learning rate per minibatch: 0.01
Momentum per sample: 0.9983550962823424
Finished Epoch[1 of 5]: [Training] loss = 1.895607 * 50000, metric = 70.00% * 50000 24.547s (2036.9 samples/s);
Finished Epoch[2 of 5]: [Training] loss = 1.594962 * 50000, metric = 59.18% * 50000 21.075s (2372.5 samples/s);
Finished Epoch[3 of 5]: [Training] loss = 1.456406 * 50000, metric = 53.31% * 50000 21.631s (2311.5 samples/s);
Finished Epoch[4 of 5]: [Training] loss = 1.354717 * 50000, metric = 49.36% * 50000 20.848s (2398.3 samples/s);
Finished Epoch[5 of 5]: [Training] loss = 1.275108 * 50000, metric = 45.98% * 50000 21.164s (2362.5 samples/s);

Final Results: Minibatch[1-626]: errs = 43.9% * 10000
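
The "272474 parameters in 65 parameter tensors" figure can be reproduced by hand from the model definition. This is our own back-of-the-envelope count, assuming the convolutions carry no bias (bias=False), each BatchNormalization contributes a trainable scale and bias per channel, and the BN running statistics are not counted as trainable:

```python
# Hand count of trainable parameters in create_resnet_model
def conv_params(k, c_in, c_out):
    # k x k kernel, no bias
    return k * k * c_in * c_out

convs = (
    conv_params(3, 3, 16)            # initial convolution
    + 6 * conv_params(3, 16, 16)     # r1: 3 basic blocks x 2 convs
    + conv_params(3, 16, 32) + conv_params(3, 32, 32) + conv_params(1, 16, 32)  # r2_1 inc
    + 4 * conv_params(3, 32, 32)     # r2_2: 2 basic blocks
    + conv_params(3, 32, 64) + conv_params(3, 64, 64) + conv_params(1, 32, 64)  # r3_1 inc
    + 4 * conv_params(3, 64, 64)     # r3_2: 2 basic blocks
)
# BN scale + bias for each of the 21 convolutions, per output channel
bn = 2 * (16 + 6*16 + 3*32 + 4*32 + 3*64 + 4*64)
dense = 64 * 10 + 10                 # global pooling leaves 64 features
print(convs + bn + dense)            # -> 272474
```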


In [ ]: