Training and Inference Module

We modularized commonly used code for training and inference in the module (or mod for short) package. This package provides intermediate-level and high-level interfaces for executing predefined networks.

Basic Usage

Preliminary

In this tutorial, we will use a simple multilayer perceptron with 10 output classes and a synthetic dataset.


In [1]:
import mxnet as mx
from data_iter import SyntheticData

# mlp
net = mx.sym.Variable('data')
net = mx.sym.FullyConnected(net, name='fc1', num_hidden=64)
net = mx.sym.Activation(net, name='relu1', act_type="relu")
net = mx.sym.FullyConnected(net, name='fc2', num_hidden=10)
net = mx.sym.SoftmaxOutput(net, name='softmax')
# synthetic 10 classes dataset with 128 dimension 
data = SyntheticData(10, 128)
mx.viz.plot_network(net)


Out[1]:
[Network plot: data → fc1 (FullyConnected, 64) → relu1 (Activation, relu) → fc2 (FullyConnected, 10) → softmax (SoftmaxOutput, with softmax_label)]
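
The SyntheticData helper comes from the data_iter.py file that accompanies this notebook. If that file is not available, a minimal stand-in built on mx.io.NDArrayIter could look like the following sketch; this is an assumption for convenience, not the original helper.

import numpy as np

class SyntheticData(object):
    """Hypothetical stand-in: random samples for `num_classes` classes of dimension `dim`."""
    def __init__(self, num_classes, dim, num_samples=320):
        self.num_classes = num_classes
        self.dim = dim
        # 320 samples gives 10 batches of 32, matching the outputs shown below
        self.num_samples = num_samples

    def get_iter(self, batch_size):
        data = np.random.uniform(-1, 1, (self.num_samples, self.dim))
        label = np.random.randint(0, self.num_classes, (self.num_samples,))
        return mx.io.NDArrayIter(data, label, batch_size, shuffle=True,
                                 label_name='softmax_label')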

Create Module

The most widely used module class is Module, which wraps a Symbol and one or more Executors.

We construct a module by specifying:

  • symbol : the network Symbol
  • context : the device (or a list of devices) for execution
  • data_names : the list of data variable names
  • label_names : the list of label variable names

One can refer to data.ipynb for more explanation of the last two arguments. Here we have only one data input, named data, and one label, named softmax_label, which was automatically generated for us from the name softmax we gave to the SoftmaxOutput operator.


In [2]:
mod = mx.mod.Module(symbol=net, 
                    context=mx.cpu(),
                    data_names=['data'], 
                    label_names=['softmax_label'])

Train, Predict, and Evaluate

Modules provide high-level APIs for training, predicting and evaluating. To fit a module, simply call the fit function with some DataIters.


In [3]:
# @@@ AUTOTEST_OUTPUT_IGNORED_CELL
import logging
logging.basicConfig(level=logging.INFO)

batch_size=32
mod.fit(data.get_iter(batch_size), 
        eval_data=data.get_iter(batch_size),
        optimizer='sgd',
        optimizer_params={'learning_rate':0.1},
        eval_metric='acc',
        num_epoch=5)


INFO:root:Epoch[0] Train-accuracy=0.100000
INFO:root:Epoch[0] Time cost=0.171
INFO:root:Epoch[0] Validation-accuracy=0.087500
INFO:root:Epoch[1] Train-accuracy=0.146875
INFO:root:Epoch[1] Time cost=0.074
INFO:root:Epoch[1] Validation-accuracy=0.078125
INFO:root:Epoch[2] Train-accuracy=0.165625
INFO:root:Epoch[2] Time cost=0.061
INFO:root:Epoch[2] Validation-accuracy=0.081250
INFO:root:Epoch[3] Train-accuracy=0.118750
INFO:root:Epoch[3] Time cost=0.046
INFO:root:Epoch[3] Validation-accuracy=0.203125
INFO:root:Epoch[4] Train-accuracy=0.243750
INFO:root:Epoch[4] Time cost=0.099
INFO:root:Epoch[4] Validation-accuracy=0.146875

To predict with a module, simply call predict() with a DataIter. It will collect and return all the prediction results.


In [4]:
y = mod.predict(data.get_iter(batch_size))
'shape of predict: %s' % (y.shape,)


Out[4]:
'shape of predict: (320, 10)'

When the prediction results might be too large to fit in memory, the iter_predict API is more convenient:


In [5]:
# @@@ AUTOTEST_OUTPUT_IGNORED_CELL
for preds, i_batch, batch in mod.iter_predict(data.get_iter(batch_size)):
    pred_label = preds[0].asnumpy().argmax(axis=1)
    label = batch.label[0].asnumpy().astype('int32')
    print('batch %d, accuracy %f' % (i_batch, float(sum(pred_label==label))/len(label)))


batch 0, accuracy 0.156250
batch 1, accuracy 0.125000
batch 2, accuracy 0.343750
batch 3, accuracy 0.125000
batch 4, accuracy 0.031250
batch 5, accuracy 0.031250
batch 6, accuracy 0.187500
batch 7, accuracy 0.250000
batch 8, accuracy 0.125000
batch 9, accuracy 0.187500

If we do not need the prediction outputs, but just need to evaluate on a test set, we can call the score() function with a DataIter and an EvalMetric:


In [6]:
# @@@ AUTOTEST_OUTPUT_IGNORED_CELL
mod.score(data.get_iter(batch_size), ['mse', 'acc'])


Out[6]:
[('mse', 27.860061836242675), ('accuracy', 0.096875000000000003)]
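
score() accepts any EvalMetric, not just built-in metric names. The following is a hedged sketch of scoring with a custom top-2 accuracy, assuming mx.metric.CustomMetric, which wraps a plain Python function taking the label and prediction arrays; the helper below is illustrative and not part of the original notebook.

import numpy as np

def top2_accuracy(label, pred):
    # fraction of samples whose true class is among the two highest-scoring classes
    top2 = np.argsort(pred, axis=1)[:, -2:]
    return np.mean([l in t for l, t in zip(label.astype('int32'), top2)])

# CustomMetric and its exact behavior are assumptions here
metric = mx.metric.CustomMetric(top2_accuracy, name='top2_acc')
mod.score(data.get_iter(batch_size), metric)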

Save and Load

We can save the module parameters in each training epoch by using a checkpoint callback.


In [7]:
# @@@ AUTOTEST_OUTPUT_IGNORED_CELL
# construct a callback function to save checkpoints
model_prefix = 'mx_mlp'
checkpoint = mx.callback.do_checkpoint(model_prefix)

mod = mx.mod.Module(symbol=net)
mod.fit(data.get_iter(batch_size), num_epoch=5, epoch_end_callback=checkpoint)


INFO:root:Epoch[0] Train-accuracy=0.140625
INFO:root:Epoch[0] Time cost=0.052
INFO:root:Saved checkpoint to "mx_mlp-0001.params"
INFO:root:Epoch[1] Train-accuracy=0.087500
INFO:root:Epoch[1] Time cost=0.090
INFO:root:Saved checkpoint to "mx_mlp-0002.params"
INFO:root:Epoch[2] Train-accuracy=0.159375
INFO:root:Epoch[2] Time cost=0.043
INFO:root:Saved checkpoint to "mx_mlp-0003.params"
INFO:root:Epoch[3] Train-accuracy=0.084375
INFO:root:Epoch[3] Time cost=0.043
INFO:root:Saved checkpoint to "mx_mlp-0004.params"
INFO:root:Epoch[4] Train-accuracy=0.093750
INFO:root:Epoch[4] Time cost=0.038
INFO:root:Saved checkpoint to "mx_mlp-0005.params"

To load the saved module parameters, call the load_checkpoint function. It loads the Symbol and the associated parameters. We can then set the loaded parameters into the module.


In [8]:
sym, arg_params, aux_params = mx.model.load_checkpoint(model_prefix, 3)
print(sym.tojson() == net.tojson())

# assign the loaded parameters to the module
mod.set_params(arg_params, aux_params)


True

Or, if we just want to resume training from a saved checkpoint, instead of calling set_params() we can directly call fit(), passing the loaded parameters, so that fit() knows to start from those parameters instead of initializing randomly. We also set begin_epoch so that fit() knows we are resuming from a previously saved epoch.


In [9]:
# @@@ AUTOTEST_OUTPUT_IGNORED_CELL
mod = mx.mod.Module(symbol=sym)
mod.fit(data.get_iter(batch_size),
        num_epoch=5,
        arg_params=arg_params, 
        aux_params=aux_params,
        begin_epoch=3)


INFO:root:Epoch[3] Train-accuracy=0.078125
INFO:root:Epoch[3] Time cost=0.052
INFO:root:Epoch[4] Train-accuracy=0.118750
INFO:root:Epoch[4] Time cost=0.042
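
If the checkpoint was written by the callback above, the loading and resuming steps can also be collapsed into a single call. The following is a hedged sketch assuming the mx.mod.Module.load(prefix, epoch) convenience constructor behaves as described.

# construct a module directly from the checkpoint saved at epoch 3
mod = mx.mod.Module.load(model_prefix, 3)
mod.fit(data.get_iter(batch_size),
        num_epoch=5,
        begin_epoch=3)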

Module as a computation "machine"

We have already seen how to use a module for basic training and inference. Now we will show a more flexible usage of the module API.

A module represents a computation component. A module is designed to abstract a computation "machine" that accepts Symbol programs and data, on which we can run forward and backward passes, update parameters, and so on.

We aim to make the APIs easy and flexible to use, especially when we need to use the imperative API to work with multiple modules (e.g. a stochastic depth network).

A module has several states:

  • Initial state. Memory is not allocated yet; the module is not ready for computation.
  • Binded. The shapes of inputs, outputs, and parameters are all known; memory is allocated and the module is ready for computation.
  • Parameter initialized. For modules with parameters, doing computation before initializing the parameters might result in undefined outputs.
  • Optimizer installed. An optimizer can be installed on a module. After this, the parameters of the module can be updated according to the optimizer after gradients are computed (forward-backward).
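
These states can be inspected through boolean flags on the module (listed in detail further below). As a small sketch of the state transitions, assuming the same net and data as above:

mod = mx.mod.Module(symbol=net)
print(mod.binded, mod.params_initialized)   # False False -- initial state

train_iter = data.get_iter(batch_size)
mod.bind(data_shapes=train_iter.provide_data,
         label_shapes=train_iter.provide_label)
print(mod.binded)                           # True -- memory allocated

mod.init_params()
print(mod.params_initialized)               # True -- parameters initialized

mod.init_optimizer(optimizer='sgd')
print(mod.optimizer_initialized)            # True -- ready for forward/backward/update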

The following code implements a simplified fit(). Here we use other components, including the initializer, optimizer, and metric, which are explained in other notebooks.


In [10]:
# @@@ AUTOTEST_OUTPUT_IGNORED_CELL
# initial state
mod = mx.mod.Module(symbol=net)

# bind, tell the module the data and label shapes, so
# that memory could be allocated on the devices for computation
train_iter = data.get_iter(batch_size)
mod.bind(data_shapes=train_iter.provide_data, label_shapes=train_iter.provide_label)

# init parameters
mod.init_params(initializer=mx.init.Xavier(magnitude=2.))

# init optimizer
mod.init_optimizer(optimizer='sgd', optimizer_params=(('learning_rate', 0.1), ))

# use accuracy as the metric
metric = mx.metric.create('acc')

# train one epoch, i.e. going over the data iter one pass
for batch in train_iter:
    mod.forward(batch, is_train=True)       # compute predictions
    mod.update_metric(metric, batch.label)  # accumulate prediction accuracy
    mod.backward()                          # compute gradients
    mod.update()                            # update parameters using SGD
    
# training accuracy
print(metric.get())


('accuracy', 0.46875)

Besides these operations, a module provides a lot of useful information.

basic names:

  • data_names: a list of strings giving the names of the required data.
  • output_names: a list of strings giving the names of the outputs.

state information

  • binded: bool, indicating whether the memory buffers needed for computation have been allocated.
  • for_training: whether the module is binded for training (if binded).
  • params_initialized: bool, indicating whether the parameters of this module have been initialized.
  • optimizer_initialized: bool, indicating whether an optimizer is defined and initialized.
  • inputs_need_grad: bool, indicating whether gradients with respect to the input data are needed. Might be useful when implementing composition of modules.

input/output information

  • data_shapes: a list of (name, shape). In theory, since the memory is allocated, we could directly provide the data arrays. But in the case of data parallelization, the data arrays might not be of the same shape as viewed from the external world.
  • label_shapes: a list of (name, shape). This might be [] if the module does not need labels (e.g. it does not contain a loss function at the top), or if the module is not binded for training.
  • output_shapes: a list of (name, shape) for outputs of the module.

parameters (for modules with parameters)

  • get_params(): returns a tuple (arg_params, aux_params). Each is a dictionary mapping names to NDArrays. These NDArrays always live on the CPU, while the actual parameters used for computation might live on other devices (GPUs); this function retrieves (a copy of) the latest parameters.
  • get_outputs(): get outputs of the previous forward operation.
  • get_input_grads(): get the gradients with respect to the inputs computed in the previous backward operation.

In [11]:
print((mod.data_shapes, mod.label_shapes, mod.output_shapes))
print(mod.get_params())


([DataDesc[data,(32, 128),<class 'numpy.float32'>,NCHW]], [DataDesc[softmax_label,(32,),<class 'numpy.float32'>,NCHW]], [('softmax_output', (32, 10))])
({'fc1_weight': 
[[ 0.0044282  -0.14036682  0.09225076 ..., -0.10437089  0.06913053
   0.02976805]
 ..., 
 [ 0.07553843  0.01484931 -0.01948447 ...,  0.02386812  0.06220835
   0.06169618]]
<NDArray 64x128 @cpu(0)>, 'fc1_bias': 
[  4.73744934e-03   1.02726917e-04   5.61462762e-03  ...,
  -3.02575901e-03  -1.73519216e-02]
<NDArray 64 @cpu(0)>, 'fc2_weight': 
[[ -3.72178480e-02  -1.91981658e-01   1.75340399e-01  ...,
    2.15822950e-01   6.76635057e-02]
 ..., 
 [  4.39895540e-02   1.06624864e-01  -1.58235207e-01  ...,
    1.38419122e-01  -6.04576357e-02]]
<NDArray 10x64 @cpu(0)>, 'fc2_bias': 
[-0.01622528 -0.00313228 -0.00763383  0.00567062 -0.00719109 -0.00111698
  0.01943333  0.00286393  0.02434839 -0.0170168 ]
<NDArray 10 @cpu(0)>}, {})

More on Modules

The module API simplifies the implementation of new modules. For example:

  • SequentialModule can chain multiple modules together (see the sketch below)
  • BucketingModule handles bucketing, which is useful for variable-length inputs and outputs
  • PythonModule implements many APIs as empty functions to make it easier to implement customized modules.
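
As a hedged sketch of chaining, the network from this tutorial could be split in two and driven by a SequentialModule roughly as follows; the take_labels and auto_wiring options are assumptions about the chaining behavior rather than something demonstrated in this notebook.

# first part: feature extraction, no loss, hence no label names
feat = mx.sym.FullyConnected(mx.sym.Variable('data'), name='fc1', num_hidden=64)
feat = mx.sym.Activation(feat, name='relu1', act_type='relu')
mod1 = mx.mod.Module(feat, label_names=None)

# second part: classifier ending in the softmax loss
clf = mx.sym.FullyConnected(mx.sym.Variable('data'), name='fc2', num_hidden=10)
clf = mx.sym.SoftmaxOutput(clf, name='softmax')
mod2 = mx.mod.Module(clf)

# chain the two modules: outputs of mod1 are wired into the data of mod2,
# and the labels are consumed by mod2
seq_mod = mx.mod.SequentialModule()
seq_mod.add(mod1).add(mod2, take_labels=True, auto_wiring=True)
seq_mod.fit(data.get_iter(batch_size), num_epoch=2)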

See also example/module for a list of code examples using the module API.

Implementation

The module is implemented in Python, located at python/mxnet/module.

Further Readings