Using RRBM: Examples of RBM networks

  1. using the compiled (GSL) library functions
  2. using the native R functions

Load Libraries


In [1]:
library(rrbm);

RBMs: the MNIST example

Load Dataset

  • Previously, the dataset had to be loaded from an RDS file (with readRDS).
  • Now it is included in the package datasets as "mnist".

In [2]:
# mnist <- readRDS("../datasets/mnist.rds");
data(mnist)

In [3]:
training_x <- mnist$train$x / 255;
training_y <- mnist$train$y;

testing_x <- mnist$test$x / 255;
testing_y <- mnist$test$y;
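
Dividing by 255 rescales the pixel intensities from [0, 255] to [0, 1], the range typically expected by sigmoid visible units. A quick sanity check of the normalized data (a minimal sketch, assuming the images are stored as an observations-by-pixels matrix and the labels as a vector):

# Check shapes and value range after normalization
dim(training_x);    # e.g. 60000 x 784 for the full MNIST training set
range(training_x);  # should lie within [0, 1]
length(training_y); # one label per image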

Let's reduce the dataset size for this example


In [4]:
training_x <- training_x[1:1000,, drop=FALSE];
training_y <- training_y[1:1000, drop=FALSE];

testing_x <- testing_x[1:1000,, drop=FALSE];
testing_y <- testing_y[1:1000, drop=FALSE];

Train the RBM


In [5]:
rbm_mnist <- train.rbm(n_hidden = 30,
                       dataset = training_x,
                       learning_rate = 1e-3,
                       training_epochs = 10,
                       batch_size = 10,
                       momentum = 0.5
);
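
The returned object holds the learned parameters (weights and biases); its exact structure depends on the package version, but it can always be inspected with str():

# Inspect the trained RBM object
str(rbm_mnist);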

Predict using the RBM

There are 3 methods: predict.rbm, forward.rbm and backward.rbm

  • predict.rbm: passes the dataset forward and backward through the RBM

In [6]:
result <- predict(rbm_mnist, training_x);
str(result);


List of 2
 $ reconstruction: num [1:1000, 1:784] 2.03e-02 2.25e-03 6.84e-05 2.84e-03 2.59e-04 ...
 $ activation    : num [1:1000, 1:30] 0.836 0.812 0.652 0.741 0.775 ...
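
Because predict() returns both the activations and the reconstruction of the input, a simple way to quantify how well the RBM encodes the data is the mean squared reconstruction error. A minimal sketch using only the fields shown above:

# Mean squared error between the input and its reconstruction
mean((training_x - result$reconstruction)^2);
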
  • forward.rbm: passes the dataset forward only, returning just the activations

In [7]:
act1 <- forward.rbm(rbm_mnist, training_x);
str(act1);


 num [1:1000, 1:30] 0.836 0.812 0.652 0.741 0.775 ...
  • backward.rbm: passes a set of activations backward through the RBM, returning the reconstructions

In [8]:
recons1 <- backward.rbm(rbm_mnist, act1);
str(recons1);


 num [1:1000, 1:784] -8.87e-05 -1.47e-04 2.26e-03 6.55e-04 -6.70e-04 ...
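
To inspect a reconstruction visually, each row can be reshaped back into a 28 x 28 image and plotted next to the original digit. A minimal sketch (the orientation may need adjusting depending on how the pixels are stored):

# Plot a length-784 vector as a 28 x 28 greyscale image
show_digit <- function(v) {
  image(matrix(v, nrow = 28), col = grey.colors(256), axes = FALSE);
}
par(mfrow = c(1, 2));
show_digit(training_x[1, ]);  # original digit
show_digit(recons1[1, ]);     # its reconstruction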

Update / Re-train an RBM

You can pass a trained RBM as the initial values for a new training run. The properties of the new RBM must match those of the RBM passed as init_rbm. The function returns a new, updated copy of the old RBM.


In [9]:
rbm_mnist_update <- train.rbm(n_hidden = 30,
                              dataset = training_x,
                              learning_rate = 1e-3,
                              training_epochs = 10,
                              batch_size = 10,
                              momentum = 0.5,
                              init_rbm = rbm_mnist
);
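
To check whether the extra epochs actually improved the model, the reconstruction error of the updated RBM can be compared against that of the original one, reusing predict() as above (assuming the updated RBM has the same class as the original):

# Compare reconstruction error before and after the additional training
result_update <- predict(rbm_mnist_update, training_x);
mean((training_x - result$reconstruction)^2);         # original RBM
mean((training_x - result_update$reconstruction)^2);  # updated RBM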

Using the native R functions


In [10]:
rm(list = ls());

Load the R sources


In [11]:
setwd("..");
source("./rbm.R");
setwd("./notebooks");

Load Dataset


In [12]:
# mnist <- readRDS("../datasets/mnist.rds");
data(mnist)

In [13]:
training_x <- mnist$train$x / 255;
training_y <- mnist$train$y;

testing_x <- mnist$test$x / 255;
testing_y <- mnist$test$y;

In [14]:
training_x <- training_x[1:1000,, drop=FALSE];
training_y <- training_y[1:1000, drop=FALSE];

testing_x <- testing_x[1:1000,, drop=FALSE];
testing_y <- testing_y[1:1000, drop=FALSE];

Train the RBM


In [15]:
rbm_mnist <- train_rbm(n_hidden = 30,
                       dataset = training_x,
                       learning_rate = 1e-3,
                       training_epochs = 10,
                       batch_size = 10,
                       momentum = 0.5
);


[1] "Training epoch 1, cost is 58.008781882777"
[1] "Training epoch 2, cost is 51.2308940863924"
[1] "Training epoch 3, cost is 51.1911011949171"
[1] "Training epoch 4, cost is 51.1451888321765"
[1] "Training epoch 5, cost is 51.1976890769631"
[1] "Training epoch 6, cost is 51.1386957235904"
[1] "Training epoch 7, cost is 51.1108288625722"
[1] "Training epoch 8, cost is 51.054386851713"
[1] "Training epoch 9, cost is 51.0856545143375"
[1] "Training epoch 10, cost is 51.0706292473888"
[1] "Training took 1.03504323959351"

Predict using the RBM

There are 3 methods: predict_rbm, forward_rbm and backward_rbm

  • predict_rbm: passes the dataset forward and backward through the RBM (in the native version, predict_rbm is not an S3 method of predict, so it is called directly)

In [16]:
result <- predict_rbm(rbm_mnist, training_x);
str(result);


List of 2
 $ activations   : num [1:1000, 1:30] 0.806 0.791 0.605 0.779 0.818 ...
 $ reconstruction: num [1:1000, 1:784] -0.003166 0.000929 -0.001115 0.006736 -0.003092 ...
  • forward_rbm: passes the dataset forward only, returning just the activations

In [17]:
act1 <- forward_rbm(rbm_mnist, training_x);
str(act1);


 num [1:1000, 1:30] 0.806 0.791 0.605 0.779 0.818 ...
  • backward_rbm: passes a set of activations backward through the RBM, returning the reconstructions

In [18]:
recons1 <- backward_rbm(rbm_mnist, act1);
str(recons1);


 num [1:1000, 1:784] -0.003166 0.000929 -0.001115 0.006736 -0.003092 ...
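
As the identical values above suggest, forward_rbm followed by backward_rbm reproduces what predict_rbm returned, which can be verified directly:

# The two-step pass matches the output of predict_rbm
all.equal(act1, result$activations);
all.equal(recons1, result$reconstruction);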