Exercises

To get you started with Theano and Lasagne, here are some tasks which you can try at home. Please also see the Theano tutorial (http://deeplearning.net/software/theano/tutorial/) and the Lasagne docs (http://lasagne.readthedocs.org/en/latest/index.html).

Understand Theano

The Theano tutorial linked above is a great way to get to know Theano. I can highly recommend the following articles; a short warm-up sketch follows the list.

Logistic Function: http://deeplearning.net/software/theano/tutorial/examples.html#logistic-function

Computing More than one Thing at the Same Time: http://deeplearning.net/software/theano/tutorial/examples.html#computing-more-than-one-thing-at-the-same-time

Using Shared Variables: http://deeplearning.net/software/theano/tutorial/examples.html#using-shared-variables

Computing Gradients: http://deeplearning.net/software/theano/tutorial/gradients.html#tutcomputinggrads

Classifying MNIST digits using Logistic Regression: http://www.deeplearning.net/tutorial/logreg.html

Multi-Layer Perceptron: http://www.deeplearning.net/tutorial/mlp.html#mlp
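
As a quick warm-up, here is a minimal sketch (the variable names are my own, not taken from the linked articles) that touches three of the recommended topics: the logistic function, shared variables, and computing gradients.

```python
import numpy as np
import theano
import theano.tensor as T

# Logistic function, applied elementwise to a matrix.
x = T.dmatrix('x')
logistic = theano.function([x], 1 / (1 + T.exp(-x)))
print(logistic([[0, 1], [-1, -2]]))

# A shared variable keeps its state between function calls.
state = theano.shared(0, name='state')
inc = T.iscalar('inc')
accumulate = theano.function([inc], state, updates=[(state, state + inc)])
accumulate(1)
accumulate(10)
print(state.get_value())  # -> 11

# Symbolic differentiation: d(y^2)/dy evaluated at y = 4.
y = T.dscalar('y')
grad_fn = theano.function([y], T.grad(y ** 2, y))
print(grad_fn(4.0))  # -> 8.0
```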

Understand the MNIST Example

Open the MNIST example and try to understand and modify it.

Task: Change the hyperparameters for the number of hidden units and the learning rate. How do the accuracy and the training time change?

Task: Change the mini-batch size. What impact does it have?
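
If the example does not already expose the batch size as a single variable, a generic mini-batch loop looks like the following sketch; X_train, y_train, and train_fn are placeholders for the example's own data arrays and compiled training function, not names from the provided code.

```python
def iterate_minibatches(X, y, batch_size):
    # Yield consecutive, equally sized slices of the training data.
    for start in range(0, len(X) - batch_size + 1, batch_size):
        yield X[start:start + batch_size], y[start:start + batch_size]

batch_size = 20  # try e.g. 1, 20, 100, 600
# for X_batch, y_batch in iterate_minibatches(X_train, y_train, batch_size):
#     train_fn(X_batch, y_batch)
```

Very small batches give noisy but frequent parameter updates; very large batches make each epoch cheaper per example but update the parameters less often.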

Task: Change the activation function from tanh to sigmoid.
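
In Theano this is typically a one-line change in the hidden-layer expression. The sketch below uses made-up names (W, b); in the actual example the swap happens wherever T.tanh is applied.

```python
import numpy as np
import theano
import theano.tensor as T

rng = np.random.RandomState(0)
x = T.dmatrix('x')
W = theano.shared(rng.randn(784, 300) * 0.01, name='W')
b = theano.shared(np.zeros(300), name='b')

hidden_tanh = T.tanh(T.dot(x, W) + b)             # original activation
hidden_sigmoid = T.nnet.sigmoid(T.dot(x, W) + b)  # replacement
```

Note that sigmoid outputs lie in (0, 1) instead of (-1, 1), so expect different training dynamics and possibly a different useful learning rate.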

Task: In the current implementation the learning rate is fixed. Try a decreasing learning rate: start in the first epoch with e.g. 0.1, then decrease it stepwise to 0.01. Hint: your Theano computation graph needs an additional variable which captures the learning rate.
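
One way to do this is to make the learning rate a symbolic input of the training function instead of a Python constant baked into the updates. The sketch below is self-contained with a stand-in cost; in the example you would reuse its existing cost expression and parameter list.

```python
import numpy as np
import theano
import theano.tensor as T

x = T.dmatrix('x')
W = theano.shared(np.zeros((784, 10)), name='W')
cost = T.sum(T.dot(x, W) ** 2)  # stand-in cost for illustration
params = [W]

lr = T.dscalar('lr')  # the additional variable in the computation graph
updates = [(p, p - lr * T.grad(cost, p)) for p in params]
train_fn = theano.function([x, lr], cost, updates=updates)

# Decrease the rate stepwise over the epochs, from 0.1 down to 0.01:
for epoch in range(10):
    learning_rate = max(0.1 / (epoch + 1), 0.01)
    # call train_fn(X_batch, learning_rate) inside the mini-batch loop
```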

Task: Try to add a second hidden layer. What's the impact?
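
A sketch of the required change in plain Theano; the helper function and all layer sizes are assumptions, but the pattern (feed the output of the first hidden layer into the second) carries over directly to the example.

```python
import numpy as np
import theano
import theano.tensor as T

rng = np.random.RandomState(0)

def hidden_layer(inp, n_in, n_out):
    # One tanh layer with randomly initialized weights.
    W = theano.shared(rng.randn(n_in, n_out) * 0.01)
    b = theano.shared(np.zeros(n_out))
    return T.tanh(T.dot(inp, W) + b), [W, b]

x = T.dmatrix('x')
h1, params1 = hidden_layer(x, 784, 300)   # first hidden layer
h2, params2 = hidden_layer(h1, 300, 300)  # new second hidden layer
W_out = theano.shared(rng.randn(300, 10) * 0.01)
output = T.nnet.softmax(T.dot(h2, W_out))
params = params1 + params2 + [W_out]      # all parameters to train
```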

Understand the Lasagne Example

Task: Same as above, try different parameters (number of hidden units, learning rate, activation function, mini-batch size).

Task: Add a second hidden layer.
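
In Lasagne both tasks reduce to the layer-stacking step. The following sketch assumes the usual structure of a Lasagne MNIST network (flattened 784-dimensional input, a variable named network), not the exact provided code: insert one extra DenseLayer for the second hidden layer, and change nonlinearity to try other activations.

```python
from lasagne.layers import InputLayer, DenseLayer
from lasagne.nonlinearities import tanh, sigmoid, softmax

network = InputLayer(shape=(None, 784))
network = DenseLayer(network, num_units=300, nonlinearity=tanh)
network = DenseLayer(network, num_units=300, nonlinearity=tanh)  # second hidden layer
network = DenseLayer(network, num_units=10, nonlinearity=softmax)
# To try the sigmoid activation, pass nonlinearity=sigmoid to the hidden layers.
```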

