Recurrent neural networks

Import various modules that we need for this notebook (now using Keras 1.0.0)


In [12]:
%pylab inline

import copy

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# core Keras components (Keras 1.0 module paths)
from keras.datasets import imdb, reuters
from keras.models import Sequential
from keras.layers.core import Dense, Dropout, Activation, Flatten
from keras.optimizers import SGD, RMSprop
from keras.utils import np_utils
from keras.layers.convolutional import Convolution1D, MaxPooling1D, ZeroPadding1D, AveragePooling1D
from keras.callbacks import EarlyStopping
from keras.layers.normalization import BatchNormalization
from keras.preprocessing import sequence
from keras.layers.embeddings import Embedding

# word2vec embeddings via gensim
from gensim.models import word2vec


Populating the interactive namespace from numpy and matplotlib
WARNING: pylab import has clobbered these variables: ['copy']
`%matplotlib` prevents importing * from pylab and numpy


I. Example

We read in the IMDB dataset, keeping only the 500 most commonly occurring terms; reviews longer than 100 terms are discarded, and the remaining ones are zero-padded to a fixed length of 100.


In [2]:
(X_train, y_train), (X_test, y_test) = imdb.load_data(nb_words=500, maxlen=100, test_split=0.2)  # keep the top-500 words; drop reviews over 100 terms
X_train = sequence.pad_sequences(X_train, maxlen=100)  # zero-pad each review to length 100
X_test = sequence.pad_sequences(X_test, maxlen=100)
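
As a quick sanity check (a minimal sketch, assuming the download above succeeded), both padded arrays should now have exactly 100 columns:


In [3]:
print(X_train.shape)  # (num_train_reviews, 100) after padding
print(X_test.shape)   # (num_test_reviews, 100)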

Let's look at one sample from X_train and the first 10 elements of y_train. The codes give indices for the words in the vocabulary (unfortunately, we do not have access to the vocabulary itself for this dataset).
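
A cell along these lines (a minimal sketch; the particular indices and labels printed depend on the downloaded split) shows one encoded review and the first ten sentiment labels:


In [4]:
print(X_train[0])    # one review as a zero-padded vector of word indices
print(y_train[:10])  # binary sentiment labels (0 = negative, 1 = positive)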