Lab 6.3 - Improving the model

In this section of the lab, you will be asked to apply what you have learned to create an RNN model that can generate new sequences of text based on what it has learned from a large set of existing text. In this case we will use the full text of Lewis Carroll's Alice in Wonderland. Your tasks for this assignment are to:

  • format the book text into a set of training data
  • define an RNN model in Keras based on one or more LSTM or GRU layers
  • train the model with the training data
  • use the trained model to generate new text

Our previous model, trained on Obama's essay, was prone to over-fitting since there was not much data to learn from. As a result, the generated text was either unintelligible (not enough learning) or an exact replica of the training data (over-fitting). Here we are working with a much larger data set, which should provide enough data to avoid over-fitting, but will also take longer to train. To improve your model, you can experiment with tuning the following hyper-parameters:

  • Use more than one recurrent layer and/or add more memory units (hidden neurons) to each layer. This will allow you to learn more complex structures in the data.
  • Use sequences longer than 100 characters, which will allow the model to learn from patterns further back in time.
  • Change the way the sequences are generated. For example, you could split the text into real sentences at the periods, and then either cut or pad each sentence to make it exactly 100 characters long.
  • Increase the number of training epochs, which will give the model more time to learn. Monitor the validation loss at each epoch to make sure the model is still improving and is not over-fitting the training data.
  • Add more dropout to the recurrent layers to minimize over-fitting.
  • Tune the batch size - try a batch size of 1 as a (very slow) baseline and larger sizes from there.
  • Experiment with scale factors (temperature) when interpreting the prediction probabilities.
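
As a sketch of the sentence-based preparation suggested in the list above (assuming a naive period split and space padding; `sentence_sequences` is a hypothetical helper, not part of the assignment code):

```python
def sentence_sequences(text, length=100, pad_char=' '):
    # split on periods, then cut or pad each sentence to a fixed length
    sentences = [s.strip() for s in text.split('.') if s.strip()]
    fixed = []
    for s in sentences:
        if len(s) > length:
            fixed.append(s[:length])                 # cut long sentences
        else:
            fixed.append(s.ljust(length, pad_char))  # pad short ones
    return fixed

seqs = sentence_sequences("alice was tired. so she sat down by the bank.", length=20)
# every sequence now has exactly 20 characters
```

A real preprocessing pass would need to handle abbreviations and other periods that do not end sentences, but this is enough to experiment with the idea.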

If you get an error such as an alloc or out-of-memory error during training, it means that your computer does not have enough RAM to store the model parameters or the batch of training data needed during a training step. If you run into this issue, try reducing the complexity of your model (the number of layers and the number of units per layer) or the mini-batch size.
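
As a rough back-of-the-envelope check (using the dataset sizes that appear in the cells below), the full one-hot input array alone is about half a gigabyte:

```python
# estimate the memory footprint of the one-hot training tensor
n_sequences = 141166   # number of 100-character sequences (see the prep cell below)
seq_length = 100
n_vocab = 37
bytes_per_cell = 1     # a numpy bool occupies one byte

x_bytes = n_sequences * seq_length * n_vocab * bytes_per_cell
print("full X array: %.1f MB" % (x_bytes / 1e6))  # ~522.3 MB
```

Since the full X array is built up front in this lab, the mini-batch size mainly affects the extra activations held during a training step; shrinking the model or the batch reduces that per-step overhead.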

The last three code blocks will use your trained model to generate a sequence of text based on a predefined seed. Do not change any of the code, but run it before submitting your assignment. Your work will be evaluated based on the quality of the generated text. A good result should be legible, with decent grammar and spelling (indicating a high level of learning), but the exact text should not be found anywhere in the actual book (exact replication would indicate over-fitting).
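
A quick way to check the "not copied verbatim" criterion is to scan the continuation for chunks that appear word-for-word in the book. This is a minimal sketch; `is_memorized` is a hypothetical helper, and the strings here are toy stand-ins for the generated text and the book:

```python
def is_memorized(generated, source, seed_length=100, chunk=20):
    # scan the continuation (text after the seed) for verbatim chunks of the source
    new_text = generated[seed_length:]
    for i in range(len(new_text) - chunk + 1):
        if new_text[i:i + chunk] in source:
            return True
    return False

book = "the quick brown fox jumps over the lazy dog " * 20
fresh = book[:100] + "zq" * 30          # invented continuation
copied = book[:100] + book[100:160]     # continuation lifted from the book
print(is_memorized(fresh, book), is_memorized(copied, book))  # False True
```

Short overlaps are expected (common words and phrases), so the chunk length should be long enough that a match really does signal memorization.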

Let's start by importing the libraries we will be using, and importing the full text from Alice in Wonderland:


In [9]:
import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.layers import Dropout
from keras.layers import LSTM
from keras.callbacks import ModelCheckpoint
from keras.utils import np_utils

from time import gmtime, strftime
import os
import re
import pickle
import random
import sys

In [10]:
filename = "data/wonderland.txt"
raw_text = open(filename).read()

raw_text = re.sub('[^\nA-Za-z0-9 ,.:;?!-]+', '', raw_text)
raw_text = raw_text.lower()

n_chars = len(raw_text)
print "length of text:", n_chars
print "text preview:", raw_text[:500]


length of text: 141266
text preview: alices adventures in wonderland

lewis carroll

the millennium fulcrum edition 3.0




chapter i. down the rabbit-hole

alice was beginning to get very tired of sitting by her sister on the
bank, and of having nothing to do: once or twice she had peeped into the
book her sister was reading, but it had no pictures or conversations in
it, and what is the use of a book, thought alice without pictures or
conversations?

so she was considering in her own mind as well as she could, for the
hot day mad

In [11]:
# write your code here

# extract all unique characters in the text
chars = sorted(list(set(raw_text)))
n_vocab = len(chars)
print "number of unique characters found:", n_vocab

# create mapping of characters to integers and back
char_to_int = dict((c, i) for i, c in enumerate(chars))
int_to_char = dict((i, c) for i, c in enumerate(chars))

# test our mapping
print 'a', "- maps to ->", char_to_int["a"]
print 25, "- maps to ->", int_to_char[25]

# prepare the dataset of input to output pairs encoded as integers
seq_length = 100 #HYPER PARAMETER

inputs = []
outputs = []

for i in range(0, n_chars - seq_length, 1):
    inputs.append(raw_text[i:i + seq_length])
    outputs.append(raw_text[i + seq_length])
    
n_sequences = len(inputs)
print "Total sequences: ", n_sequences

#shuffle input and output data
indices = range(len(inputs))
random.shuffle(indices)

inputs = [inputs[x] for x in indices]
outputs = [outputs[x] for x in indices]

print inputs[0], "-->", outputs[0]


number of unique characters found: 37
a - maps to -> 11
25 - maps to -> o
Total sequences:  141166
it, he was
obliged to write with one finger for the rest of the day; and this was
of very little use --> ,

In [12]:
# create two empty numpy arrays with the proper dimensions
X = np.zeros((n_sequences, seq_length, n_vocab), dtype=np.bool)
y = np.zeros((n_sequences, n_vocab), dtype=np.bool)

# iterate over the data and build up the X and y data sets
# by setting the appropriate indices to 1 in each one-hot vector
for i, example in enumerate(inputs):
    for t, char in enumerate(example):
        X[i, t, char_to_int[char]] = 1
    y[i, char_to_int[outputs[i]]] = 1
    
print 'X dims -->', X.shape
print 'y dims -->', y.shape


X dims --> (141166, 100, 37)
y dims --> (141166, 37)

In [13]:
# define the LSTM model
model = Sequential()
model.add(LSTM(128, return_sequences=False, input_shape=(X.shape[1], X.shape[2])))
model.add(Dropout(0.50)) #HYPER PARAMETER
model.add(Dense(y.shape[1], activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam')
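
One way to act on the "more layers / more units" suggestion is to stack recurrent layers by returning full sequences from every layer except the last. This is a sketch of a possible deeper variant, not the required architecture; the layer sizes and dropout rates are arbitrary starting points:

```python
from keras.models import Sequential
from keras.layers import Dense, Dropout, LSTM

seq_length, n_vocab = 100, 37  # shapes from the data preparation above

deep = Sequential()
# return_sequences=True passes the per-timestep outputs on to the next LSTM
deep.add(LSTM(256, return_sequences=True, input_shape=(seq_length, n_vocab)))
deep.add(Dropout(0.3))
deep.add(LSTM(256))  # final recurrent layer returns only its last output
deep.add(Dropout(0.3))
deep.add(Dense(n_vocab, activation='softmax'))
deep.compile(loss='categorical_crossentropy', optimizer='adam')
```

A deeper model can capture more structure, but it also trains more slowly and over-fits more easily, so pair it with the validation-loss monitoring described above.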

In [14]:
filepath="a6-basic_LSTM.hdf5"
checkpoint = ModelCheckpoint(filepath, monitor='loss', verbose=0, save_best_only=True, mode='min')
callbacks_list = [checkpoint]

In [15]:
def sample(preds, temperature=1.0):
    preds = np.asarray(preds).astype('float64')
    preds = np.log(preds) / temperature
    exp_preds = np.exp(preds)
    preds = exp_preds / np.sum(exp_preds)
    probas = np.random.multinomial(1, preds, 1)
    return np.argmax(probas)
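
The `sample` function above rescales the predicted distribution before drawing from it. A standalone check of the same math (with a toy 3-way distribution) shows how temperature sharpens or flattens the probabilities:

```python
import numpy as np

def rescale(preds, temperature):
    # same math as sample(): log, divide by T, re-normalize with a softmax
    preds = np.log(np.asarray(preds, dtype='float64')) / temperature
    exp_preds = np.exp(preds)
    return exp_preds / np.sum(exp_preds)

p = [0.5, 0.3, 0.2]
cold = rescale(p, 0.2)  # T < 1: near-greedy, mass concentrates on the argmax
hot  = rescale(p, 2.0)  # T > 1: flatter, more diverse (and more error-prone)
```

With T=0.2 the leading probability rises above 0.9, while with T=2.0 the distribution moves toward uniform; this is why low diversity values in the training loop produce safer but more repetitive text.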

In [16]:
def generate(sentence, sample_length=50, diversity=0.35):
    generated = sentence
    sys.stdout.write(generated)

    for i in range(sample_length):
        x = np.zeros((1, X.shape[1], X.shape[2]))
        for t, char in enumerate(sentence):
            x[0, t, char_to_int[char]] = 1.

        preds = model.predict(x, verbose=0)[0]
        next_index = sample(preds, diversity)
        next_char = int_to_char[next_index]

        generated += next_char
        sentence = sentence[1:] + next_char

        sys.stdout.write(next_char)
        sys.stdout.flush()
    print

In [17]:
epochs = 30 #HYPER PARAMETER
prediction_length = 100

for iteration in range(epochs):
    
    print 'epoch:', iteration + 1, '/', epochs
    #HYPER PARAMETER
    model.fit(X, y, validation_split=0.2, batch_size=256, nb_epoch=1, callbacks=callbacks_list)
    
    # get random starting point for seed
    start_index = random.randint(0, len(raw_text) - seq_length - 1)
    # extract seed sequence from raw text
    seed = raw_text[start_index: start_index + seq_length]
    
    print '----- generating with seed:', seed
    
    for diversity in [0.5, 1.2]:
        generate(seed, prediction_length, diversity)


epoch: 1 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 365s - loss: 2.7806 - val_loss: 2.3790
----- generating with seed: room! no room! they cried out when they saw alice
coming. theres plenty of room! said alice indignan
room! no room! they cried out when they saw alice
coming. theres plenty of room! said alice indignand woun he the che to the soure the he the whe aid the f rouee sat al  is the we the the the she al i
room! no room! they cried out when they saw alice
coming. theres plenty of room! said alice indignand thobudde ho seiligb
la-ow lhee hoiborigg, ay,th thind ystin. ad iato ! sal. oug d olwatroik tit in
epoch: 2 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 364s - loss: 2.3212 - val_loss: 2.1611
----- generating with seed: nd half believed herself in
wonderland, though she knew she had but to open them again, and all
woul
nd half believed herself in
wonderland, though she knew she had but to open them again, and all
woul the soudt worr and the soud on gar and the sate ar as ing the dart sous he the pid in the wat lout 
nd half believed herself in
wonderland, though she knew she had but to open them again, and all
woulu teal! sne

igtao, shende sasreny, smoul  ora
yep heafle

ordanlf on hh canp,
mhir pfoime, suic.
ar
epoch: 3 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 369s - loss: 2.1802 - val_loss: 2.0486
----- generating with seed: ery politely, feeling quite
pleased to have got into a conversation.

you dont know much, said the d
ery politely, feeling quite
pleased to have got into a conversation.

you dont know much, said the dound, in the buts bect and theuling on, and allice on it an and in it said the ance thed souther was
ery politely, feeling quite
pleased to have got into a conversation.

you dont know much, said the dfupledp!e
!o it pefrnowth; aidmeod and anoi!
k-io.
the alevradn. llyeveun.
 iu
wamlidg terof,
soik, 
epoch: 4 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 375s - loss: 2.0904 - val_loss: 1.9692
----- generating with seed: th some curiosity. what a
funny watch! she remarked. it tells the day of the month, and doesnt
tell 
th some curiosity. what a
funny watch! she remarked. it tells the day of the month, and doesnt
tell of the hertering to the morelll you nount very at inde wether the what in the to ken. the doryound s
th some curiosity. what a
funny watch! she remarked. it tells the day of the month, and doesnt
tell otl rafhy, soiss: but grinf rudtisp, in hht hevall.

bovereg sten-elad.

so the sanen, folist afrygn
epoch: 5 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 366s - loss: 2.0215 - val_loss: 1.9080
----- generating with seed: nt on, --found
it advisable to go with edgar atheling to meet william and offer him the
crown. willi
nt on, --found
it advisable to go with edgar atheling to meet william and offer him the
crown. willing out said the that at it wated and the pore and sating aid and all the grothe wertone: en the dont
nt on, --found
it advisable to go with edgar atheling to meet william and offer him the
crown. willie?

berissids there! fhered the destrss.

lavetlich sace mashe mas hing? the vice oung tous, s if jr
epoch: 6 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 375s - loss: 1.9677 - val_loss: 1.8617
----- generating with seed: s right, five! always lay the
blame on others!

youd better not talk! said five. i heard the queen s
s right, five! always lay the
blame on others!

youd better not talk! said five. i heard the queen she manked alice and winl the whith and the could the mades the fort the call and what the herselligh
s right, five! always lay the
blame on others!

youd better not talk! said five. i heard the queen sarpifes
mf mill
gah itco. oum.

perd! 
afd you, the tusey upple,
he wrupdon, think upole at, and cas
epoch: 7 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 364s - loss: 1.9215 - val_loss: 1.8168
----- generating with seed:  wanted much to know,
but the dodo had paused as if it thought that somebody ought to speak,
and no 
 wanted much to know,
but the dodo had paused as if it thought that somebody ought to speak,
and no mere the looked the sant of the rither in to said the was rately the growh her alice was to the crab
 wanted much to know,
but the dodo had paused as if it thought that somebody ought to speak,
and no kert, thes, the gorys, she busnwes she treant
ice uay vore thabe abecill, mas she don  louny viry es
epoch: 8 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 363s - loss: 1.8854 - val_loss: 1.7850
----- generating with seed: ngs may be different, said alice; all i know
is, it would feel very queer to me.

you! said the cate
ngs may be different, said alice; all i know
is, it would feel very queer to me.

you! said the caterreace.

for the catter the raster to nevery in a fertist the mome the had and then out it was a lit
ngs may be different, said alice; all i know
is, it would feel very queer to me.

you! said the caterfof, i woned a fenghy yous sade; thius, he duct: the  mmtesuring to thet hant, whni s, ibtteeld bon
epoch: 9 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 373s - loss: 1.8524 - val_loss: 1.7549
----- generating with seed:  of the court.

all this time the queen had never left off staring at the hatter, and,
just as the d
 of the court.

all this time the queen had never left off staring at the hatter, and,
just as the duchess the griching it sintter be the to sead the firts sied the catsredone the more thing to the sa
 of the court.

all this time the queen had never left off staring at the hatter, and,
just as the dniter,
suid you wange, you jery. i
cherpehs the lame cained insw--op efots-arzilved, alide heidn, th
epoch: 10 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 363s - loss: 1.8214 - val_loss: 1.7279
----- generating with seed: me dead
leaves that had fluttered down from the trees upon her face.

wake up, alice dear! said her 
me dead
leaves that had fluttered down from the trees upon her face.

wake up, alice dear! said her all the king as the little save the wat the hame here the forre a the pary of the said the docs it w
me dead
leaves that had fluttered down from the trees upon her face.

wake up, alice dear! said her comkly uncornover the- i demus, the gat, thermew!

o foor!

c mo
che ather in an leas, as all
youdju
epoch: 11 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 373s - loss: 1.7980 - val_loss: 1.7078
----- generating with seed: does the boots and shoes! she repeated
in a wondering tone.

why, what are your shoes done with? sai
does the boots and shoes! she repeated
in a wondering tone.

why, what are your shoes done with? said the said to the dithers in the were to ive to get of see it the rook of she had and said the histe
does the boots and shoes! she repeated
in a wondering tone.

why, what are your shoes done with? said alice bother, and do wattny nitem, lous sheelly kning gatlesinits by,
remparding in tho begam very
epoch: 12 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 371s - loss: 1.7707 - val_loss: 1.6872
----- generating with seed: ke the look
of things at all, as the game was in such confusion that she never knew
whether it was h
ke the look
of things at all, as the game was in such confusion that she never knew
whether it was her head not and began in a mory the a dowf to be of the his as she began and not gat for the harse s
ke the look
of things at all, as the game was in such confusion that she never knew
whether it was herpee.

oh aroute!,
he
plisn; tnow! hal--appeos.

elleden as the listoush. sat delt, if the leas eas
epoch: 13 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 367s - loss: 1.7494 - val_loss: 1.6753
----- generating with seed: tes she heard
a voice outside, and stopped to listen.

mary ann! mary ann! said the voice. fetch me 
tes she heard
a voice outside, and stopped to listen.

mary ann! mary ann! said the voice. fetch me the hooken it wat like a down and as the coulf wat ligel that so so were a dout not the bous in the 
tes she heard
a voice outside, and stopped to listen.

mary ann! mary ann! said the voice. fetch me negring little reat stes
;o toat.

  w--whes, mot neyopono ad mace at ohk
yeminustsekus timn ad whom
epoch: 14 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 364s - loss: 1.7325 - val_loss: 1.6563
----- generating with seed: alice sharply, for she was beginning to
feel a little worried.

just about as much right, said the d
alice sharply, for she was beginning to
feel a little worried.

just about as much right, said the door of the wither alace.

of the didnent was the door alice said to the king to said the haster of t
alice sharply, for she was beginning to
feel a little worried.

just about as much right, said the dothet hade, the rusther beot fatten-uce all stimt guoped inxe hxaven, ims anoule the omite whster yo
epoch: 15 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 372s - loss: 1.7067 - val_loss: 1.6419
----- generating with seed: boots every christmas.

and she went on planning to herself how she would manage it. they must
go by
boots every christmas.

and she went on planning to herself how she would manage it. they must
go by the wanting and she for and thong the queen of the would not it was the moust in the march as all a
boots every christmas.

and she went on planning to herself how she would manage it. they must
go bytele agehtsund; esgoo, sheow! that as, was troee an
yup. is-witheeted, blyen ik theie! suid they man
epoch: 16 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 363s - loss: 1.6925 - val_loss: 1.6352
----- generating with seed:  voice, thats bill, thought
alice, well, i hardly know--no more, thank ye; im better now--but im
a d
 voice, thats bill, thought
alice, well, i hardly know--no more, thank ye; im better now--but im
a doon it was she said the irther peating to see beant of the very there a little she went in a the and
 voice, thats bill, thought
alice, well, i hardly know--no more, thank ye; im better now--but im
a dight at all cumining poommeyoh, saed hi know, whiks teall,
hacp, the mrce one:eedf
ind; curauning to
epoch: 17 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 363s - loss: 1.6744 - val_loss: 1.6227
----- generating with seed: ous as it can be, said the gryphon.

it all came different! the mock turtle repeated thoughtfully. i
ous as it can be, said the gryphon.

it all came different! the mock turtle repeated thoughtfully. it nead it said alice sead couldne well the say have a donn she was soon whote it the duchess as she 
ous as it can be, said the gryphon.

it all came different! the mock turtle repeated thoughtfully. ies everquise
leusey no
leorif affursing.

when all the beconted to the orgo fon one..

lorily thim,,
epoch: 18 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 365s - loss: 1.6599 - val_loss: 1.6064
----- generating with seed: of the suppressed guinea-pigs,
filled the air, mixed up with the distant sobs of the miserable mock

of the suppressed guinea-pigs,
filled the air, mixed up with the distant sobs of the miserable mock
torther said the dono asing to ting said it and the dorat it to like thing the dormouse was said, an
of the suppressed guinea-pigs,
filled the air, mixed up with the distant sobs of the miserable mock
to-sanving they someictself thing lawls,
every, arl all ple fard! as the saice, what a dowant!

she 
epoch: 19 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 365s - loss: 1.6429 - val_loss: 1.5985
----- generating with seed: ain. in
a minute or two the caterpillar took the hookah out of its mouth
and yawned once or twice, a
ain. in
a minute or two the caterpillar took the hookah out of its mouth
and yawned once or twice, and could hear alack the whate as of that athers, and she think i wonter ad it head look of the sand,
ain. in
a minute or two the caterpillar took the hookah out of its mouth
and yawned once or twice, and the hardly sand were on.-!, was duan-twally: tuph ateamicunby sbow! donhed king fince sging.

her
epoch: 20 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 367s - loss: 1.6301 - val_loss: 1.5919
----- generating with seed:  poor speaker, said the king.

here one of the guinea-pigs cheered, and was immediately suppressed b
 poor speaker, said the king.

here one of the guinea-pigs cheered, and was immediately suppressed be of the caterpiling
the farth here hard near so the parss, she was domont began to hard the suches 
 poor speaker, said the king.

here one of the guinea-pigs cheered, and was immediately suppressed buen iv to so
yim waine shall nool utpere to, said the queen; and os getows
ffom in a aung
pig hs
pre
epoch: 21 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 363s - loss: 1.6175 - val_loss: 1.5820
----- generating with seed: e look
of things at all, as the game was in such confusion that she never knew
whether it was her tu
e look
of things at all, as the game was in such confusion that she never knew
whether it was her turting her hard doon in the time to gat the duchess was a little said and reast better i the puchers 
e look
of things at all, as the game was in such confusion that she never knew
whether it was her turts.

see in
a thing-asniwed as hoteekf! fom thoogs timh! to rebeat ster-aed conlle in ohen stabbed,
epoch: 22 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 367s - loss: 1.6065 - val_loss: 1.5760
----- generating with seed:  all in bed!
on various pretexts they all moved off, and alice was soon left alone.

i wish i hadnt 
 all in bed!
on various pretexts they all moved off, and alice was soon left alone.

i wish i hadnt said gaid the dorup, she said to herself, and the mouse was could not gen the mouse was and the sorm
 all in bed!
on various pretexts they all moved off, and alice was soon left alone.

i wish i hadnt ie a sosehass cad talking  etpint abprintsed prigped in i set wert, but ee; thepertall to all quite 
epoch: 23 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 367s - loss: 1.5929 - val_loss: 1.5683
----- generating with seed: lice was beginning to get very tired of sitting by her sister on the
bank, and of having nothing to 
lice was beginning to get very tired of sitting by her sister on the
bank, and of having nothing to me a canterd, and was the mouse and is would the jury with a mush her off the morse, and the mouse s
lice was beginning to get very tired of sitting by her sister on the
bank, and of having nothing to ghe pellied a durciley dun. whly dopn? quite would tems.

       and gave
    hex touthd juss two wi
epoch: 24 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 364s - loss: 1.5818 - val_loss: 1.5633
----- generating with seed: ught; and how funny itll seem, sending
presents to ones own feet! and how odd the directions will lo
ught; and how funny itll seem, sending
presents to ones own feet! and how odd the directions will low, said the duchess.

when alice thenesterst this a mouse to the court would be it lever hear the co
ught; and how funny itll seem, sending
presents to ones own feet! and how odd the directions will lookibgo ther, said the kinclitg.

the whate
quist on i gone
be to jeat litklay then pry, and snow! th
epoch: 25 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 372s - loss: 1.5681 - val_loss: 1.5590
----- generating with seed: raphy: then drawling--the drawling-master was an old conger-eel,
that used to come once a week: he t
raphy: then drawling--the drawling-master was an old conger-eel,
that used to come once a week: he tond time so the tone and the wither all the hatter; and the got would the belint seemed to to could,
raphy: then drawling--the drawling-master was an old conger-eel,
that used to come once a week: he tous bew
 f hereld on evary,
sed in lee aldiden wouted, thenrilins undonund ot do rauch!
 itsleen bno
epoch: 26 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 372s - loss: 1.5605 - val_loss: 1.5552
----- generating with seed: i never thought about it, said alice. why?

it does the boots and shoes. the gryphon replied very so
i never thought about it, said alice. why?

it does the boots and shoes. the gryphon replied very soon. and the mechention the jury of hear, and will it was a little seed i made on the reasing to the 
i never thought about it, said alice. why?

it does the boots and shoes. the gryphon replied very sonn!

as hnow prrhidn you sif; way whith yinound enawy. spee, tales ay
i sound cullo. which was siss 
epoch: 27 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 369s - loss: 1.5487 - val_loss: 1.5504
----- generating with seed: the cat.

i said pig, replied alice; and i wish you wouldnt keep appearing and
vanishing so suddenly
the cat.

i said pig, replied alice; and i wish you wouldnt keep appearing and
vanishing so suddenly how a pact of at the morse wor ont in whet it went on and a mancher, said the gottonself as which s
the cat.

i said pig, replied alice; and i wish you wouldnt keep appearing and
vanishing so suddenly.

the whith round anmthere, and soon the loors woudds
arreeding with way them, ask olouto the ora!

epoch: 28 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 376s - loss: 1.5409 - val_loss: 1.5499
----- generating with seed: said the lory, with a shiver.

i beg your pardon! said the mouse, frowning, but very politely: did
y
said the lory, with a shiver.

i beg your pardon! said the mouse, frowning, but very politely: did
you mante of the tore, and the were and looked of thats she was she sool do to be the room, and the w
said the lory, with a shiver.

i beg your pardon! said the mouse, frowning, but very politely: did
you, puctredenten like sabousing
and a elk among incalice fruloof! to kus am ut onre myone-
your, as 
epoch: 29 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 366s - loss: 1.5292 - val_loss: 1.5453
----- generating with seed: g come up again, dear! i
shall only look up and say who am i then? tell me that first, and then,
if 
g come up again, dear! i
shall only look up and say who am i then? tell me that first, and then,
if you mouse that it was of the enter.

the patter in a to got to now, what i wintire to every. all she
g come up again, dear! i
shall only look up and say who am i then? tell me that first, and then,
if i pllasnas withe, manter sherell
mo the ativer.
waw a ound hampelffidco, said the frystance,

by the
epoch: 30 / 30
Train on 112932 samples, validate on 28234 samples
Epoch 1/1
112932/112932 [==============================] - 375s - loss: 1.5205 - val_loss: 1.5449
----- generating with seed: ht be some sense in your knocking, the footman went on
without attending to her, if we had the door 
ht be some sense in your knocking, the footman went on
without attending to her, if we had the door all have and the cat diden, it in a set she took the louse, but that it fall things of she was so ne
ht be some sense in your knocking, the footman went on
without attending to her, if we had the door a list,, bus mimeff soas began hind that was noviced for and hid favel arieven! the
kine, and, on, s

Do not change this code, but run it before submitting your assignment to generate the results


In [18]:
def sample(preds, temperature=1.0):
    preds = np.asarray(preds).astype('float64')
    preds = np.log(preds) / temperature
    exp_preds = np.exp(preds)
    preds = exp_preds / np.sum(exp_preds)
    probas = np.random.multinomial(1, preds, 1)
    return np.argmax(probas)

In [19]:
def generate(sentence, sample_length=50, diversity=0.35):
    generated = sentence
    sys.stdout.write(generated)

    for i in range(sample_length):
        x = np.zeros((1, X.shape[1], X.shape[2]))
        for t, char in enumerate(sentence):
            x[0, t, char_to_int[char]] = 1.

        preds = model.predict(x, verbose=0)[0]
        next_index = sample(preds, diversity)
        next_char = int_to_char[next_index]

        generated += next_char
        sentence = sentence[1:] + next_char

        sys.stdout.write(next_char)
        sys.stdout.flush()
    print

In [20]:
prediction_length = 500
seed = "this time alice waited patiently until it chose to speak again. in a minute or two the caterpillar t"

generate(seed, prediction_length, .50)


this time alice waited patiently until it chose to speak again. in a minute or two the caterpillar tone. i was this it was bestself ho to see would said and the
ghented without hall was a look as it was that
dorsat the cat she was going in a large, said the caterpillar said and cours leaged on the came the said the hister somethoug, and the caterpillar minentend the round the cook to the astenst to me dore the dot ot the dormouse becanting and said to alice, as it was to the cat of the sagain, and she was only as the dormouse said to the caterpillar, but the rome to me she would mose on the gr