TV Script Generation

In this project, you'll generate your own Simpsons TV scripts using RNNs. You'll be using part of the Simpsons dataset of scripts from 27 seasons. The Neural Network you'll build will generate a new TV script for a scene at Moe's Tavern.

Get the Data

The data is already provided for you. You'll be using a subset of the original dataset, consisting only of the scenes in Moe's Tavern. This doesn't include other versions of the tavern, like "Moe's Cavern", "Flaming Moe's", "Uncle Moe's Family Feed-Bag", etc.


In [7]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import helper

data_dir = './data/simpsons/moes_tavern_lines.txt'
text = helper.load_data(data_dir)
# Ignore notice, since we don't use it for analysing the data
text = text[81:]

Explore the Data

Play around with view_sentence_range to view different parts of the data.


In [8]:
view_sentence_range = (0, 10)

"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import numpy as np

print('Dataset Stats')
print('Roughly the number of unique words: {}'.format(len({word: None for word in text.split()})))
scenes = text.split('\n\n')
print('Number of scenes: {}'.format(len(scenes)))
sentence_count_scene = [scene.count('\n') for scene in scenes]
print('Average number of sentences in each scene: {}'.format(np.average(sentence_count_scene)))

sentences = [sentence for scene in scenes for sentence in scene.split('\n')]
print('Number of lines: {}'.format(len(sentences)))
word_count_sentence = [len(sentence.split()) for sentence in sentences]
print('Average number of words in each line: {}'.format(np.average(word_count_sentence)))

print()
print('The sentences {} to {}:'.format(*view_sentence_range))
print('\n'.join(text.split('\n')[view_sentence_range[0]:view_sentence_range[1]]))


Dataset Stats
Roughly the number of unique words: 11492
Number of scenes: 262
Average number of sentences in each scene: 15.248091603053435
Number of lines: 4257
Average number of words in each line: 11.50434578341555

The sentences 0 to 10:
Moe_Szyslak: (INTO PHONE) Moe's Tavern. Where the elite meet to drink.
Bart_Simpson: Eh, yeah, hello, is Mike there? Last name, Rotch.
Moe_Szyslak: (INTO PHONE) Hold on, I'll check. (TO BARFLIES) Mike Rotch. Mike Rotch. Hey, has anybody seen Mike Rotch, lately?
Moe_Szyslak: (INTO PHONE) Listen you little puke. One of these days I'm gonna catch you, and I'm gonna carve my name on your back with an ice pick.
Moe_Szyslak: What's the matter Homer? You're not your normal effervescent self.
Homer_Simpson: I got my problems, Moe. Give me another one.
Moe_Szyslak: Homer, hey, you should not drink to forget your problems.
Barney_Gumble: Yeah, you should only drink to enhance your social skills.


Implement Preprocessing Functions

The first thing to do to any dataset is preprocessing. Implement the following preprocessing functions below:

  • Lookup Table
  • Tokenize Punctuation

Lookup Table

To create a word embedding, you first need to transform the words to ids. In this function, create two dictionaries:

  • Dictionary to go from the words to an id, which we'll call vocab_to_int
  • Dictionary to go from the id to word, which we'll call int_to_vocab

Return these dictionaries in the following tuple (vocab_to_int, int_to_vocab)


In [9]:
import numpy as np
import problem_unittests as tests

def create_lookup_tables(text):
    """
    Create lookup tables for vocabulary
    :param text: The text of tv scripts split into words
    :return: A tuple of dicts (vocab_to_int, int_to_vocab)
    """
    textset = set(text)
    vocab_to_int = {word: i for i, word in enumerate(textset)}
    int_to_vocab = {i: word for i, word in enumerate(textset)}
    return vocab_to_int, int_to_vocab


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_create_lookup_tables(create_lookup_tables)


Tests Passed
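
As a quick usage check, ids should round-trip back to words (a hypothetical toy input; the real call receives the tokenized script):

v2i, i2v = create_lookup_tables(['moe', 'opens', 'the', 'tavern'])
word_id = v2i['tavern']
assert i2v[word_id] == 'tavern'  # ids map back to the words they came from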

Tokenize Punctuation

We'll be splitting the script into a word array using spaces as delimiters. However, punctuation such as periods and exclamation marks makes it hard for the neural network to distinguish between the word "bye" and "bye!".

Implement the function token_lookup to return a dict that will be used to tokenize symbols like "!" into "||Exclamation_Mark||". Create a dictionary for the following symbols, where the symbol is the key and the value is the token:

  • Period ( . )
  • Comma ( , )
  • Quotation Mark ( " )
  • Semicolon ( ; )
  • Exclamation mark ( ! )
  • Question mark ( ? )
  • Left Parenthesis ( ( )
  • Right Parenthesis ( ) )
  • Dash ( -- )
  • Return ( \n )

This dictionary will be used to tokenize the symbols and add a delimiter (space) around them. This separates each symbol into its own word, making it easier for the neural network to predict the next word. Make sure you don't use a token that could be confused with a word. Instead of using the token "dash", try using something like "||dash||".


In [10]:
def token_lookup():
    """
    Generate a dict to turn punctuation into a token.
    :return: Tokenize dictionary where the key is the punctuation and the value is the token
    """
    return { '.' : '||period||',
             ',' : '||comma||',
             '"' : '||quotes||',
             ';' : '||semicolon||',
             '!' : '||exclamation_mark||',
             '?' : '||question_mark||',
             '(' : '||lparen||',
             ')' : '||rparen||',
             '--' : '||dash||',
             '\n' : '||return||'}

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_tokenize(token_lookup)


Tests Passed
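
The notebook's helper applies this dict during preprocessing; here is a minimal sketch of what that application plausibly looks like (the replacement order and the lowercasing are assumptions, not read from helper.py):

tokens = token_lookup()
sample = "Moe_Szyslak: (INTO PHONE) Moe's Tavern."
for symbol, token in tokens.items():
    # Pad each token with spaces so it splits off as its own word
    sample = sample.replace(symbol, ' {} '.format(token))
print(sample.lower().split())
# e.g. ['moe_szyslak:', '||lparen||', 'into', 'phone', '||rparen||', "moe's", 'tavern', '||period||']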

Preprocess all the data and save it

Running the code cell below will preprocess all the data and save it to file.


In [11]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
# Preprocess Training, Validation, and Testing Data
helper.preprocess_and_save_data(data_dir, token_lookup, create_lookup_tables)

Check Point

This is your first checkpoint. If you ever decide to come back to this notebook or have to restart the notebook, you can start from here. The preprocessed data has been saved to disk.


In [12]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import helper
import numpy as np
import problem_unittests as tests

int_text, vocab_to_int, int_to_vocab, token_dict = helper.load_preprocess()

Build the Neural Network

You'll build the components necessary to build an RNN by implementing the following functions below:

  • get_inputs
  • get_init_cell
  • get_embed
  • build_rnn
  • build_nn
  • get_batches

Check the Version of TensorFlow and Access to GPU


In [13]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
from distutils.version import LooseVersion
import warnings
import tensorflow as tf

# Check TensorFlow Version
assert LooseVersion(tf.__version__) >= LooseVersion('1.0'), 'Please use TensorFlow version 1.0 or newer'
print('TensorFlow Version: {}'.format(tf.__version__))

# Check for a GPU
if not tf.test.gpu_device_name():
    warnings.warn('No GPU found. Please use a GPU to train your neural network.')
else:
    print('Default GPU Device: {}'.format(tf.test.gpu_device_name()))


TensorFlow Version: 1.0.0
Default GPU Device: /gpu:0

Input

Implement the get_inputs() function to create TF Placeholders for the Neural Network. It should create the following placeholders:

  • Input text placeholder named "input" using the TF Placeholder name parameter.
  • Targets placeholder
  • Learning Rate placeholder

Return the placeholders in the following tuple (Input, Targets, LearningRate)


In [14]:
def get_inputs():
    """
    Create TF Placeholders for input, targets, and learning rate.
    :return: Tuple (input, targets, learning rate)
    """
    Input = tf.placeholder(tf.int32, shape=[None, None], name="input")
    Targets = tf.placeholder(tf.int32, shape=[None, None])
    LearningRate = tf.placeholder(tf.float32)
    return Input, Targets, LearningRate


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_get_inputs(get_inputs)


Tests Passed
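
The name="input" matters later: once the trained graph is saved, the generation code can recover this placeholder from the loaded graph by name. A sketch, assuming save_dir (defined further down) points at a saved checkpoint:

loaded_graph = tf.Graph()
with tf.Session(graph=loaded_graph) as sess:
    # Load the saved meta graph and restore the trained weights
    loader = tf.train.import_meta_graph(save_dir + '.meta')
    loader.restore(sess, save_dir)
    # Recover the placeholder by the name given above
    input_tensor = loaded_graph.get_tensor_by_name('input:0')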

Build RNN Cell and Initialize

Stack one or more BasicLSTMCells in a MultiRNNCell.

  • The RNN size should be set using rnn_size
  • Initialize the cell state using the MultiRNNCell's zero_state() function
    • Apply the name "initial_state" to the initial state using tf.identity()

Return the cell and initial state in the following tuple (Cell, InitialState)


In [32]:
def get_init_cell(batch_size, rnn_size):
    """
    Create an RNN Cell and initialize it.
    :param batch_size: Size of batches
    :param rnn_size: Size of RNNs
    :return: Tuple (cell, initial state)
    """
    num_layers = 1
    keep_prob = 1.0
    # Build a fresh LSTM cell per layer: reusing a single instance
    # ([cell] * num_layers) would share weights across the layers.
    Cell = tf.contrib.rnn.MultiRNNCell(
        [tf.contrib.rnn.DropoutWrapper(tf.contrib.rnn.BasicLSTMCell(rnn_size),
                                       output_keep_prob=keep_prob)
         for _ in range(num_layers)])
    InitialState = Cell.zero_state(batch_size, tf.float32)
    InitialState = tf.identity(InitialState, name="initial_state")
    return Cell, InitialState


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_get_init_cell(get_init_cell)


Tests Passed
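
Note that tf.identity can only name a single tensor, so TensorFlow packs the MultiRNNCell's state tuple into one stacked tensor when it is passed in. A quick shape check with toy values (the leading 1 is num_layers, the 2 is the LSTM's cell/hidden pair):

tf.reset_default_graph()
demo_cell, demo_state = get_init_cell(batch_size=4, rnn_size=8)
print(demo_state.get_shape())  # (1, 2, 4, 8): layers x (c, h) x batch x rnn_size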

Word Embedding

Apply embedding to input_data using TensorFlow. Return the embedded sequence.


In [16]:
def get_embed(input_data, vocab_size, embed_dim):
    """
    Create embedding for <input_data>.
    :param input_data: TF placeholder for text input.
    :param vocab_size: Number of words in vocabulary.
    :param embed_dim: Number of embedding dimensions
    :return: Embedded input.
    """
    embedding = tf.Variable(tf.random_uniform((vocab_size, embed_dim)))
    embed = tf.nn.embedding_lookup(embedding, input_data)
    return embed


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_get_embed(get_embed)


Tests Passed
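
A quick shape check with hypothetical sizes: the lookup turns an int32 tensor of word ids with shape [batch, seq] into a float tensor of shape [batch, seq, embed_dim]:

tf.reset_default_graph()
word_ids = tf.placeholder(tf.int32, [None, None])
embedded = get_embed(word_ids, vocab_size=100, embed_dim=32)
print(embedded.get_shape())  # (?, ?, 32)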

Build RNN

You created an RNN Cell in the get_init_cell() function. Time to use that cell to create an RNN.

Return the outputs and final state in the following tuple (Outputs, FinalState)


In [17]:
def build_rnn(cell, inputs):
    """
    Create a RNN using a RNN Cell
    :param cell: RNN Cell
    :param inputs: Input text data
    :return: Tuple (Outputs, Final State)
    """
    Outputs, FinalState = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)
    FinalState = tf.identity(FinalState, name="final_state")
    return Outputs, FinalState


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_build_rnn(build_rnn)


Tests Passed

Build the Neural Network

Apply the functions you implemented above to:

  • Apply embedding to input_data using your get_embed(input_data, vocab_size, embed_dim) function.
  • Build RNN using cell and your build_rnn(cell, inputs) function.
  • Apply a fully connected layer with a linear activation and vocab_size as the number of outputs.

Return the logits and final state in the following tuple (Logits, FinalState)


In [18]:
def build_nn(cell, rnn_size, input_data, vocab_size, embed_dim):
    """
    Build part of the neural network
    :param cell: RNN cell
    :param rnn_size: Size of rnns
    :param input_data: Input data
    :param vocab_size: Vocabulary size
    :param embed_dim: Number of embedding dimensions
    :return: Tuple (Logits, FinalState)
    """
    embed_input = get_embed(input_data, vocab_size, embed_dim)
    rnn_output, FinalState = build_rnn(cell, embed_input)

    Logits = tf.contrib.layers.fully_connected(
        inputs=rnn_output,
        num_outputs=vocab_size,
        activation_fn=None,  # linear activation
        weights_initializer=tf.truncated_normal_initializer(stddev=0.1))

    return Logits, FinalState


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_build_nn(build_nn)


Tests Passed

Batches

Implement get_batches to create batches of input and targets using int_text. The batches should be a Numpy array with the shape (number of batches, 2, batch size, sequence length). Each batch contains two elements:

  • The first element is a single batch of input with the shape [batch size, sequence length]
  • The second element is a single batch of targets with the shape [batch size, sequence length]

If you can't fill the last batch with enough data, drop the last batch.

For example, get_batches([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20], 3, 2) would return a Numpy array of the following:

[
  # First Batch
  [
    # Batch of Input
    [[ 1  2], [ 7  8], [13 14]]
    # Batch of targets
    [[ 2  3], [ 8  9], [14 15]]
  ]

  # Second Batch
  [
    # Batch of Input
    [[ 3  4], [ 9 10], [15 16]]
    # Batch of targets
    [[ 4  5], [10 11], [16 17]]
  ]

  # Third Batch
  [
    # Batch of Input
    [[ 5  6], [11 12], [17 18]]
    # Batch of targets
    [[ 6  7], [12 13], [18  1]]
  ]
]

Notice that the last target value in the last batch is the first input value of the first batch (in this case, 1). Wrapping the targets around like this is a common technique when creating sequence batches, even though it looks unintuitive at first.
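
The loop-based implementation below builds this layout index by index. For intuition, the same layout can also be produced with a reshape and a transpose; a minimal numpy sketch (get_batches_vectorized is a hypothetical name, not part of the project):

import numpy as np

def get_batches_vectorized(int_text, batch_size, seq_length):
    n_batches = len(int_text) // (batch_size * seq_length)
    arr = np.array(int_text[:n_batches * batch_size * seq_length])
    # Targets are the inputs shifted left by one, wrapping to the first word
    tgt = np.roll(arr, -1)
    # Consecutive sequences are dealt across batches: sequence s lands in
    # batch s % n_batches at row s // n_batches
    inputs = arr.reshape(batch_size, n_batches, seq_length).transpose(1, 0, 2)
    targets = tgt.reshape(batch_size, n_batches, seq_length).transpose(1, 0, 2)
    return np.stack([inputs, targets], axis=1)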


In [19]:
def get_batches(int_text, batch_size, seq_length):
    """
    Return batches of input and target
    :param int_text: Text with the words replaced by their ids
    :param batch_size: The size of batch
    :param seq_length: The length of sequence
    :return: Batches as a Numpy array
    """
    words_per_batch = batch_size * seq_length
    nbatches = len(int_text) // words_per_batch
    int_text = int_text[:nbatches * words_per_batch]

    # Not the most pythonic approach, but filling the array index by index
    # made it easy to get the batch layout right on the first try.
    batches = np.zeros((nbatches, 2, batch_size, seq_length), dtype=np.int32)
    batchno = 0
    posinbatch = 0
    for i in range(0, len(int_text), seq_length):
        for j in range(seq_length):
            batches[batchno][0][posinbatch][j] = int_text[i + j]
            batches[batchno][1][posinbatch][j] = int_text[(i + j + 1) % len(int_text)]
            
        batchno += 1
        if batchno == nbatches:
            batchno = 0
            posinbatch += 1
    
    return batches


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_get_batches(get_batches)


Tests Passed
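
A quick sanity check against the worked example above:

example = get_batches(list(range(1, 21)), 3, 2)
print(example.shape)       # (3, 2, 3, 2)
print(example[0][0])       # [[ 1  2] [ 7  8] [13 14]]
print(example[-1][1][-1])  # [18  1] -- the target wraps to the first input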

Neural Network Training

Hyperparameters

Tune the following parameters:

  • Set num_epochs to the number of epochs.
  • Set batch_size to the batch size.
  • Set rnn_size to the size of the RNNs.
  • Set embed_dim to the size of the embedding.
  • Set seq_length to the length of sequence.
  • Set learning_rate to the learning rate.
  • Set show_every_n_batches to the number of batches the neural network should print progress.

In [52]:
# Number of Epochs
num_epochs = 200
# Batch Size
batch_size = 512
# RNN Size
rnn_size = 1024
# Embedding Dimension Size
embed_dim = 512
# Sequence Length
seq_length = 16
# Learning Rate
learning_rate = 0.01
# Show stats for every n number of batches
show_every_n_batches = 1

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
save_dir = './save'
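
With the settings above, one pass over the data contains len(int_text) // (batch_size * seq_length) batches; the training log below reports batches as .../8, which matches this calculation for the Moe's Tavern subset:

print(len(int_text) // (batch_size * seq_length))  # 8 batches per epoch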

Build the Graph

Build the graph using the neural network you implemented.


In [53]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
from tensorflow.contrib import seq2seq

train_graph = tf.Graph()
with train_graph.as_default():
    vocab_size = len(int_to_vocab)
    input_text, targets, lr = get_inputs()
    input_data_shape = tf.shape(input_text)
    cell, initial_state = get_init_cell(input_data_shape[0], rnn_size)
    logits, final_state = build_nn(cell, rnn_size, input_text, vocab_size, embed_dim)

    # Probabilities for generating words
    probs = tf.nn.softmax(logits, name='probs')

    # Loss function
    cost = seq2seq.sequence_loss(
        logits,
        targets,
        tf.ones([input_data_shape[0], input_data_shape[1]]))

    # Optimizer
    optimizer = tf.train.AdamOptimizer(lr)

    # Gradient Clipping
    gradients = optimizer.compute_gradients(cost)
    capped_gradients = [(tf.clip_by_value(grad, -1., 1.), var) for grad, var in gradients if grad is not None]
    train_op = optimizer.apply_gradients(capped_gradients)

Train

Train the neural network on the preprocessed data. If you have a hard time getting a good loss, check the forums to see if anyone is having the same problem.


In [54]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
batches = get_batches(int_text, batch_size, seq_length)

with tf.Session(graph=train_graph) as sess:
    sess.run(tf.global_variables_initializer())

    for epoch_i in range(num_epochs):
        state = sess.run(initial_state, {input_text: batches[0][0]})

        for batch_i, (x, y) in enumerate(batches):
            feed = {
                input_text: x,
                targets: y,
                initial_state: state,
                lr: learning_rate}
            train_loss, state, _ = sess.run([cost, final_state, train_op], feed)

            # Show every <show_every_n_batches> batches
            if (epoch_i * len(batches) + batch_i) % show_every_n_batches == 0:
                print('Epoch {:>3} Batch {:>4}/{}   train_loss = {:.3f}'.format(
                    epoch_i,
                    batch_i,
                    len(batches),
                    train_loss))

    # Save Model
    saver = tf.train.Saver()
    saver.save(sess, save_dir)
    print('Model Trained and Saved')


Epoch   0 Batch    0/8   train_loss = 8.867
Epoch   0 Batch    1/8   train_loss = 17.487
Epoch   0 Batch    2/8   train_loss = 15.607
Epoch   0 Batch    3/8   train_loss = 9.869
Epoch   0 Batch    4/8   train_loss = 9.826
Epoch   0 Batch    5/8   train_loss = 10.125
Epoch   0 Batch    6/8   train_loss = 10.781
Epoch   0 Batch    7/8   train_loss = 10.624
Epoch   1 Batch    0/8   train_loss = 9.363
Epoch   1 Batch    1/8   train_loss = 8.380
Epoch   1 Batch    2/8   train_loss = 8.317
Epoch   1 Batch    3/8   train_loss = 8.249
Epoch   1 Batch    4/8   train_loss = 8.187
Epoch   1 Batch    5/8   train_loss = 7.995
Epoch   1 Batch    6/8   train_loss = 7.803
Epoch   1 Batch    7/8   train_loss = 7.591
Epoch   2 Batch    0/8   train_loss = 6.983
Epoch   2 Batch    1/8   train_loss = 6.812
Epoch   2 Batch    2/8   train_loss = 6.508
Epoch   2 Batch    3/8   train_loss = 6.386
Epoch   2 Batch    4/8   train_loss = 6.434
Epoch   2 Batch    5/8   train_loss = 6.513
Epoch   2 Batch    6/8   train_loss = 6.664
Epoch   2 Batch    7/8   train_loss = 6.578
Epoch   3 Batch    0/8   train_loss = 6.521
Epoch   3 Batch    1/8   train_loss = 6.556
Epoch   3 Batch    2/8   train_loss = 6.476
Epoch   3 Batch    3/8   train_loss = 6.438
Epoch   3 Batch    4/8   train_loss = 6.417
Epoch   3 Batch    5/8   train_loss = 6.329
Epoch   3 Batch    6/8   train_loss = 6.314
Epoch   3 Batch    7/8   train_loss = 6.251
Epoch   4 Batch    0/8   train_loss = 6.129
Epoch   4 Batch    1/8   train_loss = 6.028
Epoch   4 Batch    2/8   train_loss = 6.038
Epoch   4 Batch    3/8   train_loss = 6.044
Epoch   4 Batch    4/8   train_loss = 6.118
Epoch   4 Batch    5/8   train_loss = 6.160
Epoch   4 Batch    6/8   train_loss = 6.180
Epoch   4 Batch    7/8   train_loss = 6.141
Epoch   5 Batch    0/8   train_loss = 6.017
Epoch   5 Batch    1/8   train_loss = 5.993
Epoch   5 Batch    2/8   train_loss = 5.969
Epoch   5 Batch    3/8   train_loss = 5.946
Epoch   5 Batch    4/8   train_loss = 5.945
Epoch   5 Batch    5/8   train_loss = 5.886
Epoch   5 Batch    6/8   train_loss = 5.882
Epoch   5 Batch    7/8   train_loss = 5.828
Epoch   6 Batch    0/8   train_loss = 5.851
Epoch   6 Batch    1/8   train_loss = 5.894
Epoch   6 Batch    2/8   train_loss = 5.877
Epoch   6 Batch    3/8   train_loss = 5.857
Epoch   6 Batch    4/8   train_loss = 5.883
Epoch   6 Batch    5/8   train_loss = 5.858
Epoch   6 Batch    6/8   train_loss = 5.850
Epoch   6 Batch    7/8   train_loss = 5.787
Epoch   7 Batch    0/8   train_loss = 5.758
Epoch   7 Batch    1/8   train_loss = 5.714
Epoch   7 Batch    2/8   train_loss = 5.713
Epoch   7 Batch    3/8   train_loss = 5.687
Epoch   7 Batch    4/8   train_loss = 5.716
Epoch   7 Batch    5/8   train_loss = 5.696
Epoch   7 Batch    6/8   train_loss = 5.692
Epoch   7 Batch    7/8   train_loss = 5.645
Epoch   8 Batch    0/8   train_loss = 5.648
Epoch   8 Batch    1/8   train_loss = 5.670
Epoch   8 Batch    2/8   train_loss = 5.656
Epoch   8 Batch    3/8   train_loss = 5.617
Epoch   8 Batch    4/8   train_loss = 5.640
Epoch   8 Batch    5/8   train_loss = 5.593
Epoch   8 Batch    6/8   train_loss = 5.563
Epoch   8 Batch    7/8   train_loss = 5.495
Epoch   9 Batch    0/8   train_loss = 5.517
Epoch   9 Batch    1/8   train_loss = 5.527
Epoch   9 Batch    2/8   train_loss = 5.499
Epoch   9 Batch    3/8   train_loss = 5.457
Epoch   9 Batch    4/8   train_loss = 5.493
Epoch   9 Batch    5/8   train_loss = 5.477
Epoch   9 Batch    6/8   train_loss = 5.484
Epoch   9 Batch    7/8   train_loss = 5.421
Epoch  10 Batch    0/8   train_loss = 5.407
Epoch  10 Batch    1/8   train_loss = 5.434
Epoch  10 Batch    2/8   train_loss = 5.423
Epoch  10 Batch    3/8   train_loss = 5.394
Epoch  10 Batch    4/8   train_loss = 5.407
Epoch  10 Batch    5/8   train_loss = 5.355
Epoch  10 Batch    6/8   train_loss = 5.353
Epoch  10 Batch    7/8   train_loss = 5.309
Epoch  11 Batch    0/8   train_loss = 5.308
Epoch  11 Batch    1/8   train_loss = 5.321
Epoch  11 Batch    2/8   train_loss = 5.315
Epoch  11 Batch    3/8   train_loss = 5.279
Epoch  11 Batch    4/8   train_loss = 5.324
Epoch  11 Batch    5/8   train_loss = 5.289
Epoch  11 Batch    6/8   train_loss = 5.290
Epoch  11 Batch    7/8   train_loss = 5.230
Epoch  12 Batch    0/8   train_loss = 5.223
Epoch  12 Batch    1/8   train_loss = 5.265
Epoch  12 Batch    2/8   train_loss = 5.260
Epoch  12 Batch    3/8   train_loss = 5.221
Epoch  12 Batch    4/8   train_loss = 5.233
Epoch  12 Batch    5/8   train_loss = 5.201
Epoch  12 Batch    6/8   train_loss = 5.203
Epoch  12 Batch    7/8   train_loss = 5.151
Epoch  13 Batch    0/8   train_loss = 5.134
Epoch  13 Batch    1/8   train_loss = 5.156
Epoch  13 Batch    2/8   train_loss = 5.166
Epoch  13 Batch    3/8   train_loss = 5.129
Epoch  13 Batch    4/8   train_loss = 5.163
Epoch  13 Batch    5/8   train_loss = 5.119
Epoch  13 Batch    6/8   train_loss = 5.115
Epoch  13 Batch    7/8   train_loss = 5.069
Epoch  14 Batch    0/8   train_loss = 5.065
Epoch  14 Batch    1/8   train_loss = 5.098
Epoch  14 Batch    2/8   train_loss = 5.086
Epoch  14 Batch    3/8   train_loss = 5.060
Epoch  14 Batch    4/8   train_loss = 5.078
Epoch  14 Batch    5/8   train_loss = 5.034
Epoch  14 Batch    6/8   train_loss = 5.037
Epoch  14 Batch    7/8   train_loss = 4.979
Epoch  15 Batch    0/8   train_loss = 4.968
Epoch  15 Batch    1/8   train_loss = 5.000
Epoch  15 Batch    2/8   train_loss = 4.994
Epoch  15 Batch    3/8   train_loss = 4.968
Epoch  15 Batch    4/8   train_loss = 4.991
Epoch  15 Batch    5/8   train_loss = 4.947
Epoch  15 Batch    6/8   train_loss = 4.957
Epoch  15 Batch    7/8   train_loss = 4.912
Epoch  16 Batch    0/8   train_loss = 4.897
Epoch  16 Batch    1/8   train_loss = 4.916
Epoch  16 Batch    2/8   train_loss = 4.905
Epoch  16 Batch    3/8   train_loss = 4.884
Epoch  16 Batch    4/8   train_loss = 4.917
Epoch  16 Batch    5/8   train_loss = 4.874
Epoch  16 Batch    6/8   train_loss = 4.880
Epoch  16 Batch    7/8   train_loss = 4.827
Epoch  17 Batch    0/8   train_loss = 4.815
Epoch  17 Batch    1/8   train_loss = 4.844
Epoch  17 Batch    2/8   train_loss = 4.842
Epoch  17 Batch    3/8   train_loss = 4.818
Epoch  17 Batch    4/8   train_loss = 4.844
Epoch  17 Batch    5/8   train_loss = 4.808
Epoch  17 Batch    6/8   train_loss = 4.817
Epoch  17 Batch    7/8   train_loss = 4.760
Epoch  18 Batch    0/8   train_loss = 4.745
Epoch  18 Batch    1/8   train_loss = 4.782
Epoch  18 Batch    2/8   train_loss = 4.766
Epoch  18 Batch    3/8   train_loss = 4.746
Epoch  18 Batch    4/8   train_loss = 4.765
Epoch  18 Batch    5/8   train_loss = 4.735
Epoch  18 Batch    6/8   train_loss = 4.751
Epoch  18 Batch    7/8   train_loss = 4.699
Epoch  19 Batch    0/8   train_loss = 4.679
Epoch  19 Batch    1/8   train_loss = 4.708
Epoch  19 Batch    2/8   train_loss = 4.703
Epoch  19 Batch    3/8   train_loss = 4.690
Epoch  19 Batch    4/8   train_loss = 4.708
Epoch  19 Batch    5/8   train_loss = 4.672
Epoch  19 Batch    6/8   train_loss = 4.680
Epoch  19 Batch    7/8   train_loss = 4.634
Epoch  20 Batch    0/8   train_loss = 4.622
Epoch  20 Batch    1/8   train_loss = 4.655
Epoch  20 Batch    2/8   train_loss = 4.645
Epoch  20 Batch    3/8   train_loss = 4.631
Epoch  20 Batch    4/8   train_loss = 4.642
Epoch  20 Batch    5/8   train_loss = 4.611
Epoch  20 Batch    6/8   train_loss = 4.613
Epoch  20 Batch    7/8   train_loss = 4.570
Epoch  21 Batch    0/8   train_loss = 4.545
Epoch  21 Batch    1/8   train_loss = 4.579
Epoch  21 Batch    2/8   train_loss = 4.579
Epoch  21 Batch    3/8   train_loss = 4.553
Epoch  21 Batch    4/8   train_loss = 4.561
Epoch  21 Batch    5/8   train_loss = 4.530
Epoch  21 Batch    6/8   train_loss = 4.541
Epoch  21 Batch    7/8   train_loss = 4.500
Epoch  22 Batch    0/8   train_loss = 4.483
Epoch  22 Batch    1/8   train_loss = 4.511
Epoch  22 Batch    2/8   train_loss = 4.504
Epoch  22 Batch    3/8   train_loss = 4.506
Epoch  22 Batch    4/8   train_loss = 4.513
Epoch  22 Batch    5/8   train_loss = 4.470
Epoch  22 Batch    6/8   train_loss = 4.469
Epoch  22 Batch    7/8   train_loss = 4.430
Epoch  23 Batch    0/8   train_loss = 4.413
Epoch  23 Batch    1/8   train_loss = 4.439
Epoch  23 Batch    2/8   train_loss = 4.425
Epoch  23 Batch    3/8   train_loss = 4.409
Epoch  23 Batch    4/8   train_loss = 4.430
Epoch  23 Batch    5/8   train_loss = 4.406
Epoch  23 Batch    6/8   train_loss = 4.404
Epoch  23 Batch    7/8   train_loss = 4.363
Epoch  24 Batch    0/8   train_loss = 4.347
Epoch  24 Batch    1/8   train_loss = 4.380
Epoch  24 Batch    2/8   train_loss = 4.368
Epoch  24 Batch    3/8   train_loss = 4.357
Epoch  24 Batch    4/8   train_loss = 4.371
Epoch  24 Batch    5/8   train_loss = 4.353
Epoch  24 Batch    6/8   train_loss = 4.347
Epoch  24 Batch    7/8   train_loss = 4.295
Epoch  25 Batch    0/8   train_loss = 4.276
Epoch  25 Batch    1/8   train_loss = 4.321
Epoch  25 Batch    2/8   train_loss = 4.309
Epoch  25 Batch    3/8   train_loss = 4.283
Epoch  25 Batch    4/8   train_loss = 4.291
Epoch  25 Batch    5/8   train_loss = 4.275
Epoch  25 Batch    6/8   train_loss = 4.293
Epoch  25 Batch    7/8   train_loss = 4.254
Epoch  26 Batch    0/8   train_loss = 4.224
Epoch  26 Batch    1/8   train_loss = 4.249
Epoch  26 Batch    2/8   train_loss = 4.244
Epoch  26 Batch    3/8   train_loss = 4.223
Epoch  26 Batch    4/8   train_loss = 4.238
Epoch  26 Batch    5/8   train_loss = 4.216
Epoch  26 Batch    6/8   train_loss = 4.226
Epoch  26 Batch    7/8   train_loss = 4.190
Epoch  27 Batch    0/8   train_loss = 4.159
Epoch  27 Batch    1/8   train_loss = 4.187
Epoch  27 Batch    2/8   train_loss = 4.172
Epoch  27 Batch    3/8   train_loss = 4.155
Epoch  27 Batch    4/8   train_loss = 4.166
Epoch  27 Batch    5/8   train_loss = 4.149
Epoch  27 Batch    6/8   train_loss = 4.160
Epoch  27 Batch    7/8   train_loss = 4.123
Epoch  28 Batch    0/8   train_loss = 4.103
Epoch  28 Batch    1/8   train_loss = 4.132
Epoch  28 Batch    2/8   train_loss = 4.118
Epoch  28 Batch    3/8   train_loss = 4.114
Epoch  28 Batch    4/8   train_loss = 4.119
Epoch  28 Batch    5/8   train_loss = 4.093
Epoch  28 Batch    6/8   train_loss = 4.101
Epoch  28 Batch    7/8   train_loss = 4.067
Epoch  29 Batch    0/8   train_loss = 4.037
Epoch  29 Batch    1/8   train_loss = 4.068
Epoch  29 Batch    2/8   train_loss = 4.050
Epoch  29 Batch    3/8   train_loss = 4.041
Epoch  29 Batch    4/8   train_loss = 4.055
Epoch  29 Batch    5/8   train_loss = 4.043
Epoch  29 Batch    6/8   train_loss = 4.041
Epoch  29 Batch    7/8   train_loss = 4.004
Epoch  30 Batch    0/8   train_loss = 3.974
Epoch  30 Batch    1/8   train_loss = 4.028
Epoch  30 Batch    2/8   train_loss = 4.015
Epoch  30 Batch    3/8   train_loss = 3.990
Epoch  30 Batch    4/8   train_loss = 4.004
Epoch  30 Batch    5/8   train_loss = 3.996
Epoch  30 Batch    6/8   train_loss = 3.990
Epoch  30 Batch    7/8   train_loss = 3.968
Epoch  31 Batch    0/8   train_loss = 3.948
Epoch  31 Batch    1/8   train_loss = 3.965
Epoch  31 Batch    2/8   train_loss = 3.953
Epoch  31 Batch    3/8   train_loss = 3.943
Epoch  31 Batch    4/8   train_loss = 3.959
Epoch  31 Batch    5/8   train_loss = 3.940
Epoch  31 Batch    6/8   train_loss = 3.938
Epoch  31 Batch    7/8   train_loss = 3.901
Epoch  32 Batch    0/8   train_loss = 3.893
Epoch  32 Batch    1/8   train_loss = 3.927
Epoch  32 Batch    2/8   train_loss = 3.903
Epoch  32 Batch    3/8   train_loss = 3.898
Epoch  32 Batch    4/8   train_loss = 3.913
Epoch  32 Batch    5/8   train_loss = 3.896
Epoch  32 Batch    6/8   train_loss = 3.890
Epoch  32 Batch    7/8   train_loss = 3.862
Epoch  33 Batch    0/8   train_loss = 3.853
Epoch  33 Batch    1/8   train_loss = 3.882
Epoch  33 Batch    2/8   train_loss = 3.859
Epoch  33 Batch    3/8   train_loss = 3.843
Epoch  33 Batch    4/8   train_loss = 3.860
Epoch  33 Batch    5/8   train_loss = 3.845
Epoch  33 Batch    6/8   train_loss = 3.845
Epoch  33 Batch    7/8   train_loss = 3.811
Epoch  34 Batch    0/8   train_loss = 3.781
Epoch  34 Batch    1/8   train_loss = 3.819
Epoch  34 Batch    2/8   train_loss = 3.807
Epoch  34 Batch    3/8   train_loss = 3.787
Epoch  34 Batch    4/8   train_loss = 3.803
Epoch  34 Batch    5/8   train_loss = 3.794
Epoch  34 Batch    6/8   train_loss = 3.787
Epoch  34 Batch    7/8   train_loss = 3.754
Epoch  35 Batch    0/8   train_loss = 3.731
Epoch  35 Batch    1/8   train_loss = 3.775
Epoch  35 Batch    2/8   train_loss = 3.756
Epoch  35 Batch    3/8   train_loss = 3.731
Epoch  35 Batch    4/8   train_loss = 3.750
Epoch  35 Batch    5/8   train_loss = 3.743
Epoch  35 Batch    6/8   train_loss = 3.735
Epoch  35 Batch    7/8   train_loss = 3.713
Epoch  36 Batch    0/8   train_loss = 3.688
Epoch  36 Batch    1/8   train_loss = 3.714
Epoch  36 Batch    2/8   train_loss = 3.708
Epoch  36 Batch    3/8   train_loss = 3.686
Epoch  36 Batch    4/8   train_loss = 3.717
Epoch  36 Batch    5/8   train_loss = 3.705
Epoch  36 Batch    6/8   train_loss = 3.683
Epoch  36 Batch    7/8   train_loss = 3.660
Epoch  37 Batch    0/8   train_loss = 3.649
Epoch  37 Batch    1/8   train_loss = 3.674
Epoch  37 Batch    2/8   train_loss = 3.663
Epoch  37 Batch    3/8   train_loss = 3.633
Epoch  37 Batch    4/8   train_loss = 3.655
Epoch  37 Batch    5/8   train_loss = 3.652
Epoch  37 Batch    6/8   train_loss = 3.646
Epoch  37 Batch    7/8   train_loss = 3.610
Epoch  38 Batch    0/8   train_loss = 3.598
Epoch  38 Batch    1/8   train_loss = 3.618
Epoch  38 Batch    2/8   train_loss = 3.621
Epoch  38 Batch    3/8   train_loss = 3.605
Epoch  38 Batch    4/8   train_loss = 3.612
Epoch  38 Batch    5/8   train_loss = 3.600
Epoch  38 Batch    6/8   train_loss = 3.589
Epoch  38 Batch    7/8   train_loss = 3.566
Epoch  39 Batch    0/8   train_loss = 3.552
Epoch  39 Batch    1/8   train_loss = 3.572
Epoch  39 Batch    2/8   train_loss = 3.561
Epoch  39 Batch    3/8   train_loss = 3.543
Epoch  39 Batch    4/8   train_loss = 3.563
Epoch  39 Batch    5/8   train_loss = 3.558
Epoch  39 Batch    6/8   train_loss = 3.547
Epoch  39 Batch    7/8   train_loss = 3.524
Epoch  40 Batch    0/8   train_loss = 3.506
Epoch  40 Batch    1/8   train_loss = 3.540
Epoch  40 Batch    2/8   train_loss = 3.526
Epoch  40 Batch    3/8   train_loss = 3.501
Epoch  40 Batch    4/8   train_loss = 3.526
Epoch  40 Batch    5/8   train_loss = 3.520
Epoch  40 Batch    6/8   train_loss = 3.515
Epoch  40 Batch    7/8   train_loss = 3.491
Epoch  41 Batch    0/8   train_loss = 3.468
Epoch  41 Batch    1/8   train_loss = 3.499
Epoch  41 Batch    2/8   train_loss = 3.483
Epoch  41 Batch    3/8   train_loss = 3.465
Epoch  41 Batch    4/8   train_loss = 3.490
Epoch  41 Batch    5/8   train_loss = 3.474
Epoch  41 Batch    6/8   train_loss = 3.471
Epoch  41 Batch    7/8   train_loss = 3.435
Epoch  42 Batch    0/8   train_loss = 3.426
Epoch  42 Batch    1/8   train_loss = 3.448
Epoch  42 Batch    2/8   train_loss = 3.431
Epoch  42 Batch    3/8   train_loss = 3.418
Epoch  42 Batch    4/8   train_loss = 3.443
Epoch  42 Batch    5/8   train_loss = 3.431
Epoch  42 Batch    6/8   train_loss = 3.409
Epoch  42 Batch    7/8   train_loss = 3.382
Epoch  43 Batch    0/8   train_loss = 3.374
Epoch  43 Batch    1/8   train_loss = 3.395
Epoch  43 Batch    2/8   train_loss = 3.385
Epoch  43 Batch    3/8   train_loss = 3.366
Epoch  43 Batch    4/8   train_loss = 3.382
Epoch  43 Batch    5/8   train_loss = 3.379
Epoch  43 Batch    6/8   train_loss = 3.367
Epoch  43 Batch    7/8   train_loss = 3.342
Epoch  44 Batch    0/8   train_loss = 3.327
Epoch  44 Batch    1/8   train_loss = 3.346
Epoch  44 Batch    2/8   train_loss = 3.348
Epoch  44 Batch    3/8   train_loss = 3.327
Epoch  44 Batch    4/8   train_loss = 3.343
Epoch  44 Batch    5/8   train_loss = 3.327
Epoch  44 Batch    6/8   train_loss = 3.308
Epoch  44 Batch    7/8   train_loss = 3.296
Epoch  45 Batch    0/8   train_loss = 3.292
Epoch  45 Batch    1/8   train_loss = 3.299
Epoch  45 Batch    2/8   train_loss = 3.296
Epoch  45 Batch    3/8   train_loss = 3.275
Epoch  45 Batch    4/8   train_loss = 3.305
Epoch  45 Batch    5/8   train_loss = 3.295
Epoch  45 Batch    6/8   train_loss = 3.269
Epoch  45 Batch    7/8   train_loss = 3.252
Epoch  46 Batch    0/8   train_loss = 3.245
Epoch  46 Batch    1/8   train_loss = 3.267
Epoch  46 Batch    2/8   train_loss = 3.267
Epoch  46 Batch    3/8   train_loss = 3.239
Epoch  46 Batch    4/8   train_loss = 3.265
Epoch  46 Batch    5/8   train_loss = 3.256
Epoch  46 Batch    6/8   train_loss = 3.244
Epoch  46 Batch    7/8   train_loss = 3.222
Epoch  47 Batch    0/8   train_loss = 3.212
Epoch  47 Batch    1/8   train_loss = 3.233
Epoch  47 Batch    2/8   train_loss = 3.236
Epoch  47 Batch    3/8   train_loss = 3.215
Epoch  47 Batch    4/8   train_loss = 3.243
Epoch  47 Batch    5/8   train_loss = 3.222
Epoch  47 Batch    6/8   train_loss = 3.205
Epoch  47 Batch    7/8   train_loss = 3.190
Epoch  48 Batch    0/8   train_loss = 3.193
Epoch  48 Batch    1/8   train_loss = 3.200
Epoch  48 Batch    2/8   train_loss = 3.190
Epoch  48 Batch    3/8   train_loss = 3.176
Epoch  48 Batch    4/8   train_loss = 3.203
Epoch  48 Batch    5/8   train_loss = 3.198
Epoch  48 Batch    6/8   train_loss = 3.163
Epoch  48 Batch    7/8   train_loss = 3.159
Epoch  49 Batch    0/8   train_loss = 3.165
Epoch  49 Batch    1/8   train_loss = 3.168
Epoch  49 Batch    2/8   train_loss = 3.160
Epoch  49 Batch    3/8   train_loss = 3.137
Epoch  49 Batch    4/8   train_loss = 3.170
Epoch  49 Batch    5/8   train_loss = 3.179
Epoch  49 Batch    6/8   train_loss = 3.149
Epoch  49 Batch    7/8   train_loss = 3.124
Epoch  50 Batch    0/8   train_loss = 3.119
Epoch  50 Batch    1/8   train_loss = 3.135
Epoch  50 Batch    2/8   train_loss = 3.134
Epoch  50 Batch    3/8   train_loss = 3.115
Epoch  50 Batch    4/8   train_loss = 3.134
Epoch  50 Batch    5/8   train_loss = 3.128
Epoch  50 Batch    6/8   train_loss = 3.099
Epoch  50 Batch    7/8   train_loss = 3.088
Epoch  51 Batch    0/8   train_loss = 3.084
Epoch  51 Batch    1/8   train_loss = 3.094
Epoch  51 Batch    2/8   train_loss = 3.078
Epoch  51 Batch    3/8   train_loss = 3.067
Epoch  51 Batch    4/8   train_loss = 3.087
Epoch  51 Batch    5/8   train_loss = 3.085
Epoch  51 Batch    6/8   train_loss = 3.057
Epoch  51 Batch    7/8   train_loss = 3.041
Epoch  52 Batch    0/8   train_loss = 3.038
Epoch  52 Batch    1/8   train_loss = 3.047
Epoch  52 Batch    2/8   train_loss = 3.041
Epoch  52 Batch    3/8   train_loss = 3.029
Epoch  52 Batch    4/8   train_loss = 3.052
Epoch  52 Batch    5/8   train_loss = 3.041
Epoch  52 Batch    6/8   train_loss = 3.007
Epoch  52 Batch    7/8   train_loss = 2.996
Epoch  53 Batch    0/8   train_loss = 3.002
Epoch  53 Batch    1/8   train_loss = 3.013
Epoch  53 Batch    2/8   train_loss = 3.004
Epoch  53 Batch    3/8   train_loss = 2.972
Epoch  53 Batch    4/8   train_loss = 3.007
Epoch  53 Batch    5/8   train_loss = 3.003
Epoch  53 Batch    6/8   train_loss = 2.970
Epoch  53 Batch    7/8   train_loss = 2.968
Epoch  54 Batch    0/8   train_loss = 2.964
Epoch  54 Batch    1/8   train_loss = 2.972
Epoch  54 Batch    2/8   train_loss = 2.969
Epoch  54 Batch    3/8   train_loss = 2.941
Epoch  54 Batch    4/8   train_loss = 2.976
Epoch  54 Batch    5/8   train_loss = 2.963
Epoch  54 Batch    6/8   train_loss = 2.937
Epoch  54 Batch    7/8   train_loss = 2.929
Epoch  55 Batch    0/8   train_loss = 2.938
Epoch  55 Batch    1/8   train_loss = 2.946
Epoch  55 Batch    2/8   train_loss = 2.935
Epoch  55 Batch    3/8   train_loss = 2.901
Epoch  55 Batch    4/8   train_loss = 2.946
Epoch  55 Batch    5/8   train_loss = 2.944
Epoch  55 Batch    6/8   train_loss = 2.911
Epoch  55 Batch    7/8   train_loss = 2.894
Epoch  56 Batch    0/8   train_loss = 2.905
Epoch  56 Batch    1/8   train_loss = 2.914
Epoch  56 Batch    2/8   train_loss = 2.912
Epoch  56 Batch    3/8   train_loss = 2.884
Epoch  56 Batch    4/8   train_loss = 2.907
Epoch  56 Batch    5/8   train_loss = 2.911
Epoch  56 Batch    6/8   train_loss = 2.892
Epoch  56 Batch    7/8   train_loss = 2.873
Epoch  57 Batch    0/8   train_loss = 2.874
Epoch  57 Batch    1/8   train_loss = 2.881
Epoch  57 Batch    2/8   train_loss = 2.882
Epoch  57 Batch    3/8   train_loss = 2.864
Epoch  57 Batch    4/8   train_loss = 2.885
Epoch  57 Batch    5/8   train_loss = 2.886
Epoch  57 Batch    6/8   train_loss = 2.845
Epoch  57 Batch    7/8   train_loss = 2.833
Epoch  58 Batch    0/8   train_loss = 2.845
Epoch  58 Batch    1/8   train_loss = 2.857
Epoch  58 Batch    2/8   train_loss = 2.848
Epoch  58 Batch    3/8   train_loss = 2.825
Epoch  58 Batch    4/8   train_loss = 2.840
Epoch  58 Batch    5/8   train_loss = 2.854
Epoch  58 Batch    6/8   train_loss = 2.813
Epoch  58 Batch    7/8   train_loss = 2.800
Epoch  59 Batch    0/8   train_loss = 2.802
Epoch  59 Batch    1/8   train_loss = 2.810
Epoch  59 Batch    2/8   train_loss = 2.813
Epoch  59 Batch    3/8   train_loss = 2.797
Epoch  59 Batch    4/8   train_loss = 2.806
Epoch  59 Batch    5/8   train_loss = 2.819
Epoch  59 Batch    6/8   train_loss = 2.780
Epoch  59 Batch    7/8   train_loss = 2.774
Epoch  60 Batch    0/8   train_loss = 2.781
Epoch  60 Batch    1/8   train_loss = 2.790
Epoch  60 Batch    2/8   train_loss = 2.785
Epoch  60 Batch    3/8   train_loss = 2.769
Epoch  60 Batch    4/8   train_loss = 2.795
Epoch  60 Batch    5/8   train_loss = 2.798
Epoch  60 Batch    6/8   train_loss = 2.754
Epoch  60 Batch    7/8   train_loss = 2.739
Epoch  61 Batch    0/8   train_loss = 2.748
Epoch  61 Batch    1/8   train_loss = 2.763
Epoch  61 Batch    2/8   train_loss = 2.756
Epoch  61 Batch    3/8   train_loss = 2.730
Epoch  61 Batch    4/8   train_loss = 2.755
Epoch  61 Batch    5/8   train_loss = 2.756
Epoch  61 Batch    6/8   train_loss = 2.716
Epoch  61 Batch    7/8   train_loss = 2.692
Epoch  62 Batch    0/8   train_loss = 2.713
Epoch  62 Batch    1/8   train_loss = 2.727
Epoch  62 Batch    2/8   train_loss = 2.723
Epoch  62 Batch    3/8   train_loss = 2.685
Epoch  62 Batch    4/8   train_loss = 2.705
Epoch  62 Batch    5/8   train_loss = 2.710
Epoch  62 Batch    6/8   train_loss = 2.680
Epoch  62 Batch    7/8   train_loss = 2.672
Epoch  63 Batch    0/8   train_loss = 2.672
Epoch  63 Batch    1/8   train_loss = 2.684
Epoch  63 Batch    2/8   train_loss = 2.681
Epoch  63 Batch    3/8   train_loss = 2.661
Epoch  63 Batch    4/8   train_loss = 2.682
Epoch  63 Batch    5/8   train_loss = 2.685
Epoch  63 Batch    6/8   train_loss = 2.648
Epoch  63 Batch    7/8   train_loss = 2.644
Epoch  64 Batch    0/8   train_loss = 2.654
Epoch  64 Batch    1/8   train_loss = 2.658
Epoch  64 Batch    2/8   train_loss = 2.647
Epoch  64 Batch    3/8   train_loss = 2.631
Epoch  64 Batch    4/8   train_loss = 2.654
Epoch  64 Batch    5/8   train_loss = 2.661
Epoch  64 Batch    6/8   train_loss = 2.617
Epoch  64 Batch    7/8   train_loss = 2.614
Epoch  65 Batch    0/8   train_loss = 2.618
Epoch  65 Batch    1/8   train_loss = 2.628
Epoch  65 Batch    2/8   train_loss = 2.620
Epoch  65 Batch    3/8   train_loss = 2.603
Epoch  65 Batch    4/8   train_loss = 2.615
Epoch  65 Batch    5/8   train_loss = 2.625
Epoch  65 Batch    6/8   train_loss = 2.585
Epoch  65 Batch    7/8   train_loss = 2.582
Epoch  66 Batch    0/8   train_loss = 2.582
Epoch  66 Batch    1/8   train_loss = 2.597
Epoch  66 Batch    2/8   train_loss = 2.586
Epoch  66 Batch    3/8   train_loss = 2.565
Epoch  66 Batch    4/8   train_loss = 2.584
Epoch  66 Batch    5/8   train_loss = 2.592
Epoch  66 Batch    6/8   train_loss = 2.547
Epoch  66 Batch    7/8   train_loss = 2.540
Epoch  67 Batch    0/8   train_loss = 2.547
Epoch  67 Batch    1/8   train_loss = 2.569
Epoch  67 Batch    2/8   train_loss = 2.564
Epoch  67 Batch    3/8   train_loss = 2.538
Epoch  67 Batch    4/8   train_loss = 2.548
Epoch  67 Batch    5/8   train_loss = 2.557
Epoch  67 Batch    6/8   train_loss = 2.522
Epoch  67 Batch    7/8   train_loss = 2.520
Epoch  68 Batch    0/8   train_loss = 2.520
Epoch  68 Batch    1/8   train_loss = 2.532
Epoch  68 Batch    2/8   train_loss = 2.542
Epoch  68 Batch    3/8   train_loss = 2.515
Epoch  68 Batch    4/8   train_loss = 2.530
Epoch  68 Batch    5/8   train_loss = 2.533
Epoch  68 Batch    6/8   train_loss = 2.495
Epoch  68 Batch    7/8   train_loss = 2.488
Epoch  69 Batch    0/8   train_loss = 2.501
Epoch  69 Batch    1/8   train_loss = 2.511
Epoch  69 Batch    2/8   train_loss = 2.512
Epoch  69 Batch    3/8   train_loss = 2.491
Epoch  69 Batch    4/8   train_loss = 2.499
Epoch  69 Batch    5/8   train_loss = 2.508
Epoch  69 Batch    6/8   train_loss = 2.478
Epoch  69 Batch    7/8   train_loss = 2.475
Epoch  70 Batch    0/8   train_loss = 2.472
Epoch  70 Batch    1/8   train_loss = 2.476
Epoch  70 Batch    2/8   train_loss = 2.481
Epoch  70 Batch    3/8   train_loss = 2.468
Epoch  70 Batch    4/8   train_loss = 2.477
Epoch  70 Batch    5/8   train_loss = 2.478
Epoch  70 Batch    6/8   train_loss = 2.438
Epoch  70 Batch    7/8   train_loss = 2.453
Epoch  71 Batch    0/8   train_loss = 2.462
Epoch  71 Batch    1/8   train_loss = 2.455
Epoch  71 Batch    2/8   train_loss = 2.442
Epoch  71 Batch    3/8   train_loss = 2.441
Epoch  71 Batch    4/8   train_loss = 2.467
Epoch  71 Batch    5/8   train_loss = 2.474
Epoch  71 Batch    6/8   train_loss = 2.410
Epoch  71 Batch    7/8   train_loss = 2.415
Epoch  72 Batch    0/8   train_loss = 2.438
Epoch  72 Batch    1/8   train_loss = 2.442
Epoch  72 Batch    2/8   train_loss = 2.430
Epoch  72 Batch    3/8   train_loss = 2.409
Epoch  72 Batch    4/8   train_loss = 2.435
Epoch  72 Batch    5/8   train_loss = 2.446
Epoch  72 Batch    6/8   train_loss = 2.386
Epoch  72 Batch    7/8   train_loss = 2.390
Epoch  73 Batch    0/8   train_loss = 2.407
Epoch  73 Batch    1/8   train_loss = 2.411
Epoch  73 Batch    2/8   train_loss = 2.406
Epoch  73 Batch    3/8   train_loss = 2.380
Epoch  73 Batch    4/8   train_loss = 2.392
Epoch  73 Batch    5/8   train_loss = 2.410
Epoch  73 Batch    6/8   train_loss = 2.367
Epoch  73 Batch    7/8   train_loss = 2.358
Epoch  74 Batch    0/8   train_loss = 2.369
Epoch  74 Batch    1/8   train_loss = 2.375
Epoch  74 Batch    2/8   train_loss = 2.374
Epoch  74 Batch    3/8   train_loss = 2.356
Epoch  74 Batch    4/8   train_loss = 2.363
Epoch  74 Batch    5/8   train_loss = 2.373
Epoch  74 Batch    6/8   train_loss = 2.335
Epoch  74 Batch    7/8   train_loss = 2.333
Epoch  75 Batch    0/8   train_loss = 2.342
Epoch  75 Batch    1/8   train_loss = 2.360
Epoch  75 Batch    2/8   train_loss = 2.342
Epoch  75 Batch    3/8   train_loss = 2.322
Epoch  75 Batch    4/8   train_loss = 2.341
Epoch  75 Batch    5/8   train_loss = 2.344
Epoch  75 Batch    6/8   train_loss = 2.299
Epoch  75 Batch    7/8   train_loss = 2.301
Epoch  76 Batch    0/8   train_loss = 2.314
Epoch  76 Batch    1/8   train_loss = 2.336
Epoch  76 Batch    2/8   train_loss = 2.321
Epoch  76 Batch    3/8   train_loss = 2.292
Epoch  76 Batch    4/8   train_loss = 2.302
Epoch  76 Batch    5/8   train_loss = 2.326
Epoch  76 Batch    6/8   train_loss = 2.279
Epoch  76 Batch    7/8   train_loss = 2.277
Epoch  77 Batch    0/8   train_loss = 2.289
Epoch  77 Batch    1/8   train_loss = 2.303
Epoch  77 Batch    2/8   train_loss = 2.300
Epoch  77 Batch    3/8   train_loss = 2.274
Epoch  77 Batch    4/8   train_loss = 2.285
Epoch  77 Batch    5/8   train_loss = 2.298
Epoch  77 Batch    6/8   train_loss = 2.255
Epoch  77 Batch    7/8   train_loss = 2.256
Epoch  78 Batch    0/8   train_loss = 2.275
Epoch  78 Batch    1/8   train_loss = 2.279
Epoch  78 Batch    2/8   train_loss = 2.279
Epoch  78 Batch    3/8   train_loss = 2.256
Epoch  78 Batch    4/8   train_loss = 2.269
Epoch  78 Batch    5/8   train_loss = 2.291
Epoch  78 Batch    6/8   train_loss = 2.238
Epoch  78 Batch    7/8   train_loss = 2.231
Epoch  79 Batch    0/8   train_loss = 2.256
Epoch  79 Batch    1/8   train_loss = 2.275
Epoch  79 Batch    2/8   train_loss = 2.263
Epoch  79 Batch    3/8   train_loss = 2.234
Epoch  79 Batch    4/8   train_loss = 2.234
Epoch  79 Batch    5/8   train_loss = 2.263
Epoch  79 Batch    6/8   train_loss = 2.233
Epoch  79 Batch    7/8   train_loss = 2.221
Epoch  80 Batch    0/8   train_loss = 2.217
Epoch  80 Batch    1/8   train_loss = 2.236
Epoch  80 Batch    2/8   train_loss = 2.243
Epoch  80 Batch    3/8   train_loss = 2.226
Epoch  80 Batch    4/8   train_loss = 2.217
Epoch  80 Batch    5/8   train_loss = 2.228
Epoch  80 Batch    6/8   train_loss = 2.199
Epoch  80 Batch    7/8   train_loss = 2.202
Epoch  81 Batch    0/8   train_loss = 2.210
Epoch  81 Batch    1/8   train_loss = 2.207
Epoch  81 Batch    2/8   train_loss = 2.208
Epoch  81 Batch    3/8   train_loss = 2.196
Epoch  81 Batch    4/8   train_loss = 2.206
Epoch  81 Batch    5/8   train_loss = 2.212
Epoch  81 Batch    6/8   train_loss = 2.169
Epoch  81 Batch    7/8   train_loss = 2.162
Epoch  82 Batch    0/8   train_loss = 2.190
Epoch  82 Batch    1/8   train_loss = 2.189
Epoch  82 Batch    2/8   train_loss = 2.193
Epoch  82 Batch    3/8   train_loss = 2.172
Epoch  82 Batch    4/8   train_loss = 2.178
Epoch  82 Batch    5/8   train_loss = 2.191
Epoch  82 Batch    6/8   train_loss = 2.159
Epoch  82 Batch    7/8   train_loss = 2.149
Epoch  83 Batch    0/8   train_loss = 2.168
Epoch  83 Batch    1/8   train_loss = 2.170
Epoch  83 Batch    2/8   train_loss = 2.177
Epoch  83 Batch    3/8   train_loss = 2.153
Epoch  83 Batch    4/8   train_loss = 2.162
Epoch  83 Batch    5/8   train_loss = 2.179
Epoch  83 Batch    6/8   train_loss = 2.136
Epoch  83 Batch    7/8   train_loss = 2.133
Epoch  84 Batch    0/8   train_loss = 2.159
Epoch  84 Batch    1/8   train_loss = 2.148
Epoch  84 Batch    2/8   train_loss = 2.146
Epoch  84 Batch    3/8   train_loss = 2.132
Epoch  84 Batch    4/8   train_loss = 2.143
Epoch  84 Batch    5/8   train_loss = 2.159
Epoch  84 Batch    6/8   train_loss = 2.112
Epoch  84 Batch    7/8   train_loss = 2.108
Epoch  85 Batch    0/8   train_loss = 2.136
Epoch  85 Batch    1/8   train_loss = 2.137
Epoch  85 Batch    2/8   train_loss = 2.112
Epoch  85 Batch    3/8   train_loss = 2.094
Epoch  85 Batch    4/8   train_loss = 2.112
Epoch  85 Batch    5/8   train_loss = 2.144
Epoch  85 Batch    6/8   train_loss = 2.096
Epoch  85 Batch    7/8   train_loss = 2.080
Epoch  86 Batch    0/8   train_loss = 2.101
Epoch  86 Batch    1/8   train_loss = 2.112
Epoch  86 Batch    2/8   train_loss = 2.100
Epoch  86 Batch    3/8   train_loss = 2.075
Epoch  86 Batch    4/8   train_loss = 2.080
Epoch  86 Batch    5/8   train_loss = 2.116
Epoch  86 Batch    6/8   train_loss = 2.074
Epoch  86 Batch    7/8   train_loss = 2.059
Epoch  87 Batch    0/8   train_loss = 2.073
Epoch  87 Batch    1/8   train_loss = 2.086
Epoch  87 Batch    2/8   train_loss = 2.083
Epoch  87 Batch    3/8   train_loss = 2.059
Epoch  87 Batch    4/8   train_loss = 2.061
Epoch  87 Batch    5/8   train_loss = 2.093
Epoch  87 Batch    6/8   train_loss = 2.053
Epoch  87 Batch    7/8   train_loss = 2.041
Epoch  88 Batch    0/8   train_loss = 2.063
Epoch  88 Batch    1/8   train_loss = 2.059
Epoch  88 Batch    2/8   train_loss = 2.062
Epoch  88 Batch    3/8   train_loss = 2.039
Epoch  88 Batch    4/8   train_loss = 2.047
Epoch  88 Batch    5/8   train_loss = 2.080
Epoch  88 Batch    6/8   train_loss = 2.030
Epoch  88 Batch    7/8   train_loss = 2.025
Epoch  89 Batch    0/8   train_loss = 2.050
Epoch  89 Batch    1/8   train_loss = 2.044
Epoch  89 Batch    2/8   train_loss = 2.040
Epoch  89 Batch    3/8   train_loss = 2.014
Epoch  89 Batch    4/8   train_loss = 2.015
Epoch  89 Batch    5/8   train_loss = 2.051
Epoch  89 Batch    6/8   train_loss = 2.010
Epoch  89 Batch    7/8   train_loss = 2.009
Epoch  90 Batch    0/8   train_loss = 2.018
Epoch  90 Batch    1/8   train_loss = 2.020
Epoch  90 Batch    2/8   train_loss = 2.022
Epoch  90 Batch    3/8   train_loss = 2.004
Epoch  90 Batch    4/8   train_loss = 1.995
Epoch  90 Batch    5/8   train_loss = 2.026
Epoch  90 Batch    6/8   train_loss = 1.986
Epoch  90 Batch    7/8   train_loss = 1.989
Epoch  91 Batch    0/8   train_loss = 2.004
Epoch  91 Batch    1/8   train_loss = 2.001
Epoch  91 Batch    2/8   train_loss = 1.990
Epoch  91 Batch    3/8   train_loss = 1.976
Epoch  91 Batch    4/8   train_loss = 1.973
Epoch  91 Batch    5/8   train_loss = 2.002
Epoch  91 Batch    6/8   train_loss = 1.962
Epoch  91 Batch    7/8   train_loss = 1.959
Epoch  92 Batch    0/8   train_loss = 1.977
Epoch  92 Batch    1/8   train_loss = 1.985
Epoch  92 Batch    2/8   train_loss = 1.976
Epoch  92 Batch    3/8   train_loss = 1.948
Epoch  92 Batch    4/8   train_loss = 1.962
Epoch  92 Batch    5/8   train_loss = 2.001
Epoch  92 Batch    6/8   train_loss = 1.943
Epoch  92 Batch    7/8   train_loss = 1.935
Epoch  93 Batch    0/8   train_loss = 1.955
Epoch  93 Batch    1/8   train_loss = 1.969
Epoch  93 Batch    2/8   train_loss = 1.974
Epoch  93 Batch    3/8   train_loss = 1.941
Epoch  93 Batch    4/8   train_loss = 1.934
Epoch  93 Batch    5/8   train_loss = 1.975
Epoch  93 Batch    6/8   train_loss = 1.928
Epoch  93 Batch    7/8   train_loss = 1.925
Epoch  94 Batch    0/8   train_loss = 1.944
Epoch  94 Batch    1/8   train_loss = 1.947
Epoch  94 Batch    2/8   train_loss = 1.947
Epoch  94 Batch    3/8   train_loss = 1.921
Epoch  94 Batch    4/8   train_loss = 1.907
Epoch  94 Batch    5/8   train_loss = 1.952
Epoch  94 Batch    6/8   train_loss = 1.898
Epoch  94 Batch    7/8   train_loss = 1.897
Epoch  95 Batch    0/8   train_loss = 1.918
Epoch  95 Batch    1/8   train_loss = 1.921
Epoch  95 Batch    2/8   train_loss = 1.923
Epoch  95 Batch    3/8   train_loss = 1.900
Epoch  95 Batch    4/8   train_loss = 1.887
Epoch  95 Batch    5/8   train_loss = 1.923
Epoch  95 Batch    6/8   train_loss = 1.889
Epoch  95 Batch    7/8   train_loss = 1.890
Epoch  96 Batch    0/8   train_loss = 1.902
Epoch  96 Batch    1/8   train_loss = 1.903
Epoch  96 Batch    2/8   train_loss = 1.905
Epoch  96 Batch    3/8   train_loss = 1.899
Epoch  96 Batch    4/8   train_loss = 1.891
Epoch  96 Batch    5/8   train_loss = 1.912
Epoch  96 Batch    6/8   train_loss = 1.865
Epoch  96 Batch    7/8   train_loss = 1.872
Epoch  97 Batch    0/8   train_loss = 1.907
Epoch  97 Batch    1/8   train_loss = 1.902
Epoch  97 Batch    2/8   train_loss = 1.877
Epoch  97 Batch    3/8   train_loss = 1.873
Epoch  97 Batch    4/8   train_loss = 1.876
Epoch  97 Batch    5/8   train_loss = 1.918
Epoch  97 Batch    6/8   train_loss = 1.848
Epoch  97 Batch    7/8   train_loss = 1.840
Epoch  98 Batch    0/8   train_loss = 1.871
Epoch  98 Batch    1/8   train_loss = 1.895
Epoch  98 Batch    2/8   train_loss = 1.883
Epoch  98 Batch    3/8   train_loss = 1.847
Epoch  98 Batch    4/8   train_loss = 1.839
Epoch  98 Batch    5/8   train_loss = 1.890
Epoch  98 Batch    6/8   train_loss = 1.839
Epoch  98 Batch    7/8   train_loss = 1.826
Epoch  99 Batch    0/8   train_loss = 1.842
Epoch  99 Batch    1/8   train_loss = 1.864
Epoch  99 Batch    2/8   train_loss = 1.866
Epoch  99 Batch    3/8   train_loss = 1.837
Epoch  99 Batch    4/8   train_loss = 1.820
Epoch  99 Batch    5/8   train_loss = 1.870
Epoch  99 Batch    6/8   train_loss = 1.823
Epoch  99 Batch    7/8   train_loss = 1.811
Epoch 100 Batch    0/8   train_loss = 1.828
Epoch 100 Batch    1/8   train_loss = 1.847
Epoch 100 Batch    2/8   train_loss = 1.843
Epoch 100 Batch    3/8   train_loss = 1.814
Epoch 100 Batch    4/8   train_loss = 1.792
Epoch 100 Batch    5/8   train_loss = 1.844
Epoch 100 Batch    6/8   train_loss = 1.797
Epoch 100 Batch    7/8   train_loss = 1.792
Epoch 101 Batch    0/8   train_loss = 1.810
Epoch 101 Batch    1/8   train_loss = 1.819
Epoch 101 Batch    2/8   train_loss = 1.822
Epoch 101 Batch    3/8   train_loss = 1.797
Epoch 101 Batch    4/8   train_loss = 1.787
Epoch 101 Batch    5/8   train_loss = 1.830
Epoch 101 Batch    6/8   train_loss = 1.782
Epoch 101 Batch    7/8   train_loss = 1.781
Epoch 102 Batch    0/8   train_loss = 1.799
Epoch 102 Batch    1/8   train_loss = 1.807
Epoch 102 Batch    2/8   train_loss = 1.807
Epoch 102 Batch    3/8   train_loss = 1.780
Epoch 102 Batch    4/8   train_loss = 1.784
Epoch 102 Batch    5/8   train_loss = 1.822
Epoch 102 Batch    6/8   train_loss = 1.758
Epoch 102 Batch    7/8   train_loss = 1.759
Epoch 103 Batch    0/8   train_loss = 1.788
Epoch 103 Batch    1/8   train_loss = 1.796
Epoch 103 Batch    2/8   train_loss = 1.796
Epoch 103 Batch    3/8   train_loss = 1.762
Epoch 103 Batch    4/8   train_loss = 1.756
Epoch 103 Batch    5/8   train_loss = 1.806
Epoch 103 Batch    6/8   train_loss = 1.763
Epoch 103 Batch    7/8   train_loss = 1.752
Epoch 104 Batch    0/8   train_loss = 1.771
Epoch 104 Batch    1/8   train_loss = 1.778
Epoch 104 Batch    2/8   train_loss = 1.781
Epoch 104 Batch    3/8   train_loss = 1.755
Epoch 104 Batch    4/8   train_loss = 1.734
Epoch 104 Batch    5/8   train_loss = 1.778
Epoch 104 Batch    6/8   train_loss = 1.737
Epoch 104 Batch    7/8   train_loss = 1.738
Epoch 105 Batch    0/8   train_loss = 1.756
Epoch 105 Batch    1/8   train_loss = 1.755
Epoch 105 Batch    2/8   train_loss = 1.743
Epoch 105 Batch    3/8   train_loss = 1.735
Epoch 105 Batch    4/8   train_loss = 1.726
Epoch 105 Batch    5/8   train_loss = 1.762
Epoch 105 Batch    6/8   train_loss = 1.709
Epoch 105 Batch    7/8   train_loss = 1.704
Epoch 106 Batch    0/8   train_loss = 1.733
Epoch 106 Batch    1/8   train_loss = 1.743
Epoch 106 Batch    2/8   train_loss = 1.728
Epoch 106 Batch    3/8   train_loss = 1.698
Epoch 106 Batch    4/8   train_loss = 1.694
Epoch 106 Batch    5/8   train_loss = 1.748
Epoch 106 Batch    6/8   train_loss = 1.695
Epoch 106 Batch    7/8   train_loss = 1.682
Epoch 107 Batch    0/8   train_loss = 1.707
Epoch 107 Batch    1/8   train_loss = 1.717
Epoch 107 Batch    2/8   train_loss = 1.715
Epoch 107 Batch    3/8   train_loss = 1.693
Epoch 107 Batch    4/8   train_loss = 1.675
Epoch 107 Batch    5/8   train_loss = 1.724
Epoch 107 Batch    6/8   train_loss = 1.684
Epoch 107 Batch    7/8   train_loss = 1.670
Epoch 108 Batch    0/8   train_loss = 1.687
Epoch 108 Batch    1/8   train_loss = 1.695
Epoch 108 Batch    2/8   train_loss = 1.700
Epoch 108 Batch    3/8   train_loss = 1.680
Epoch 108 Batch    4/8   train_loss = 1.666
Epoch 108 Batch    5/8   train_loss = 1.698
Epoch 108 Batch    6/8   train_loss = 1.660
Epoch 108 Batch    7/8   train_loss = 1.654
Epoch 109 Batch    0/8   train_loss = 1.693
Epoch 109 Batch    1/8   train_loss = 1.696
Epoch 109 Batch    2/8   train_loss = 1.684
Epoch 109 Batch    3/8   train_loss = 1.656
Epoch 109 Batch    4/8   train_loss = 1.658
Epoch 109 Batch    5/8   train_loss = 1.711
Epoch 109 Batch    6/8   train_loss = 1.660
Epoch 109 Batch    7/8   train_loss = 1.641
Epoch 110 Batch    0/8   train_loss = 1.667
Epoch 110 Batch    1/8   train_loss = 1.686
Epoch 110 Batch    2/8   train_loss = 1.689
Epoch 110 Batch    3/8   train_loss = 1.651
Epoch 110 Batch    4/8   train_loss = 1.634
Epoch 110 Batch    5/8   train_loss = 1.692
Epoch 110 Batch    6/8   train_loss = 1.651
Epoch 110 Batch    7/8   train_loss = 1.644
Epoch 111 Batch    0/8   train_loss = 1.661
Epoch 111 Batch    1/8   train_loss = 1.666
Epoch 111 Batch    2/8   train_loss = 1.656
Epoch 111 Batch    3/8   train_loss = 1.643
Epoch 111 Batch    4/8   train_loss = 1.636
Epoch 111 Batch    5/8   train_loss = 1.673
Epoch 111 Batch    6/8   train_loss = 1.624
Epoch 111 Batch    7/8   train_loss = 1.618
Epoch 112 Batch    0/8   train_loss = 1.653
Epoch 112 Batch    1/8   train_loss = 1.658
Epoch 112 Batch    2/8   train_loss = 1.646
Epoch 112 Batch    3/8   train_loss = 1.612
Epoch 112 Batch    4/8   train_loss = 1.609
Epoch 112 Batch    5/8   train_loss = 1.662
Epoch 112 Batch    6/8   train_loss = 1.618
Epoch 112 Batch    7/8   train_loss = 1.598
Epoch 113 Batch    0/8   train_loss = 1.622
Epoch 113 Batch    1/8   train_loss = 1.633
Epoch 113 Batch    2/8   train_loss = 1.623
Epoch 113 Batch    3/8   train_loss = 1.602
Epoch 113 Batch    4/8   train_loss = 1.593
Epoch 113 Batch    5/8   train_loss = 1.642
Epoch 113 Batch    6/8   train_loss = 1.589
Epoch 113 Batch    7/8   train_loss = 1.573
Epoch 114 Batch    0/8   train_loss = 1.598
Epoch 114 Batch    1/8   train_loss = 1.609
Epoch 114 Batch    2/8   train_loss = 1.603
Epoch 114 Batch    3/8   train_loss = 1.575
Epoch 114 Batch    4/8   train_loss = 1.561
Epoch 114 Batch    5/8   train_loss = 1.625
Epoch 114 Batch    6/8   train_loss = 1.578
Epoch 114 Batch    7/8   train_loss = 1.554
Epoch 115 Batch    0/8   train_loss = 1.575
Epoch 115 Batch    1/8   train_loss = 1.595
Epoch 115 Batch    2/8   train_loss = 1.592
Epoch 115 Batch    3/8   train_loss = 1.570
Epoch 115 Batch    4/8   train_loss = 1.548
Epoch 115 Batch    5/8   train_loss = 1.596
Epoch 115 Batch    6/8   train_loss = 1.557
Epoch 115 Batch    7/8   train_loss = 1.552
Epoch 116 Batch    0/8   train_loss = 1.570
Epoch 116 Batch    1/8   train_loss = 1.565
Epoch 116 Batch    2/8   train_loss = 1.571
Epoch 116 Batch    3/8   train_loss = 1.553
Epoch 116 Batch    4/8   train_loss = 1.546
Epoch 116 Batch    5/8   train_loss = 1.584
Epoch 116 Batch    6/8   train_loss = 1.534
Epoch 116 Batch    7/8   train_loss = 1.527
Epoch 117 Batch    0/8   train_loss = 1.562
Epoch 117 Batch    1/8   train_loss = 1.565
Epoch 117 Batch    2/8   train_loss = 1.562
Epoch 117 Batch    3/8   train_loss = 1.529
Epoch 117 Batch    4/8   train_loss = 1.525
Epoch 117 Batch    5/8   train_loss = 1.578
Epoch 117 Batch    6/8   train_loss = 1.532
Epoch 117 Batch    7/8   train_loss = 1.516
Epoch 118 Batch    0/8   train_loss = 1.540
Epoch 118 Batch    1/8   train_loss = 1.542
Epoch 118 Batch    2/8   train_loss = 1.545
Epoch 118 Batch    3/8   train_loss = 1.527
Epoch 118 Batch    4/8   train_loss = 1.514
Epoch 118 Batch    5/8   train_loss = 1.559
Epoch 118 Batch    6/8   train_loss = 1.509
Epoch 118 Batch    7/8   train_loss = 1.499
Epoch 119 Batch    0/8   train_loss = 1.524
Epoch 119 Batch    1/8   train_loss = 1.536
Epoch 119 Batch    2/8   train_loss = 1.525
Epoch 119 Batch    3/8   train_loss = 1.509
Epoch 119 Batch    4/8   train_loss = 1.488
Epoch 119 Batch    5/8   train_loss = 1.552
Epoch 119 Batch    6/8   train_loss = 1.505
Epoch 119 Batch    7/8   train_loss = 1.488
Epoch 120 Batch    0/8   train_loss = 1.508
Epoch 120 Batch    1/8   train_loss = 1.518
Epoch 120 Batch    2/8   train_loss = 1.516
Epoch 120 Batch    3/8   train_loss = 1.504
Epoch 120 Batch    4/8   train_loss = 1.487
Epoch 120 Batch    5/8   train_loss = 1.537
Epoch 120 Batch    6/8   train_loss = 1.486
Epoch 120 Batch    7/8   train_loss = 1.472
Epoch 121 Batch    0/8   train_loss = 1.499
Epoch 121 Batch    1/8   train_loss = 1.512
Epoch 121 Batch    2/8   train_loss = 1.504
Epoch 121 Batch    3/8   train_loss = 1.477
Epoch 121 Batch    4/8   train_loss = 1.469
Epoch 121 Batch    5/8   train_loss = 1.522
Epoch 121 Batch    6/8   train_loss = 1.474
Epoch 121 Batch    7/8   train_loss = 1.456
Epoch 122 Batch    0/8   train_loss = 1.478
Epoch 122 Batch    1/8   train_loss = 1.490
Epoch 122 Batch    2/8   train_loss = 1.493
Epoch 122 Batch    3/8   train_loss = 1.469
Epoch 122 Batch    4/8   train_loss = 1.449
Epoch 122 Batch    5/8   train_loss = 1.501
Epoch 122 Batch    6/8   train_loss = 1.453
Epoch 122 Batch    7/8   train_loss = 1.446
Epoch 123 Batch    0/8   train_loss = 1.462
Epoch 123 Batch    1/8   train_loss = 1.473
Epoch 123 Batch    2/8   train_loss = 1.466
Epoch 123 Batch    3/8   train_loss = 1.452
Epoch 123 Batch    4/8   train_loss = 1.431
Epoch 123 Batch    5/8   train_loss = 1.493
Epoch 123 Batch    6/8   train_loss = 1.436
Epoch 123 Batch    7/8   train_loss = 1.425
Epoch 124 Batch    0/8   train_loss = 1.459
Epoch 124 Batch    1/8   train_loss = 1.459
Epoch 124 Batch    2/8   train_loss = 1.453
Epoch 124 Batch    3/8   train_loss = 1.435
Epoch 124 Batch    4/8   train_loss = 1.429
Epoch 124 Batch    5/8   train_loss = 1.480
Epoch 124 Batch    6/8   train_loss = 1.425
Epoch 124 Batch    7/8   train_loss = 1.406
Epoch 125 Batch    0/8   train_loss = 1.443
Epoch 125 Batch    1/8   train_loss = 1.454
Epoch 125 Batch    2/8   train_loss = 1.439
Epoch 125 Batch    3/8   train_loss = 1.419
Epoch 125 Batch    4/8   train_loss = 1.407
Epoch 125 Batch    5/8   train_loss = 1.473
Epoch 125 Batch    6/8   train_loss = 1.422
Epoch 125 Batch    7/8   train_loss = 1.392
Epoch 126 Batch    0/8   train_loss = 1.412
Epoch 126 Batch    1/8   train_loss = 1.437
Epoch 126 Batch    2/8   train_loss = 1.437
Epoch 126 Batch    3/8   train_loss = 1.413
Epoch 126 Batch    4/8   train_loss = 1.382
Epoch 126 Batch    5/8   train_loss = 1.435
Epoch 126 Batch    6/8   train_loss = 1.402
Epoch 126 Batch    7/8   train_loss = 1.394
Epoch 127 Batch    0/8   train_loss = 1.409
Epoch 127 Batch    1/8   train_loss = 1.412
Epoch 127 Batch    2/8   train_loss = 1.411
Epoch 127 Batch    3/8   train_loss = 1.401
Epoch 127 Batch    4/8   train_loss = 1.379
Epoch 127 Batch    5/8   train_loss = 1.428
Epoch 127 Batch    6/8   train_loss = 1.385
Epoch 127 Batch    7/8   train_loss = 1.373
Epoch 128 Batch    0/8   train_loss = 1.396
Epoch 128 Batch    1/8   train_loss = 1.402
Epoch 128 Batch    2/8   train_loss = 1.396
Epoch 128 Batch    3/8   train_loss = 1.377
Epoch 128 Batch    4/8   train_loss = 1.357
Epoch 128 Batch    5/8   train_loss = 1.415
Epoch 128 Batch    6/8   train_loss = 1.370
Epoch 128 Batch    7/8   train_loss = 1.366
Epoch 129 Batch    0/8   train_loss = 1.378
Epoch 129 Batch    1/8   train_loss = 1.386
Epoch 129 Batch    2/8   train_loss = 1.391
Epoch 129 Batch    3/8   train_loss = 1.370
Epoch 129 Batch    4/8   train_loss = 1.349
Epoch 129 Batch    5/8   train_loss = 1.405
Epoch 129 Batch    6/8   train_loss = 1.352
Epoch 129 Batch    7/8   train_loss = 1.353
Epoch 130 Batch    0/8   train_loss = 1.373
Epoch 130 Batch    1/8   train_loss = 1.373
Epoch 130 Batch    2/8   train_loss = 1.364
Epoch 130 Batch    3/8   train_loss = 1.350
Epoch 130 Batch    4/8   train_loss = 1.335
Epoch 130 Batch    5/8   train_loss = 1.403
Epoch 130 Batch    6/8   train_loss = 1.341
Epoch 130 Batch    7/8   train_loss = 1.322
Epoch 131 Batch    0/8   train_loss = 1.351
Epoch 131 Batch    1/8   train_loss = 1.359
Epoch 131 Batch    2/8   train_loss = 1.356
Epoch 131 Batch    3/8   train_loss = 1.331
Epoch 131 Batch    4/8   train_loss = 1.306
Epoch 131 Batch    5/8   train_loss = 1.378
Epoch 131 Batch    6/8   train_loss = 1.333
Epoch 131 Batch    7/8   train_loss = 1.311
Epoch 132 Batch    0/8   train_loss = 1.327
Epoch 132 Batch    1/8   train_loss = 1.330
Epoch 132 Batch    2/8   train_loss = 1.345
Epoch 132 Batch    3/8   train_loss = 1.333
Epoch 132 Batch    4/8   train_loss = 1.294
Epoch 132 Batch    5/8   train_loss = 1.351
Epoch 132 Batch    6/8   train_loss = 1.310
Epoch 132 Batch    7/8   train_loss = 1.303
Epoch 133 Batch    0/8   train_loss = 1.325
Epoch 133 Batch    1/8   train_loss = 1.321
Epoch 133 Batch    2/8   train_loss = 1.318
Epoch 133 Batch    3/8   train_loss = 1.312
Epoch 133 Batch    4/8   train_loss = 1.290
Epoch 133 Batch    5/8   train_loss = 1.354
Epoch 133 Batch    6/8   train_loss = 1.296
Epoch 133 Batch    7/8   train_loss = 1.283
Epoch 134 Batch    0/8   train_loss = 1.315
Epoch 134 Batch    1/8   train_loss = 1.317
Epoch 134 Batch    2/8   train_loss = 1.315
Epoch 134 Batch    3/8   train_loss = 1.305
Epoch 134 Batch    4/8   train_loss = 1.267
Epoch 134 Batch    5/8   train_loss = 1.336
Epoch 134 Batch    6/8   train_loss = 1.288
Epoch 134 Batch    7/8   train_loss = 1.281
Epoch 135 Batch    0/8   train_loss = 1.307
Epoch 135 Batch    1/8   train_loss = 1.300
Epoch 135 Batch    2/8   train_loss = 1.300
Epoch 135 Batch    3/8   train_loss = 1.294
Epoch 135 Batch    4/8   train_loss = 1.262
Epoch 135 Batch    5/8   train_loss = 1.328
Epoch 135 Batch    6/8   train_loss = 1.276
Epoch 135 Batch    7/8   train_loss = 1.262
Epoch 136 Batch    0/8   train_loss = 1.299
Epoch 136 Batch    1/8   train_loss = 1.298
Epoch 136 Batch    2/8   train_loss = 1.296
Epoch 136 Batch    3/8   train_loss = 1.277
Epoch 136 Batch    4/8   train_loss = 1.245
Epoch 136 Batch    5/8   train_loss = 1.326
Epoch 136 Batch    6/8   train_loss = 1.281
Epoch 136 Batch    7/8   train_loss = 1.250
Epoch 137 Batch    0/8   train_loss = 1.275
Epoch 137 Batch    1/8   train_loss = 1.285
Epoch 137 Batch    2/8   train_loss = 1.298
Epoch 137 Batch    3/8   train_loss = 1.276
Epoch 137 Batch    4/8   train_loss = 1.231
Epoch 137 Batch    5/8   train_loss = 1.298
Epoch 137 Batch    6/8   train_loss = 1.258
Epoch 137 Batch    7/8   train_loss = 1.251
Epoch 138 Batch    0/8   train_loss = 1.258
Epoch 138 Batch    1/8   train_loss = 1.262
Epoch 138 Batch    2/8   train_loss = 1.275
Epoch 138 Batch    3/8   train_loss = 1.267
Epoch 138 Batch    4/8   train_loss = 1.230
Epoch 138 Batch    5/8   train_loss = 1.284
Epoch 138 Batch    6/8   train_loss = 1.244
Epoch 138 Batch    7/8   train_loss = 1.245
Epoch 139 Batch    0/8   train_loss = 1.249
Epoch 139 Batch    1/8   train_loss = 1.247
Epoch 139 Batch    2/8   train_loss = 1.258
Epoch 139 Batch    3/8   train_loss = 1.251
Epoch 139 Batch    4/8   train_loss = 1.219
Epoch 139 Batch    5/8   train_loss = 1.270
Epoch 139 Batch    6/8   train_loss = 1.223
Epoch 139 Batch    7/8   train_loss = 1.222
Epoch 140 Batch    0/8   train_loss = 1.251
Epoch 140 Batch    1/8   train_loss = 1.240
Epoch 140 Batch    2/8   train_loss = 1.235
Epoch 140 Batch    3/8   train_loss = 1.238
Epoch 140 Batch    4/8   train_loss = 1.212
Epoch 140 Batch    5/8   train_loss = 1.269
Epoch 140 Batch    6/8   train_loss = 1.223
Epoch 140 Batch    7/8   train_loss = 1.211
Epoch 141 Batch    0/8   train_loss = 1.239
Epoch 141 Batch    1/8   train_loss = 1.232
Epoch 141 Batch    2/8   train_loss = 1.231
Epoch 141 Batch    3/8   train_loss = 1.227
Epoch 141 Batch    4/8   train_loss = 1.204
Epoch 141 Batch    5/8   train_loss = 1.265
Epoch 141 Batch    6/8   train_loss = 1.203
Epoch 141 Batch    7/8   train_loss = 1.195
Epoch 142 Batch    0/8   train_loss = 1.226
Epoch 142 Batch    1/8   train_loss = 1.223
Epoch 142 Batch    2/8   train_loss = 1.217
Epoch 142 Batch    3/8   train_loss = 1.211
Epoch 142 Batch    4/8   train_loss = 1.183
Epoch 142 Batch    5/8   train_loss = 1.257
Epoch 142 Batch    6/8   train_loss = 1.202
Epoch 142 Batch    7/8   train_loss = 1.189
Epoch 143 Batch    0/8   train_loss = 1.214
Epoch 143 Batch    1/8   train_loss = 1.205
Epoch 143 Batch    2/8   train_loss = 1.199
Epoch 143 Batch    3/8   train_loss = 1.207
Epoch 143 Batch    4/8   train_loss = 1.182
Epoch 143 Batch    5/8   train_loss = 1.238
Epoch 143 Batch    6/8   train_loss = 1.181
Epoch 143 Batch    7/8   train_loss = 1.177
Epoch 144 Batch    0/8   train_loss = 1.210
Epoch 144 Batch    1/8   train_loss = 1.199
Epoch 144 Batch    2/8   train_loss = 1.187
Epoch 144 Batch    3/8   train_loss = 1.177
Epoch 144 Batch    4/8   train_loss = 1.172
Epoch 144 Batch    5/8   train_loss = 1.229
Epoch 144 Batch    6/8   train_loss = 1.170
Epoch 144 Batch    7/8   train_loss = 1.169
Epoch 145 Batch    0/8   train_loss = 1.189
Epoch 145 Batch    1/8   train_loss = 1.196
Epoch 145 Batch    2/8   train_loss = 1.203
Epoch 145 Batch    3/8   train_loss = 1.177
Epoch 145 Batch    4/8   train_loss = 1.149
Epoch 145 Batch    5/8   train_loss = 1.217
Epoch 145 Batch    6/8   train_loss = 1.172
Epoch 145 Batch    7/8   train_loss = 1.155
Epoch 146 Batch    0/8   train_loss = 1.184
Epoch 146 Batch    1/8   train_loss = 1.182
Epoch 146 Batch    2/8   train_loss = 1.182
Epoch 146 Batch    3/8   train_loss = 1.165
Epoch 146 Batch    4/8   train_loss = 1.150
Epoch 146 Batch    5/8   train_loss = 1.212
Epoch 146 Batch    6/8   train_loss = 1.146
Epoch 146 Batch    7/8   train_loss = 1.136
Epoch 147 Batch    0/8   train_loss = 1.173
Epoch 147 Batch    1/8   train_loss = 1.179
Epoch 147 Batch    2/8   train_loss = 1.165
Epoch 147 Batch    3/8   train_loss = 1.153
Epoch 147 Batch    4/8   train_loss = 1.138
Epoch 147 Batch    5/8   train_loss = 1.208
Epoch 147 Batch    6/8   train_loss = 1.148
Epoch 147 Batch    7/8   train_loss = 1.130
Epoch 148 Batch    0/8   train_loss = 1.162
Epoch 148 Batch    1/8   train_loss = 1.160
Epoch 148 Batch    2/8   train_loss = 1.162
Epoch 148 Batch    3/8   train_loss = 1.152
Epoch 148 Batch    4/8   train_loss = 1.121
Epoch 148 Batch    5/8   train_loss = 1.190
Epoch 148 Batch    6/8   train_loss = 1.133
Epoch 148 Batch    7/8   train_loss = 1.116
Epoch 149 Batch    0/8   train_loss = 1.141
Epoch 149 Batch    1/8   train_loss = 1.144
Epoch 149 Batch    2/8   train_loss = 1.145
Epoch 149 Batch    3/8   train_loss = 1.130
Epoch 149 Batch    4/8   train_loss = 1.097
Epoch 149 Batch    5/8   train_loss = 1.177
Epoch 149 Batch    6/8   train_loss = 1.146
Epoch 149 Batch    7/8   train_loss = 1.112
Epoch 150 Batch    0/8   train_loss = 1.128
Epoch 150 Batch    1/8   train_loss = 1.126
Epoch 150 Batch    2/8   train_loss = 1.148
Epoch 150 Batch    3/8   train_loss = 1.136
Epoch 150 Batch    4/8   train_loss = 1.093
Epoch 150 Batch    5/8   train_loss = 1.167
Epoch 150 Batch    6/8   train_loss = 1.135
Epoch 150 Batch    7/8   train_loss = 1.131
Epoch 151 Batch    0/8   train_loss = 1.128
Epoch 151 Batch    1/8   train_loss = 1.119
Epoch 151 Batch    2/8   train_loss = 1.148
Epoch 151 Batch    3/8   train_loss = 1.143
Epoch 151 Batch    4/8   train_loss = 1.095
Epoch 151 Batch    5/8   train_loss = 1.160
Epoch 151 Batch    6/8   train_loss = 1.126
Epoch 151 Batch    7/8   train_loss = 1.122
Epoch 152 Batch    0/8   train_loss = 1.152
Epoch 152 Batch    1/8   train_loss = 1.108
Epoch 152 Batch    2/8   train_loss = 1.129
Epoch 152 Batch    3/8   train_loss = 1.156
Epoch 152 Batch    4/8   train_loss = 1.102
Epoch 152 Batch    5/8   train_loss = 1.169
Epoch 152 Batch    6/8   train_loss = 1.118
Epoch 152 Batch    7/8   train_loss = 1.111
Epoch 153 Batch    0/8   train_loss = 1.140
Epoch 153 Batch    1/8   train_loss = 1.131
Epoch 153 Batch    2/8   train_loss = 1.117
Epoch 153 Batch    3/8   train_loss = 1.142
Epoch 153 Batch    4/8   train_loss = 1.102
Epoch 153 Batch    5/8   train_loss = 1.168
Epoch 153 Batch    6/8   train_loss = 1.117
Epoch 153 Batch    7/8   train_loss = 1.105
Epoch 154 Batch    0/8   train_loss = 1.144
Epoch 154 Batch    1/8   train_loss = 1.133
Epoch 154 Batch    2/8   train_loss = 1.123
Epoch 154 Batch    3/8   train_loss = 1.136
Epoch 154 Batch    4/8   train_loss = 1.100
Epoch 154 Batch    5/8   train_loss = 1.167
Epoch 154 Batch    6/8   train_loss = 1.116
Epoch 154 Batch    7/8   train_loss = 1.089
Epoch 155 Batch    0/8   train_loss = 1.132
Epoch 155 Batch    1/8   train_loss = 1.124
Epoch 155 Batch    2/8   train_loss = 1.127
Epoch 155 Batch    3/8   train_loss = 1.128
Epoch 155 Batch    4/8   train_loss = 1.081
Epoch 155 Batch    5/8   train_loss = 1.145
Epoch 155 Batch    6/8   train_loss = 1.116
Epoch 155 Batch    7/8   train_loss = 1.086
Epoch 156 Batch    0/8   train_loss = 1.114
Epoch 156 Batch    1/8   train_loss = 1.106
Epoch 156 Batch    2/8   train_loss = 1.113
Epoch 156 Batch    3/8   train_loss = 1.134
Epoch 156 Batch    4/8   train_loss = 1.072
Epoch 156 Batch    5/8   train_loss = 1.120
Epoch 156 Batch    6/8   train_loss = 1.093
Epoch 156 Batch    7/8   train_loss = 1.091
Epoch 157 Batch    0/8   train_loss = 1.090
Epoch 157 Batch    1/8   train_loss = 1.097
Epoch 157 Batch    2/8   train_loss = 1.111
Epoch 157 Batch    3/8   train_loss = 1.134
Epoch 157 Batch    4/8   train_loss = 1.079
Epoch 157 Batch    5/8   train_loss = 1.103
Epoch 157 Batch    6/8   train_loss = 1.069
Epoch 157 Batch    7/8   train_loss = 1.112
Epoch 158 Batch    0/8   train_loss = 1.095
Epoch 158 Batch    1/8   train_loss = 1.071
Epoch 158 Batch    2/8   train_loss = 1.082
Epoch 158 Batch    3/8   train_loss = 1.137
Epoch 158 Batch    4/8   train_loss = 1.096
Epoch 158 Batch    5/8   train_loss = 1.115
Epoch 158 Batch    6/8   train_loss = 1.047
Epoch 158 Batch    7/8   train_loss = 1.065
Epoch 159 Batch    0/8   train_loss = 1.122
Epoch 159 Batch    1/8   train_loss = 1.078
Epoch 159 Batch    2/8   train_loss = 1.068
Epoch 159 Batch    3/8   train_loss = 1.096
Epoch 159 Batch    4/8   train_loss = 1.069
Epoch 159 Batch    5/8   train_loss = 1.123
Epoch 159 Batch    6/8   train_loss = 1.064
Epoch 159 Batch    7/8   train_loss = 1.037
Epoch 160 Batch    0/8   train_loss = 1.086
Epoch 160 Batch    1/8   train_loss = 1.091
Epoch 160 Batch    2/8   train_loss = 1.054
Epoch 160 Batch    3/8   train_loss = 1.068
Epoch 160 Batch    4/8   train_loss = 1.043
Epoch 160 Batch    5/8   train_loss = 1.100
Epoch 160 Batch    6/8   train_loss = 1.048
Epoch 160 Batch    7/8   train_loss = 1.027
Epoch 161 Batch    0/8   train_loss = 1.057
Epoch 161 Batch    1/8   train_loss = 1.079
Epoch 161 Batch    2/8   train_loss = 1.051
Epoch 161 Batch    3/8   train_loss = 1.044
Epoch 161 Batch    4/8   train_loss = 1.029
Epoch 161 Batch    5/8   train_loss = 1.090
Epoch 161 Batch    6/8   train_loss = 1.040
Epoch 161 Batch    7/8   train_loss = 1.026
Epoch 162 Batch    0/8   train_loss = 1.040
Epoch 162 Batch    1/8   train_loss = 1.048
Epoch 162 Batch    2/8   train_loss = 1.052
Epoch 162 Batch    3/8   train_loss = 1.029
Epoch 162 Batch    4/8   train_loss = 0.994
Epoch 162 Batch    5/8   train_loss = 1.070
Epoch 162 Batch    6/8   train_loss = 1.029
Epoch 162 Batch    7/8   train_loss = 1.005
Epoch 163 Batch    0/8   train_loss = 1.029
Epoch 163 Batch    1/8   train_loss = 1.024
Epoch 163 Batch    2/8   train_loss = 1.033
Epoch 163 Batch    3/8   train_loss = 1.028
Epoch 163 Batch    4/8   train_loss = 0.982
Epoch 163 Batch    5/8   train_loss = 1.047
Epoch 163 Batch    6/8   train_loss = 1.020
Epoch 163 Batch    7/8   train_loss = 1.004
Epoch 164 Batch    0/8   train_loss = 1.012
Epoch 164 Batch    1/8   train_loss = 1.012
Epoch 164 Batch    2/8   train_loss = 1.030
Epoch 164 Batch    3/8   train_loss = 1.022
Epoch 164 Batch    4/8   train_loss = 0.983
Epoch 164 Batch    5/8   train_loss = 1.032
Epoch 164 Batch    6/8   train_loss = 1.005
Epoch 164 Batch    7/8   train_loss = 1.007
Epoch 165 Batch    0/8   train_loss = 0.996
Epoch 165 Batch    1/8   train_loss = 1.002
Epoch 165 Batch    2/8   train_loss = 1.027
Epoch 165 Batch    3/8   train_loss = 1.021
Epoch 165 Batch    4/8   train_loss = 0.985
Epoch 165 Batch    5/8   train_loss = 1.033
Epoch 165 Batch    6/8   train_loss = 0.991
Epoch 165 Batch    7/8   train_loss = 1.020
Epoch 166 Batch    0/8   train_loss = 1.003
Epoch 166 Batch    1/8   train_loss = 0.985
Epoch 166 Batch    2/8   train_loss = 1.001
Epoch 166 Batch    3/8   train_loss = 1.032
Epoch 166 Batch    4/8   train_loss = 1.003
Epoch 166 Batch    5/8   train_loss = 1.030
Epoch 166 Batch    6/8   train_loss = 0.984
Epoch 166 Batch    7/8   train_loss = 0.980
Epoch 167 Batch    0/8   train_loss = 1.030
Epoch 167 Batch    1/8   train_loss = 1.005
Epoch 167 Batch    2/8   train_loss = 0.974
Epoch 167 Batch    3/8   train_loss = 1.009
Epoch 167 Batch    4/8   train_loss = 0.997
Epoch 167 Batch    5/8   train_loss = 1.036
Epoch 167 Batch    6/8   train_loss = 1.010
Epoch 167 Batch    7/8   train_loss = 0.975
Epoch 168 Batch    0/8   train_loss = 0.994
Epoch 168 Batch    1/8   train_loss = 1.010
Epoch 168 Batch    2/8   train_loss = 0.993
Epoch 168 Batch    3/8   train_loss = 1.000
Epoch 168 Batch    4/8   train_loss = 0.973
Epoch 168 Batch    5/8   train_loss = 1.010
Epoch 168 Batch    6/8   train_loss = 0.986
Epoch 168 Batch    7/8   train_loss = 0.974
Epoch 169 Batch    0/8   train_loss = 0.997
Epoch 169 Batch    1/8   train_loss = 0.977
Epoch 169 Batch    2/8   train_loss = 0.979
Epoch 169 Batch    3/8   train_loss = 0.993
Epoch 169 Batch    4/8   train_loss = 0.952
Epoch 169 Batch    5/8   train_loss = 1.010
Epoch 169 Batch    6/8   train_loss = 0.961
Epoch 169 Batch    7/8   train_loss = 0.950
Epoch 170 Batch    0/8   train_loss = 0.985
Epoch 170 Batch    1/8   train_loss = 0.961
Epoch 170 Batch    2/8   train_loss = 0.960
Epoch 170 Batch    3/8   train_loss = 0.983
Epoch 170 Batch    4/8   train_loss = 0.934
Epoch 170 Batch    5/8   train_loss = 0.984
Epoch 170 Batch    6/8   train_loss = 0.948
Epoch 170 Batch    7/8   train_loss = 0.946
Epoch 171 Batch    0/8   train_loss = 0.972
Epoch 171 Batch    1/8   train_loss = 0.955
Epoch 171 Batch    2/8   train_loss = 0.940
Epoch 171 Batch    3/8   train_loss = 0.974
Epoch 171 Batch    4/8   train_loss = 0.949
Epoch 171 Batch    5/8   train_loss = 0.981
Epoch 171 Batch    6/8   train_loss = 0.922
Epoch 171 Batch    7/8   train_loss = 0.927
Epoch 172 Batch    0/8   train_loss = 0.973
Epoch 172 Batch    1/8   train_loss = 0.974
Epoch 172 Batch    2/8   train_loss = 0.929
Epoch 172 Batch    3/8   train_loss = 0.946
Epoch 172 Batch    4/8   train_loss = 0.934
Epoch 172 Batch    5/8   train_loss = 0.984
Epoch 172 Batch    6/8   train_loss = 0.921
Epoch 172 Batch    7/8   train_loss = 0.905
Epoch 173 Batch    0/8   train_loss = 0.946
Epoch 173 Batch    1/8   train_loss = 0.966
Epoch 173 Batch    2/8   train_loss = 0.947
Epoch 173 Batch    3/8   train_loss = 0.932
Epoch 173 Batch    4/8   train_loss = 0.910
Epoch 173 Batch    5/8   train_loss = 0.975
Epoch 173 Batch    6/8   train_loss = 0.938
Epoch 173 Batch    7/8   train_loss = 0.901
Epoch 174 Batch    0/8   train_loss = 0.928
Epoch 174 Batch    1/8   train_loss = 0.952
Epoch 174 Batch    2/8   train_loss = 0.957
Epoch 174 Batch    3/8   train_loss = 0.926
Epoch 174 Batch    4/8   train_loss = 0.880
Epoch 174 Batch    5/8   train_loss = 0.964
Epoch 174 Batch    6/8   train_loss = 0.936
Epoch 174 Batch    7/8   train_loss = 0.897
Epoch 175 Batch    0/8   train_loss = 0.917
Epoch 175 Batch    1/8   train_loss = 0.926
Epoch 175 Batch    2/8   train_loss = 0.940
Epoch 175 Batch    3/8   train_loss = 0.937
Epoch 175 Batch    4/8   train_loss = 0.892
Epoch 175 Batch    5/8   train_loss = 0.952
Epoch 175 Batch    6/8   train_loss = 0.916
Epoch 175 Batch    7/8   train_loss = 0.888
Epoch 176 Batch    0/8   train_loss = 0.911
Epoch 176 Batch    1/8   train_loss = 0.914
Epoch 176 Batch    2/8   train_loss = 0.921
Epoch 176 Batch    3/8   train_loss = 0.930
Epoch 176 Batch    4/8   train_loss = 0.868
Epoch 176 Batch    5/8   train_loss = 0.927
Epoch 176 Batch    6/8   train_loss = 0.910
Epoch 176 Batch    7/8   train_loss = 0.882
Epoch 177 Batch    0/8   train_loss = 0.890
Epoch 177 Batch    1/8   train_loss = 0.895
Epoch 177 Batch    2/8   train_loss = 0.906
Epoch 177 Batch    3/8   train_loss = 0.913
Epoch 177 Batch    4/8   train_loss = 0.865
Epoch 177 Batch    5/8   train_loss = 0.912
Epoch 177 Batch    6/8   train_loss = 0.877
Epoch 177 Batch    7/8   train_loss = 0.872
Epoch 178 Batch    0/8   train_loss = 0.884
Epoch 178 Batch    1/8   train_loss = 0.879
Epoch 178 Batch    2/8   train_loss = 0.884
Epoch 178 Batch    3/8   train_loss = 0.900
Epoch 178 Batch    4/8   train_loss = 0.854
Epoch 178 Batch    5/8   train_loss = 0.902
Epoch 178 Batch    6/8   train_loss = 0.865
Epoch 178 Batch    7/8   train_loss = 0.855
Epoch 179 Batch    0/8   train_loss = 0.871
Epoch 179 Batch    1/8   train_loss = 0.862
Epoch 179 Batch    2/8   train_loss = 0.880
Epoch 179 Batch    3/8   train_loss = 0.884
Epoch 179 Batch    4/8   train_loss = 0.844
Epoch 179 Batch    5/8   train_loss = 0.887
Epoch 179 Batch    6/8   train_loss = 0.852
Epoch 179 Batch    7/8   train_loss = 0.852
Epoch 180 Batch    0/8   train_loss = 0.872
Epoch 180 Batch    1/8   train_loss = 0.848
Epoch 180 Batch    2/8   train_loss = 0.873
Epoch 180 Batch    3/8   train_loss = 0.878
Epoch 180 Batch    4/8   train_loss = 0.833
Epoch 180 Batch    5/8   train_loss = 0.883
Epoch 180 Batch    6/8   train_loss = 0.852
Epoch 180 Batch    7/8   train_loss = 0.840
Epoch 181 Batch    0/8   train_loss = 0.877
Epoch 181 Batch    1/8   train_loss = 0.845
Epoch 181 Batch    2/8   train_loss = 0.857
Epoch 181 Batch    3/8   train_loss = 0.881
Epoch 181 Batch    4/8   train_loss = 0.825
Epoch 181 Batch    5/8   train_loss = 0.876
Epoch 181 Batch    6/8   train_loss = 0.858
Epoch 181 Batch    7/8   train_loss = 0.833
Epoch 182 Batch    0/8   train_loss = 0.859
Epoch 182 Batch    1/8   train_loss = 0.845
Epoch 182 Batch    2/8   train_loss = 0.858
Epoch 182 Batch    3/8   train_loss = 0.880
Epoch 182 Batch    4/8   train_loss = 0.828
Epoch 182 Batch    5/8   train_loss = 0.867
Epoch 182 Batch    6/8   train_loss = 0.840
Epoch 182 Batch    7/8   train_loss = 0.841
Epoch 183 Batch    0/8   train_loss = 0.875
Epoch 183 Batch    1/8   train_loss = 0.837
Epoch 183 Batch    2/8   train_loss = 0.834
Epoch 183 Batch    3/8   train_loss = 0.872
Epoch 183 Batch    4/8   train_loss = 0.845
Epoch 183 Batch    5/8   train_loss = 0.871
Epoch 183 Batch    6/8   train_loss = 0.826
Epoch 183 Batch    7/8   train_loss = 0.816
Epoch 184 Batch    0/8   train_loss = 0.875
Epoch 184 Batch    1/8   train_loss = 0.854
Epoch 184 Batch    2/8   train_loss = 0.835
Epoch 184 Batch    3/8   train_loss = 0.839
Epoch 184 Batch    4/8   train_loss = 0.824
Epoch 184 Batch    5/8   train_loss = 0.876
Epoch 184 Batch    6/8   train_loss = 0.843
Epoch 184 Batch    7/8   train_loss = 0.805
Epoch 185 Batch    0/8   train_loss = 0.831
Epoch 185 Batch    1/8   train_loss = 0.838
Epoch 185 Batch    2/8   train_loss = 0.855
Epoch 185 Batch    3/8   train_loss = 0.846
Epoch 185 Batch    4/8   train_loss = 0.806
Epoch 185 Batch    5/8   train_loss = 0.849
Epoch 185 Batch    6/8   train_loss = 0.832
Epoch 185 Batch    7/8   train_loss = 0.830
Epoch 186 Batch    0/8   train_loss = 0.843
Epoch 186 Batch    1/8   train_loss = 0.804
Epoch 186 Batch    2/8   train_loss = 0.833
Epoch 186 Batch    3/8   train_loss = 0.843
Epoch 186 Batch    4/8   train_loss = 0.817
Epoch 186 Batch    5/8   train_loss = 0.845
Epoch 186 Batch    6/8   train_loss = 0.804
Epoch 186 Batch    7/8   train_loss = 0.792
Epoch 187 Batch    0/8   train_loss = 0.851
Epoch 187 Batch    1/8   train_loss = 0.823
Epoch 187 Batch    2/8   train_loss = 0.812
Epoch 187 Batch    3/8   train_loss = 0.836
Epoch 187 Batch    4/8   train_loss = 0.787
Epoch 187 Batch    5/8   train_loss = 0.836
Epoch 187 Batch    6/8   train_loss = 0.819
Epoch 187 Batch    7/8   train_loss = 0.788
Epoch 188 Batch    0/8   train_loss = 0.829
Epoch 188 Batch    1/8   train_loss = 0.820
Epoch 188 Batch    2/8   train_loss = 0.814
Epoch 188 Batch    3/8   train_loss = 0.815
Epoch 188 Batch    4/8   train_loss = 0.801
Epoch 188 Batch    5/8   train_loss = 0.829
Epoch 188 Batch    6/8   train_loss = 0.794
Epoch 188 Batch    7/8   train_loss = 0.782
Epoch 189 Batch    0/8   train_loss = 0.820
Epoch 189 Batch    1/8   train_loss = 0.805
Epoch 189 Batch    2/8   train_loss = 0.803
Epoch 189 Batch    3/8   train_loss = 0.802
Epoch 189 Batch    4/8   train_loss = 0.761
Epoch 189 Batch    5/8   train_loss = 0.822
Epoch 189 Batch    6/8   train_loss = 0.801
Epoch 189 Batch    7/8   train_loss = 0.758
Epoch 190 Batch    0/8   train_loss = 0.780
Epoch 190 Batch    1/8   train_loss = 0.777
Epoch 190 Batch    2/8   train_loss = 0.796
Epoch 190 Batch    3/8   train_loss = 0.799
Epoch 190 Batch    4/8   train_loss = 0.742
Epoch 190 Batch    5/8   train_loss = 0.788
Epoch 190 Batch    6/8   train_loss = 0.774
Epoch 190 Batch    7/8   train_loss = 0.767
Epoch 191 Batch    0/8   train_loss = 0.768
Epoch 191 Batch    1/8   train_loss = 0.747
Epoch 191 Batch    2/8   train_loss = 0.770
Epoch 191 Batch    3/8   train_loss = 0.795
Epoch 191 Batch    4/8   train_loss = 0.741
Epoch 191 Batch    5/8   train_loss = 0.770
Epoch 191 Batch    6/8   train_loss = 0.747
Epoch 191 Batch    7/8   train_loss = 0.739
Epoch 192 Batch    0/8   train_loss = 0.764
Epoch 192 Batch    1/8   train_loss = 0.737
Epoch 192 Batch    2/8   train_loss = 0.737
Epoch 192 Batch    3/8   train_loss = 0.768
Epoch 192 Batch    4/8   train_loss = 0.729
Epoch 192 Batch    5/8   train_loss = 0.760
Epoch 192 Batch    6/8   train_loss = 0.719
Epoch 192 Batch    7/8   train_loss = 0.713
Epoch 193 Batch    0/8   train_loss = 0.748
Epoch 193 Batch    1/8   train_loss = 0.735
Epoch 193 Batch    2/8   train_loss = 0.725
Epoch 193 Batch    3/8   train_loss = 0.738
Epoch 193 Batch    4/8   train_loss = 0.712
Epoch 193 Batch    5/8   train_loss = 0.761
Epoch 193 Batch    6/8   train_loss = 0.721
Epoch 193 Batch    7/8   train_loss = 0.695
Epoch 194 Batch    0/8   train_loss = 0.728
Epoch 194 Batch    1/8   train_loss = 0.722
Epoch 194 Batch    2/8   train_loss = 0.734
Epoch 194 Batch    3/8   train_loss = 0.737
Epoch 194 Batch    4/8   train_loss = 0.682
Epoch 194 Batch    5/8   train_loss = 0.735
Epoch 194 Batch    6/8   train_loss = 0.722
Epoch 194 Batch    7/8   train_loss = 0.706
Epoch 195 Batch    0/8   train_loss = 0.716
Epoch 195 Batch    1/8   train_loss = 0.694
Epoch 195 Batch    2/8   train_loss = 0.716
Epoch 195 Batch    3/8   train_loss = 0.745
Epoch 195 Batch    4/8   train_loss = 0.690
Epoch 195 Batch    5/8   train_loss = 0.715
Epoch 195 Batch    6/8   train_loss = 0.694
Epoch 195 Batch    7/8   train_loss = 0.696
Epoch 196 Batch    0/8   train_loss = 0.721
Epoch 196 Batch    1/8   train_loss = 0.689
Epoch 196 Batch    2/8   train_loss = 0.694
Epoch 196 Batch    3/8   train_loss = 0.727
Epoch 196 Batch    4/8   train_loss = 0.691
Epoch 196 Batch    5/8   train_loss = 0.720
Epoch 196 Batch    6/8   train_loss = 0.690
Epoch 196 Batch    7/8   train_loss = 0.674
Epoch 197 Batch    0/8   train_loss = 0.715
Epoch 197 Batch    1/8   train_loss = 0.698
Epoch 197 Batch    2/8   train_loss = 0.690
Epoch 197 Batch    3/8   train_loss = 0.705
Epoch 197 Batch    4/8   train_loss = 0.674
Epoch 197 Batch    5/8   train_loss = 0.719
Epoch 197 Batch    6/8   train_loss = 0.687
Epoch 197 Batch    7/8   train_loss = 0.660
Epoch 198 Batch    0/8   train_loss = 0.695
Epoch 198 Batch    1/8   train_loss = 0.690
Epoch 198 Batch    2/8   train_loss = 0.692
Epoch 198 Batch    3/8   train_loss = 0.702
Epoch 198 Batch    4/8   train_loss = 0.657
Epoch 198 Batch    5/8   train_loss = 0.706
Epoch 198 Batch    6/8   train_loss = 0.690
Epoch 198 Batch    7/8   train_loss = 0.659
Epoch 199 Batch    0/8   train_loss = 0.682
Epoch 199 Batch    1/8   train_loss = 0.674
Epoch 199 Batch    2/8   train_loss = 0.692
Epoch 199 Batch    3/8   train_loss = 0.708
Epoch 199 Batch    4/8   train_loss = 0.653
Epoch 199 Batch    5/8   train_loss = 0.691
Epoch 199 Batch    6/8   train_loss = 0.677
Epoch 199 Batch    7/8   train_loss = 0.659
Model Trained and Saved

Save Parameters

Save seq_length and save_dir for generating a new TV script.


In [55]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
# Save parameters for checkpoint
helper.save_params((seq_length, save_dir))
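
For reference, all helper.save_params needs to do is persist the (seq_length, save_dir) tuple so that generation can run later in a fresh session. A minimal sketch of what such a helper might look like, assuming pickle serialization and a hypothetical 'params.p' filename (this notebook only shows the call, not the helper's source):

import pickle

def save_params(params):
    # Persist (seq_length, save_dir) for the generation session;
    # the 'params.p' filename is an assumption for illustration
    with open('params.p', 'wb') as f:
        pickle.dump(params, f)

def load_params():
    # Inverse of save_params: restore the (seq_length, save_dir) tuple
    with open('params.p', 'rb') as f:
        return pickle.load(f)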

Checkpoint


In [56]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import tensorflow as tf
import numpy as np
import helper
import problem_unittests as tests

_, vocab_to_int, int_to_vocab, token_dict = helper.load_preprocess()
seq_length, load_dir = helper.load_params()

Implement Generate Functions

Get Tensors

Get tensors from loaded_graph using the function get_tensor_by_name(). Get the tensors using the following names:

  • "input:0"
  • "initial_state:0"
  • "final_state:0"
  • "probs:0"

Return the tensors in the following tuple (InputTensor, InitialStateTensor, FinalStateTensor, ProbsTensor)


In [57]:
def get_tensors(loaded_graph):
    """
    Get input, initial state, final state, and probabilities tensor from <loaded_graph>
    :param loaded_graph: TensorFlow graph loaded from file
    :return: Tuple (InputTensor, InitialStateTensor, FinalStateTensor, ProbsTensor)
    """
    InputTensor = loaded_graph.get_tensor_by_name("input:0")
    InitialStateTensor = loaded_graph.get_tensor_by_name("initial_state:0")
    FinalStateTensor = loaded_graph.get_tensor_by_name("final_state:0")
    ProbsTensor = loaded_graph.get_tensor_by_name("probs:0")
    return InputTensor, InitialStateTensor, FinalStateTensor, ProbsTensor


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_get_tensors(get_tensors)


Tests Passed
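
The ":0" suffix in each name refers to the first output of the op with that name, and these lookups only resolve because the build cells earlier named the corresponding ops. A small illustrative sketch of the naming pattern, using hypothetical placeholder/identity ops rather than the real network:

import tensorflow as tf

g = tf.Graph()
with g.as_default():
    # Naming an op makes its output reachable as "<name>:0"
    tf.placeholder(tf.int32, [None, None], name='input')
    tf.identity(tf.zeros([1, 1, 10]), name='probs')

print(g.get_tensor_by_name('input:0'))  # Tensor("input:0", shape=(?, ?), dtype=int32)
print(g.get_tensor_by_name('probs:0'))  # Tensor("probs:0", shape=(1, 1, 10), dtype=float32)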

Choose Word

Implement the pick_word() function to select the next word using probabilities.


In [58]:
def pick_word(probabilities, int_to_vocab):
    """
    Pick the next word in the generated text
    :param probabilities: Probabilities of the next word
    :param int_to_vocab: Dictionary of word ids as the keys and words as the values
    :return: String of the predicted word
    """
    # Sample from the distribution: walk the cumulative probability until
    # the uniform draw falls inside a word's probability mass
    val = np.random.uniform()
    for i, p in enumerate(probabilities):
        if val < p:
            return int_to_vocab[i]
        val -= p
    # Floating-point rounding can leave a tiny residual; fall back to the
    # last word rather than returning None
    return int_to_vocab[len(probabilities) - 1]


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_pick_word(pick_word)


Tests Passed
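
The cumulative walk above is equivalent to sampling with NumPy directly. A minimal alternative sketch, assuming probabilities is a 1-D array that sums to 1 (np.random.choice raises a ValueError otherwise); pick_word_np is a hypothetical name, not part of the project:

import numpy as np

def pick_word_np(probabilities, int_to_vocab):
    # Sample index i with probability probabilities[i];
    # np.random.choice performs the cumulative walk internally
    idx = np.random.choice(len(probabilities), p=probabilities)
    return int_to_vocab[idx]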

Generate TV Script

This will generate the TV script for you. Set gen_length to the length of the TV script you want to generate.


In [59]:
gen_length = 200
# homer_simpson, moe_szyslak, or Barney_Gumble
prime_word = 'moe_szyslak'

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
loaded_graph = tf.Graph()
with tf.Session(graph=loaded_graph) as sess:
    # Load saved model
    loader = tf.train.import_meta_graph(load_dir + '.meta')
    loader.restore(sess, load_dir)

    # Get Tensors from loaded model
    input_text, initial_state, final_state, probs = get_tensors(loaded_graph)

    # Sentences generation setup
    gen_sentences = [prime_word + ':']
    prev_state = sess.run(initial_state, {input_text: np.array([[1]])})

    # Generate sentences
    for n in range(gen_length):
        # Dynamic Input
        dyn_input = [[vocab_to_int[word] for word in gen_sentences[-seq_length:]]]
        dyn_seq_length = len(dyn_input[0])

        # Get Prediction
        probabilities, prev_state = sess.run(
            [probs, final_state],
            {input_text: dyn_input, initial_state: prev_state})
        
        pred_word = pick_word(probabilities[dyn_seq_length-1], int_to_vocab)

        gen_sentences.append(pred_word)
    
    # Remove tokens
    tv_script = ' '.join(gen_sentences)
    for key, token in token_dict.items():
        ending = ' ' if key in ['\n', '(', '"'] else ''
        tv_script = tv_script.replace(' ' + token.lower(), key)
    tv_script = tv_script.replace('\n ', '\n')
    tv_script = tv_script.replace('( ', '(')
        
    print(tv_script)


moe_szyslak: hell, to be the car.
lenny_leonard: hey, this was...
marge_simpson:(man) this is a ticket, homer, you're gonna never die.
homer_simpson:(" is there it out.
moe_szyslak: aw, that's the guy.
moe_szyslak:(singing) yeah, let me the lost along the card.
homer_simpson:(sing-song) wait a second wait!
homer_simpson: oh, that's the super bowl is, there's no blurbs on the lost in the bar and occupation.
moe_szyslak: it sounds.(to guys) okay. i wanna go out to do it) i am sure!
homer_simpson: a girl!
carl_carlson: which is everyone! not!
moe_szyslak:(on a glasses of underbridge academy!" all right. that's get.


moe_szyslak: it's one, huh?
moe_szyslak:(to moe... yeah, i got a huge settlement... yeah.
moe_szyslak: a little busy. and the other people it is the last way the poster homer ain't up my
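
Note how the generation loop above feeds only a sliding window of the most recent seq_length words back into the network at each step. A tiny standalone illustration of that windowing, with made-up tokens:

seq_length = 3
gen_sentences = ['moe_szyslak:', 'hey', 'homer', 'what', 'gives']
# Only the last seq_length tokens are fed back as the next input
print(gen_sentences[-seq_length:])  # ['homer', 'what', 'gives']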

The TV Script is Nonsensical

It's OK if the TV script doesn't make any sense. We trained on less than a megabyte of text. To get good results, you'll have to use a smaller vocabulary or get more data. Luckily, there's more data! As we mentioned at the beginning of this project, this is a subset of another dataset. We didn't have you train on all the data because that would take too long. However, you are free to train your neural network on all the data (after you complete the project, of course).

Submitting This Project

When submitting this project, make sure to run all the cells before saving the notebook. Save the notebook file as "dlnd_tv_script_generation.ipynb" and save it as an HTML file under "File" -> "Download as". Include the "helper.py" and "problem_unittests.py" files in your submission.