TV Script Generation

In this project, you'll generate your own Simpsons TV scripts using RNNs. You'll be using part of the Simpsons dataset of scripts from 27 seasons. The Neural Network you'll build will generate a new TV script for a scene at Moe's Tavern.

Get the Data

The data is already provided for you. The original project uses only a subset of the dataset: the scenes in Moe's Tavern, not including other versions of the tavern, like "Moe's Cavern", "Flaming Moe's", "Uncle Moe's Family Feed-Bag", etc. The cell below, however, points data_dir at the full normalized-names dataset (simpsons_norm_names_all.csv).


In [50]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import helper
import random

#data_dir = './data/simpsons/moes_tavern_lines.txt'
#data_dir = './data/all/simpsons_all.csv'
data_dir = './data/all/simpsons_norm_names_all.csv'
text = helper.load_data(data_dir)
# Ignore notice, since we don't use it for analysing the data
text = text[81:]

Explore the Data

Play around with view_sentence_range to view different parts of the data.


In [51]:
view_sentence_range = (0, 10)

"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import numpy as np

print('Dataset Stats')
print('Roughly the number of unique words: {}'.format(len({word: None for word in text.split()})))
scenes = text.split('\n\n')
print('Number of scenes: {}'.format(len(scenes)))
sentence_count_scene = [scene.count('\n') for scene in scenes]
print('Average number of sentences in each scene: {}'.format(np.average(sentence_count_scene)))

sentences = [sentence for scene in scenes for sentence in scene.split('\n')]
print('Number of lines: {}'.format(len(sentences)))
word_count_sentence = [len(sentence.split()) for sentence in sentences]
print('Average number of words in each line: {}'.format(np.average(word_count_sentence)))

print()
print('The sentences {} to {}:'.format(*view_sentence_range))
print('\n'.join(text.split('\n')[view_sentence_range[0]:view_sentence_range[1]]))


Dataset Stats
Roughly the number of unique words: 139226
Number of scenes: 1
Average number of sentences in each scene: 157491.0
Number of lines: 157492
Average number of words in each line: 10.649391715134737

The sentences 0 to 10:
n all the magazines and all the news shows, it's only natural that you think you have it.
Lisa_Simpson: (NEAR TEARS) Where's Mr. Bergstrom?
Miss_Hoover: I don't know. Although I'd sure like to talk to him. He didn't touch my lesson plan. What did he teach you?
Lisa_Simpson: That life is worth living.
Edna_Krabappel-Flanders: The polls will be open from now until the end of recess. Now, (SOUR) just in case any of you have decided to put any thought into this, we'll have our final statements. Martin?
Martin_Prince: (HOARSE WHISPER) I don't think there's anything left to say.
Edna_Krabappel-Flanders: Bart?
Bart_Simpson: Victory party under the slide!
(Apartment_Building: Ext. apartment building - day)
Lisa_Simpson: (CALLING) Mr. Bergstrom! Mr. Bergstrom!
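
(Note that with the full-series CSV the scene count collapses to 1: the file apparently contains no blank-line scene separators, so the whole corpus is counted as a single scene, which also explains the enormous "sentences per scene" figure above.)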

Implement Preprocessing Functions

The first thing to do to any dataset is preprocessing. Implement the following preprocessing functions below:

  • Lookup Table
  • Tokenize Punctuation

Lookup Table

To create a word embedding, you first need to transform the words to ids. In this function, create two dictionaries:

  • Dictionary to go from the words to an id; we'll call it vocab_to_int
  • Dictionary to go from the id to the word; we'll call it int_to_vocab

Return these dictionaries in the following tuple (vocab_to_int, int_to_vocab)
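
For example, given the word list ['moe', 'bart', 'moe'], one valid result (here ordering ids by frequency, which is an implementation choice rather than a requirement) would be:

vocab_to_int = {'moe': 0, 'bart': 1}
int_to_vocab = {0: 'moe', 1: 'bart'}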


In [52]:
import numpy as np
from collections import Counter
import problem_unittests as tests

def create_lookup_tables(text, min_count=1):
    """
    Create lookup tables for vocabulary
    :param text: The text of tv scripts split into words
    :param min_count: Minimum number of occurrences for a word to be kept
    :return: A tuple of dicts (vocab_to_int, int_to_vocab)
    """
    # Count word frequencies and drop words seen fewer than min_count times
    word_counts = Counter(text)
    for word in list(word_counts):
        if word_counts[word] < min_count:
            del word_counts[word]

    # Most frequent words get the lowest ids
    sorted_vocab = sorted(word_counts, key=word_counts.get, reverse=True)
    int_to_vocab = {ii: word for ii, word in enumerate(sorted_vocab)}
    vocab_to_int = {word: ii for ii, word in int_to_vocab.items()}
    print(len(sorted_vocab))

    return vocab_to_int, int_to_vocab

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_create_lookup_tables(create_lookup_tables)


71
Tests Passed

Tokenize Punctuation

We'll be splitting the script into a word array using spaces as delimiters. However, punctuation like periods and exclamation marks makes it hard for the neural network to distinguish between the word "bye" and "bye!".

Implement the function token_lookup to return a dict that will be used to tokenize symbols like "!" into "||Exclamation_Mark||". Create a dictionary for the following symbols where the symbol is the key and value is the token:

  • Period ( . )
  • Comma ( , )
  • Quotation Mark ( " )
  • Semicolon ( ; )
  • Exclamation mark ( ! )
  • Question mark ( ? )
  • Left Parenthesis ( ( )
  • Right Parenthesis ( ) )
  • Dash ( -- )
  • Return ( \n )

This dictionary will be used to tokenize the symbols and add the delimiter (space) around them. This separates each symbol into its own word, making it easier for the neural network to predict the next word. Make sure you don't use a token that could be confused with a word. Instead of using the token "dash", try using something like "||dash||".
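
As a rough sketch of how such a dictionary gets applied before the text is split into words (the provided helper presumably does something similar), each symbol is replaced by its token padded with spaces:

token_dict = token_lookup()
for symbol, token in token_dict.items():
    text = text.replace(symbol, ' {} '.format(token))
words = text.split()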


In [53]:
def token_lookup():
    """
    Generate a dict to turn punctuation into a token.
    :return: Tokenize dictionary where the key is the punctuation and the value is the token
    """
    return {
        '.': '__period__',
        ',': '__comma__',
        '"': '__double_quote__',
        ';': '__semi-colon__',
        '!': '__exclamation__',
        '?': '__question__',
        '(': '__open_paren__',
        ')': '__close_paren__',
        '--': '__dash__',
        '\n': '__endline__'
    }

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_tokenize(token_lookup)


Tests Passed

Preprocess all the data and save it

Running the code cell below will preprocess all the data and save it to file.


In [54]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
# Preprocess Training, Validation, and Testing Data
helper.preprocess_and_save_data(data_dir, token_lookup, create_lookup_tables)


62219

Check Point

This is your first checkpoint. If you ever decide to come back to this notebook or have to restart the notebook, you can start from here. The preprocessed data has been saved to disk.


In [55]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import helper
import numpy as np
import problem_unittests as tests

int_text, vocab_to_int, int_to_vocab, token_dict = helper.load_preprocess()

Build the Neural Network

You'll build the components necessary to build an RNN by implementing the following functions:

  • get_inputs
  • get_init_cell
  • get_embed
  • build_rnn
  • build_nn
  • get_batches

Check the Version of TensorFlow and Access to GPU


In [56]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
from distutils.version import LooseVersion
import warnings
import tensorflow as tf

# Check TensorFlow Version
assert LooseVersion(tf.__version__) >= LooseVersion('1.0'), 'Please use TensorFlow version 1.0 or newer'
print('TensorFlow Version: {}'.format(tf.__version__))

# Check for a GPU
if not tf.test.gpu_device_name():
    warnings.warn('No GPU found. Please use a GPU to train your neural network.')
else:
    print('Default GPU Device: {}'.format(tf.test.gpu_device_name()))


TensorFlow Version: 1.0.1
Default GPU Device: /gpu:0

Input

Implement the get_inputs() function to create TF Placeholders for the Neural Network. It should create the following placeholders:

  • Input text placeholder named "input" using the TF Placeholder name parameter.
  • Targets placeholder
  • Learning Rate placeholder

Return the placeholders in the following tuple (Input, Targets, LearningRate)


In [57]:
def get_inputs():
    """
    Create TF Placeholders for input, targets, and learning rate.
    :return: Tuple (input, targets, learning rate)
    """
    input_text = tf.placeholder(tf.int32, [None, None], name='input')
    targets = tf.placeholder(tf.int32, [None, None], name='targets')
    learning_rate = tf.placeholder(tf.float32, name='learning_rate')
    return input_text, targets, learning_rate


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_get_inputs(get_inputs)


Tests Passed
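
Naming the placeholders matters because a saved graph is typically queried by tensor name when it is loaded again, e.g. (loaded_graph here is a hypothetical tf.Graph restored from a checkpoint):

input_tensor = loaded_graph.get_tensor_by_name('input:0')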

Build RNN Cell and Initialize

Stack one or more BasicLSTMCells in a MultiRNNCell.

  • The RNN size should be set using rnn_size
  • Initialize the cell state using the MultiRNNCell's zero_state() function
    • Apply the name "initial_state" to the initial state using tf.identity()

Return the cell and initial state in the following tuple (Cell, InitialState)


In [58]:
def get_init_cell(batch_size, rnn_size, lstm_layers=1, keep_prob=1.0):
    """
    Create an RNN Cell and initialize it.
    :param batch_size: Size of batches
    :param rnn_size: Size of the RNN hidden state
    :param lstm_layers: Number of stacked LSTM layers
    :param keep_prob: Dropout keep probability for each cell
    :return: Tuple (cell, initial_state)
    """
    # A basic LSTM cell
    lstm = tf.contrib.rnn.BasicLSTMCell(rnn_size)

    # Add dropout to the cell
    dropout_wrapper = tf.contrib.rnn.DropoutWrapper(lstm, output_keep_prob=keep_prob)

    # Stack up multiple LSTM layers, for deep learning
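    # (Reusing one wrapped cell object for every layer works in TF 1.0, the version
    #  pinned above; newer TF releases require a separate cell instance per layer.)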
    cell = tf.contrib.rnn.MultiRNNCell([dropout_wrapper] * lstm_layers)
    
    # Getting an initial state of all zeros
    initial_state = tf.identity(cell.zero_state(batch_size, tf.float32), 'initial_state')
    return cell, initial_state

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_get_init_cell(get_init_cell)


Tests Passed

Word Embedding

Apply embedding to input_data using TensorFlow. Return the embedded sequence.


In [59]:
def get_embed(input_data, vocab_size, embed_dim):
    """
    Create embedding for <input_data>.
    :param input_data: TF placeholder for text input.
    :param vocab_size: Number of words in vocabulary.
    :param embed_dim: Number of embedding dimensions
    :return: Embedded input.
    """
    # Equivalent manual approach:
    #   embedding = tf.Variable(tf.random_uniform((vocab_size, embed_dim), -1, 1))
    #   embed = tf.nn.embedding_lookup(embedding, input_data)
    # embed_sequence creates and trains the embedding variable internally:
    return tf.contrib.layers.embed_sequence(
        input_data, vocab_size=vocab_size, embed_dim=embed_dim)

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_get_embed(get_embed)


Tests Passed

Build RNN

You created an RNN Cell in the get_init_cell() function. Time to use the cell to create an RNN.

Return the outputs and final state in the following tuple (Outputs, FinalState)


In [60]:
def build_rnn(cell, inputs):
    """
    Create an RNN using an RNN Cell
    :param cell: RNN Cell
    :param inputs: Input text data
    :return: Tuple (Outputs, Final State)
    """
    # Passing dtype (rather than an explicit initial_state) lets dynamic_rnn create a zero state
    outputs, final_state = tf.nn.dynamic_rnn(
        cell=cell, inputs=inputs, dtype=tf.float32)
    final_state = tf.identity(final_state, 'final_state')
    return outputs, final_state

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_build_rnn(build_rnn)


Tests Passed

Build the Neural Network

Apply the functions you implemented above to:

  • Apply embedding to input_data using your get_embed(input_data, vocab_size, embed_dim) function.
  • Build RNN using cell and your build_rnn(cell, inputs) function.
  • Apply a fully connected layer with a linear activation and vocab_size as the number of outputs.

Return the logits and final state in the following tuple (Logits, FinalState)


In [61]:
def build_nn(cell, rnn_size, input_data, vocab_size, embed_dim):
    """
    Build part of the neural network
    :param cell: RNN cell
    :param rnn_size: Size of the RNN (not used directly here)
    :param input_data: Input data
    :param vocab_size: Vocabulary size
    :param embed_dim: Number of embedding dimensions
    :return: Tuple (Logits, FinalState)
    """
    embedding = get_embed(input_data, vocab_size, embed_dim)
    lstm_outputs, final_state = build_rnn(cell, embedding)

    logits = tf.contrib.layers.fully_connected(
        lstm_outputs,
        vocab_size,
        weights_initializer=tf.truncated_normal_initializer(stddev=0.1),
        biases_initializer=tf.zeros_initializer(),
        activation_fn=None)

    return logits, final_state


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_build_nn(build_nn)


Tests Passed

Batches

Implement get_batches to create batches of input and targets using int_text. The batches should be a Numpy array with the shape (number of batches, 2, batch size, sequence length). Each batch contains two elements:

  • The first element is a single batch of input with the shape [batch size, sequence length]
  • The second element is a single batch of targets with the shape [batch size, sequence length]

If you can't fill the last batch with enough data, drop the last batch.

For example, get_batches([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20], 3, 2) would return a Numpy array of the following:

[
  # First Batch
  [
    # Batch of Input
    [[ 1  2], [ 7  8], [13 14]]
    # Batch of targets
    [[ 2  3], [ 8  9], [14 15]]
  ]

  # Second Batch
  [
    # Batch of Input
    [[ 3  4], [ 9 10], [15 16]]
    # Batch of targets
    [[ 4  5], [10 11], [16 17]]
  ]

  # Third Batch
  [
    # Batch of Input
    [[ 5  6], [11 12], [17 18]]
    # Batch of targets
    [[ 6  7], [12 13], [18  1]]
  ]
]

Notice that the last target value in the last batch is the first input value of the first batch (in this case, 1). This is a common technique used when creating sequence batches, although it is rather unintuitive.


In [62]:
def get_batches(int_text, batch_size, seq_length):
    """
    Return batches of input and target
    :param int_text: Text with the words replaced by their ids
    :param batch_size: The size of batch
    :param seq_length: The length of sequence
    :return: Batches as a Numpy array
    """
    n_batches = (len(int_text)-1)//(batch_size * seq_length)
    int_text = int_text[:n_batches * batch_size * seq_length + 1]

    int_text_input_seq = [int_text[i*seq_length:i*seq_length+seq_length] for i in range(0, n_batches * batch_size)]
    int_text_output = [int_text[i*seq_length+1:i*seq_length+seq_length+1] for i in range(0, n_batches * batch_size)]

    # Wrap the final target value around to the first input word, per the spec above
    int_text_output[-1][-1] = int_text_input_seq[0][0]

    all_data = []
    for row in range(n_batches):
        input_cols = []
        target_cols = []
        for col in range(batch_size):
            input_cols.append(int_text_input_seq[col * n_batches + row])
            target_cols.append(int_text_output[col * n_batches + row])
        all_data.append([input_cols, target_cols])

    return np.array(all_data)

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_get_batches(get_batches)


Tests Passed
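
As a quick sanity check (this snippet is just illustrative, not part of the graded cells), the example from the section above can be reproduced directly:

print(get_batches(list(range(1, 21)), 3, 2))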

Neural Network Training

Hyperparameters

Tune the following parameters:

  • Set num_epochs to the number of epochs.
  • Set batch_size to the batch size.
  • Set rnn_size to the size of the RNNs.
  • Set embed_dim to the size of the text word embeddings.
  • Set seq_length to the length of sequence.
  • Set learning_rate to the learning rate.
  • Set show_every_n_batches to how often, in batches, the neural network should print progress.

In [63]:
# Reminder: tune hyperparameters according to the advice at
# https://nd101.slack.com/messages/C3PJV4741/convo/C3PJV4741-1490412688.590254/


# Number of Epochs
num_epochs = 50
# Batch Size
batch_size = 128
# RNN Size
rnn_size = 512
# Number of embedding dimensions
embed_dim = 300
# Sequence Length
seq_length = 16
# Learning Rate
learning_rate = 0.01
# Show stats for every n number of batches
show_every_n_batches = 25
keep_prob = 1.0
lstm_layers = 2

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
save_dir = './save'
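
For a sense of scale: with batch_size = 128 and seq_length = 16, each batch consumes 128 × 16 = 2,048 word ids, and the training log below reports 1173 batches per epoch, so the preprocessed corpus holds roughly 1173 × 2,048 ≈ 2.4 million tokens.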

Build the Graph

Build the graph using the neural network you implemented.


In [64]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
from tensorflow.contrib import seq2seq

train_graph = tf.Graph()
with train_graph.as_default():
    vocab_size = len(int_to_vocab)
    input_text, targets, lr = get_inputs()
    input_data_shape = tf.shape(input_text)
    cell, initial_state = get_init_cell(input_data_shape[0], rnn_size, lstm_layers=lstm_layers, keep_prob=keep_prob)
    logits, final_state = build_nn(cell, rnn_size, input_text, vocab_size, embed_dim)

    # Probabilities for generating words
    probs = tf.nn.softmax(logits, name='probs')

    # Loss function
    cost = seq2seq.sequence_loss(
        logits,
        targets,
        tf.ones([input_data_shape[0], input_data_shape[1]]))

    # Optimizer
    optimizer = tf.train.AdamOptimizer(lr)

    # Gradient Clipping
    gradients = optimizer.compute_gradients(cost)
    capped_gradients = [(tf.clip_by_value(grad, -1., 1.), var) for grad, var in gradients if grad is not None]
    train_op = optimizer.apply_gradients(capped_gradients)

Train

Train the neural network on the preprocessed data. If you have a hard time getting a good loss, check the forums to see if anyone is having the same problem.


In [65]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
batches = get_batches(int_text, batch_size, seq_length)

with tf.Session(graph=train_graph) as sess:
    sess.run(tf.global_variables_initializer())

    for epoch_i in range(num_epochs):
        state = sess.run(initial_state, {input_text: batches[0][0]})

        for batch_i, (x, y) in enumerate(batches):
            feed = {
                input_text: x,
                targets: y,
                initial_state: state,
                lr: learning_rate}
            train_loss, state, _ = sess.run([cost, final_state, train_op], feed)

            # Show every <show_every_n_batches> batches
            if (epoch_i * len(batches) + batch_i) % show_every_n_batches == 0:
                print('Epoch {:>3} Batch {:>4}/{}   train_loss = {:.3f}'.format(
                    epoch_i,
                    batch_i,
                    len(batches),
                    train_loss))

    # Save Model
    saver = tf.train.Saver()
    saver.save(sess, save_dir)
    print('Model Trained and Saved')


Epoch   0 Batch    0/1173   train_loss = 11.039
Epoch   0 Batch   25/1173   train_loss = 6.307
Epoch   0 Batch   50/1173   train_loss = 5.730
Epoch   0 Batch   75/1173   train_loss = 5.603
Epoch   0 Batch  100/1173   train_loss = 5.301
Epoch   0 Batch  125/1173   train_loss = 5.266
Epoch   0 Batch  150/1173   train_loss = 5.090
Epoch   0 Batch  175/1173   train_loss = 5.056
Epoch   0 Batch  200/1173   train_loss = 5.012
Epoch   0 Batch  225/1173   train_loss = 5.050
Epoch   0 Batch  250/1173   train_loss = 4.836
Epoch   0 Batch  275/1173   train_loss = 5.038
Epoch   0 Batch  300/1173   train_loss = 4.655
Epoch   0 Batch  325/1173   train_loss = 4.738
Epoch   0 Batch  350/1173   train_loss = 4.915
Epoch   0 Batch  375/1173   train_loss = 4.749
Epoch   0 Batch  400/1173   train_loss = 4.825
Epoch   0 Batch  425/1173   train_loss = 4.911
Epoch   0 Batch  450/1173   train_loss = 4.921
Epoch   0 Batch  475/1173   train_loss = 4.750
Epoch   0 Batch  500/1173   train_loss = 4.682
Epoch   0 Batch  525/1173   train_loss = 4.673
Epoch   0 Batch  550/1173   train_loss = 4.615
Epoch   0 Batch  575/1173   train_loss = 4.552
Epoch   0 Batch  600/1173   train_loss = 4.683
Epoch   0 Batch  625/1173   train_loss = 4.623
Epoch   0 Batch  650/1173   train_loss = 4.520
Epoch   0 Batch  675/1173   train_loss = 4.614
Epoch   0 Batch  700/1173   train_loss = 4.506
Epoch   0 Batch  725/1173   train_loss = 4.565
Epoch   0 Batch  750/1173   train_loss = 4.580
Epoch   0 Batch  775/1173   train_loss = 4.462
Epoch   0 Batch  800/1173   train_loss = 4.466
Epoch   0 Batch  825/1173   train_loss = 4.433
Epoch   0 Batch  850/1173   train_loss = 4.683
Epoch   0 Batch  875/1173   train_loss = 4.400
Epoch   0 Batch  900/1173   train_loss = 4.494
Epoch   0 Batch  925/1173   train_loss = 4.682
Epoch   0 Batch  950/1173   train_loss = 4.612
Epoch   0 Batch  975/1173   train_loss = 4.561
Epoch   0 Batch 1000/1173   train_loss = 4.605
Epoch   0 Batch 1025/1173   train_loss = 4.629
Epoch   0 Batch 1050/1173   train_loss = 4.534
Epoch   0 Batch 1075/1173   train_loss = 4.519
Epoch   0 Batch 1100/1173   train_loss = 4.319
Epoch   0 Batch 1125/1173   train_loss = 4.412
Epoch   0 Batch 1150/1173   train_loss = 4.627
Epoch   1 Batch    2/1173   train_loss = 4.477
Epoch   1 Batch   27/1173   train_loss = 4.360
Epoch   1 Batch   52/1173   train_loss = 4.509
Epoch   1 Batch   77/1173   train_loss = 4.342
Epoch   1 Batch  102/1173   train_loss = 4.507
Epoch   1 Batch  127/1173   train_loss = 4.491
Epoch   1 Batch  152/1173   train_loss = 4.526
Epoch   1 Batch  177/1173   train_loss = 4.286
Epoch   1 Batch  202/1173   train_loss = 4.270
Epoch   1 Batch  227/1173   train_loss = 4.254
Epoch   1 Batch  252/1173   train_loss = 4.205
Epoch   1 Batch  277/1173   train_loss = 4.270
Epoch   1 Batch  302/1173   train_loss = 4.192
Epoch   1 Batch  327/1173   train_loss = 4.224
Epoch   1 Batch  352/1173   train_loss = 4.266
Epoch   1 Batch  377/1173   train_loss = 4.353
Epoch   1 Batch  402/1173   train_loss = 4.333
Epoch   1 Batch  427/1173   train_loss = 4.107
Epoch   1 Batch  452/1173   train_loss = 4.397
Epoch   1 Batch  477/1173   train_loss = 4.343
Epoch   1 Batch  502/1173   train_loss = 4.191
Epoch   1 Batch  527/1173   train_loss = 4.211
Epoch   1 Batch  552/1173   train_loss = 4.144
Epoch   1 Batch  577/1173   train_loss = 4.077
Epoch   1 Batch  602/1173   train_loss = 4.242
Epoch   1 Batch  627/1173   train_loss = 4.108
Epoch   1 Batch  652/1173   train_loss = 4.130
Epoch   1 Batch  677/1173   train_loss = 4.286
Epoch   1 Batch  702/1173   train_loss = 4.136
Epoch   1 Batch  727/1173   train_loss = 4.278
Epoch   1 Batch  752/1173   train_loss = 4.070
Epoch   1 Batch  777/1173   train_loss = 4.107
Epoch   1 Batch  802/1173   train_loss = 4.233
Epoch   1 Batch  827/1173   train_loss = 4.072
Epoch   1 Batch  852/1173   train_loss = 4.243
Epoch   1 Batch  877/1173   train_loss = 4.228
Epoch   1 Batch  902/1173   train_loss = 4.211
Epoch   1 Batch  927/1173   train_loss = 4.182
Epoch   1 Batch  952/1173   train_loss = 4.130
Epoch   1 Batch  977/1173   train_loss = 4.212
Epoch   1 Batch 1002/1173   train_loss = 4.285
Epoch   1 Batch 1027/1173   train_loss = 4.338
Epoch   1 Batch 1052/1173   train_loss = 4.175
Epoch   1 Batch 1077/1173   train_loss = 4.156
Epoch   1 Batch 1102/1173   train_loss = 4.224
Epoch   1 Batch 1127/1173   train_loss = 4.123
Epoch   1 Batch 1152/1173   train_loss = 4.045
Epoch   2 Batch    4/1173   train_loss = 3.993
Epoch   2 Batch   29/1173   train_loss = 4.362
Epoch   2 Batch   54/1173   train_loss = 4.235
Epoch   2 Batch   79/1173   train_loss = 4.068
Epoch   2 Batch  104/1173   train_loss = 4.097
Epoch   2 Batch  129/1173   train_loss = 4.126
Epoch   2 Batch  154/1173   train_loss = 4.171
Epoch   2 Batch  179/1173   train_loss = 4.178
Epoch   2 Batch  204/1173   train_loss = 4.170
Epoch   2 Batch  229/1173   train_loss = 4.035
Epoch   2 Batch  254/1173   train_loss = 4.041
Epoch   2 Batch  279/1173   train_loss = 4.051
Epoch   2 Batch  304/1173   train_loss = 4.115
Epoch   2 Batch  329/1173   train_loss = 4.156
Epoch   2 Batch  354/1173   train_loss = 4.046
Epoch   2 Batch  379/1173   train_loss = 4.078
Epoch   2 Batch  404/1173   train_loss = 4.201
Epoch   2 Batch  429/1173   train_loss = 3.916
Epoch   2 Batch  454/1173   train_loss = 4.154
Epoch   2 Batch  479/1173   train_loss = 4.073
Epoch   2 Batch  504/1173   train_loss = 3.991
Epoch   2 Batch  529/1173   train_loss = 4.169
Epoch   2 Batch  554/1173   train_loss = 4.038
Epoch   2 Batch  579/1173   train_loss = 4.193
Epoch   2 Batch  604/1173   train_loss = 4.072
Epoch   2 Batch  629/1173   train_loss = 4.094
Epoch   2 Batch  654/1173   train_loss = 4.091
Epoch   2 Batch  679/1173   train_loss = 4.204
Epoch   2 Batch  704/1173   train_loss = 3.941
Epoch   2 Batch  729/1173   train_loss = 3.953
Epoch   2 Batch  754/1173   train_loss = 4.183
Epoch   2 Batch  779/1173   train_loss = 4.025
Epoch   2 Batch  804/1173   train_loss = 4.150
Epoch   2 Batch  829/1173   train_loss = 4.111
Epoch   2 Batch  854/1173   train_loss = 4.219
Epoch   2 Batch  879/1173   train_loss = 4.035
Epoch   2 Batch  904/1173   train_loss = 3.965
Epoch   2 Batch  929/1173   train_loss = 4.009
Epoch   2 Batch  954/1173   train_loss = 3.895
Epoch   2 Batch  979/1173   train_loss = 3.957
Epoch   2 Batch 1004/1173   train_loss = 4.014
Epoch   2 Batch 1029/1173   train_loss = 4.098
Epoch   2 Batch 1054/1173   train_loss = 3.922
Epoch   2 Batch 1079/1173   train_loss = 4.017
Epoch   2 Batch 1104/1173   train_loss = 4.016
Epoch   2 Batch 1129/1173   train_loss = 3.935
Epoch   2 Batch 1154/1173   train_loss = 3.967
Epoch   3 Batch    6/1173   train_loss = 4.079
Epoch   3 Batch   31/1173   train_loss = 4.012
Epoch   3 Batch   56/1173   train_loss = 4.105
Epoch   3 Batch   81/1173   train_loss = 4.222
Epoch   3 Batch  106/1173   train_loss = 3.978
Epoch   3 Batch  131/1173   train_loss = 4.001
Epoch   3 Batch  156/1173   train_loss = 3.987
Epoch   3 Batch  181/1173   train_loss = 4.028
Epoch   3 Batch  206/1173   train_loss = 3.955
Epoch   3 Batch  231/1173   train_loss = 4.072
Epoch   3 Batch  256/1173   train_loss = 3.923
Epoch   3 Batch  281/1173   train_loss = 3.897
Epoch   3 Batch  306/1173   train_loss = 4.138
Epoch   3 Batch  331/1173   train_loss = 4.033
Epoch   3 Batch  356/1173   train_loss = 3.919
Epoch   3 Batch  381/1173   train_loss = 3.930
Epoch   3 Batch  406/1173   train_loss = 4.096
Epoch   3 Batch  431/1173   train_loss = 3.955
Epoch   3 Batch  456/1173   train_loss = 4.052
Epoch   3 Batch  481/1173   train_loss = 4.031
Epoch   3 Batch  506/1173   train_loss = 3.986
Epoch   3 Batch  531/1173   train_loss = 4.035
Epoch   3 Batch  556/1173   train_loss = 3.859
Epoch   3 Batch  581/1173   train_loss = 3.962
Epoch   3 Batch  606/1173   train_loss = 4.056
Epoch   3 Batch  631/1173   train_loss = 3.995
Epoch   3 Batch  656/1173   train_loss = 4.115
Epoch   3 Batch  681/1173   train_loss = 4.018
Epoch   3 Batch  706/1173   train_loss = 3.946
Epoch   3 Batch  731/1173   train_loss = 4.018
Epoch   3 Batch  756/1173   train_loss = 4.103
Epoch   3 Batch  781/1173   train_loss = 3.939
Epoch   3 Batch  806/1173   train_loss = 3.923
Epoch   3 Batch  831/1173   train_loss = 4.017
Epoch   3 Batch  856/1173   train_loss = 4.084
Epoch   3 Batch  881/1173   train_loss = 3.979
Epoch   3 Batch  906/1173   train_loss = 3.765
Epoch   3 Batch  931/1173   train_loss = 3.927
Epoch   3 Batch  956/1173   train_loss = 3.850
Epoch   3 Batch  981/1173   train_loss = 3.738
Epoch   3 Batch 1006/1173   train_loss = 3.894
Epoch   3 Batch 1031/1173   train_loss = 3.917
Epoch   3 Batch 1056/1173   train_loss = 3.958
Epoch   3 Batch 1081/1173   train_loss = 3.941
Epoch   3 Batch 1106/1173   train_loss = 3.899
Epoch   3 Batch 1131/1173   train_loss = 3.821
Epoch   3 Batch 1156/1173   train_loss = 3.835
Epoch   4 Batch    8/1173   train_loss = 3.827
Epoch   4 Batch   33/1173   train_loss = 3.861
Epoch   4 Batch   58/1173   train_loss = 4.000
Epoch   4 Batch   83/1173   train_loss = 4.008
Epoch   4 Batch  108/1173   train_loss = 3.874
Epoch   4 Batch  133/1173   train_loss = 3.904
Epoch   4 Batch  158/1173   train_loss = 3.733
Epoch   4 Batch  183/1173   train_loss = 3.878
Epoch   4 Batch  208/1173   train_loss = 3.962
Epoch   4 Batch  233/1173   train_loss = 3.856
Epoch   4 Batch  258/1173   train_loss = 3.829
Epoch   4 Batch  283/1173   train_loss = 3.874
Epoch   4 Batch  308/1173   train_loss = 3.832
Epoch   4 Batch  333/1173   train_loss = 4.023
Epoch   4 Batch  358/1173   train_loss = 3.878
Epoch   4 Batch  383/1173   train_loss = 3.876
Epoch   4 Batch  408/1173   train_loss = 4.036
Epoch   4 Batch  433/1173   train_loss = 3.906
Epoch   4 Batch  458/1173   train_loss = 3.907
Epoch   4 Batch  483/1173   train_loss = 3.973
Epoch   4 Batch  508/1173   train_loss = 3.877
Epoch   4 Batch  533/1173   train_loss = 3.898
Epoch   4 Batch  558/1173   train_loss = 3.707
Epoch   4 Batch  583/1173   train_loss = 3.894
Epoch   4 Batch  608/1173   train_loss = 3.941
Epoch   4 Batch  633/1173   train_loss = 3.919
Epoch   4 Batch  658/1173   train_loss = 3.847
Epoch   4 Batch  683/1173   train_loss = 3.781
Epoch   4 Batch  708/1173   train_loss = 3.841
Epoch   4 Batch  733/1173   train_loss = 3.990
Epoch   4 Batch  758/1173   train_loss = 3.938
Epoch   4 Batch  783/1173   train_loss = 4.008
Epoch   4 Batch  808/1173   train_loss = 3.822
Epoch   4 Batch  833/1173   train_loss = 3.857
Epoch   4 Batch  858/1173   train_loss = 3.785
Epoch   4 Batch  883/1173   train_loss = 3.889
Epoch   4 Batch  908/1173   train_loss = 3.806
Epoch   4 Batch  933/1173   train_loss = 3.851
Epoch   4 Batch  958/1173   train_loss = 3.722
Epoch   4 Batch  983/1173   train_loss = 3.790
Epoch   4 Batch 1008/1173   train_loss = 3.831
Epoch   4 Batch 1033/1173   train_loss = 3.873
Epoch   4 Batch 1058/1173   train_loss = 3.932
Epoch   4 Batch 1083/1173   train_loss = 3.894
Epoch   4 Batch 1108/1173   train_loss = 3.694
Epoch   4 Batch 1133/1173   train_loss = 3.772
Epoch   4 Batch 1158/1173   train_loss = 3.721
Epoch   5 Batch   10/1173   train_loss = 3.899
Epoch   5 Batch   35/1173   train_loss = 3.931
Epoch   5 Batch   60/1173   train_loss = 3.959
Epoch   5 Batch   85/1173   train_loss = 3.932
Epoch   5 Batch  110/1173   train_loss = 3.826
Epoch   5 Batch  135/1173   train_loss = 3.863
Epoch   5 Batch  160/1173   train_loss = 3.908
Epoch   5 Batch  185/1173   train_loss = 3.911
Epoch   5 Batch  210/1173   train_loss = 3.760
Epoch   5 Batch  235/1173   train_loss = 3.868
Epoch   5 Batch  260/1173   train_loss = 3.747
Epoch   5 Batch  285/1173   train_loss = 3.822
Epoch   5 Batch  310/1173   train_loss = 3.567
Epoch   5 Batch  335/1173   train_loss = 3.853
Epoch   5 Batch  360/1173   train_loss = 3.813
Epoch   5 Batch  385/1173   train_loss = 3.750
Epoch   5 Batch  410/1173   train_loss = 3.818
Epoch   5 Batch  435/1173   train_loss = 3.735
Epoch   5 Batch  460/1173   train_loss = 3.751
Epoch   5 Batch  485/1173   train_loss = 3.684
Epoch   5 Batch  510/1173   train_loss = 3.913
Epoch   5 Batch  535/1173   train_loss = 3.868
Epoch   5 Batch  560/1173   train_loss = 3.930
Epoch   5 Batch  585/1173   train_loss = 3.729
Epoch   5 Batch  610/1173   train_loss = 3.900
Epoch   5 Batch  635/1173   train_loss = 3.768
Epoch   5 Batch  660/1173   train_loss = 3.729
Epoch   5 Batch  685/1173   train_loss = 3.773
Epoch   5 Batch  710/1173   train_loss = 3.682
Epoch   5 Batch  735/1173   train_loss = 3.670
Epoch   5 Batch  760/1173   train_loss = 3.924
Epoch   5 Batch  785/1173   train_loss = 3.761
Epoch   5 Batch  810/1173   train_loss = 3.908
Epoch   5 Batch  835/1173   train_loss = 3.713
Epoch   5 Batch  860/1173   train_loss = 3.712
Epoch   5 Batch  885/1173   train_loss = 3.826
Epoch   5 Batch  910/1173   train_loss = 3.721
Epoch   5 Batch  935/1173   train_loss = 3.819
Epoch   5 Batch  960/1173   train_loss = 3.790
Epoch   5 Batch  985/1173   train_loss = 3.811
Epoch   5 Batch 1010/1173   train_loss = 3.861
Epoch   5 Batch 1035/1173   train_loss = 3.709
Epoch   5 Batch 1060/1173   train_loss = 3.621
Epoch   5 Batch 1085/1173   train_loss = 3.681
Epoch   5 Batch 1110/1173   train_loss = 3.585
Epoch   5 Batch 1135/1173   train_loss = 3.788
Epoch   5 Batch 1160/1173   train_loss = 3.795
Epoch   6 Batch   12/1173   train_loss = 3.841
Epoch   6 Batch   37/1173   train_loss = 3.776
Epoch   6 Batch   62/1173   train_loss = 3.862
Epoch   6 Batch   87/1173   train_loss = 3.818
Epoch   6 Batch  112/1173   train_loss = 3.795
Epoch   6 Batch  137/1173   train_loss = 3.762
Epoch   6 Batch  162/1173   train_loss = 3.832
Epoch   6 Batch  187/1173   train_loss = 3.800
Epoch   6 Batch  212/1173   train_loss = 3.818
Epoch   6 Batch  237/1173   train_loss = 3.783
Epoch   6 Batch  262/1173   train_loss = 3.842
Epoch   6 Batch  287/1173   train_loss = 3.738
Epoch   6 Batch  312/1173   train_loss = 3.662
Epoch   6 Batch  337/1173   train_loss = 3.790
Epoch   6 Batch  362/1173   train_loss = 3.798
Epoch   6 Batch  387/1173   train_loss = 3.689
Epoch   6 Batch  412/1173   train_loss = 3.602
Epoch   6 Batch  437/1173   train_loss = 3.822
Epoch   6 Batch  462/1173   train_loss = 3.785
Epoch   6 Batch  487/1173   train_loss = 3.676
Epoch   6 Batch  512/1173   train_loss = 3.726
Epoch   6 Batch  537/1173   train_loss = 3.750
Epoch   6 Batch  562/1173   train_loss = 3.792
Epoch   6 Batch  587/1173   train_loss = 3.682
Epoch   6 Batch  612/1173   train_loss = 3.677
Epoch   6 Batch  637/1173   train_loss = 3.711
Epoch   6 Batch  662/1173   train_loss = 3.797
Epoch   6 Batch  687/1173   train_loss = 3.659
Epoch   6 Batch  712/1173   train_loss = 3.659
Epoch   6 Batch  737/1173   train_loss = 3.936
Epoch   6 Batch  762/1173   train_loss = 3.868
Epoch   6 Batch  787/1173   train_loss = 3.814
Epoch   6 Batch  812/1173   train_loss = 3.714
Epoch   6 Batch  837/1173   train_loss = 3.728
Epoch   6 Batch  862/1173   train_loss = 3.697
Epoch   6 Batch  887/1173   train_loss = 3.752
Epoch   6 Batch  912/1173   train_loss = 3.670
Epoch   6 Batch  937/1173   train_loss = 3.779
Epoch   6 Batch  962/1173   train_loss = 3.766
Epoch   6 Batch  987/1173   train_loss = 3.679
Epoch   6 Batch 1012/1173   train_loss = 3.706
Epoch   6 Batch 1037/1173   train_loss = 3.710
Epoch   6 Batch 1062/1173   train_loss = 3.693
Epoch   6 Batch 1087/1173   train_loss = 3.660
Epoch   6 Batch 1112/1173   train_loss = 3.608
Epoch   6 Batch 1137/1173   train_loss = 3.656
Epoch   6 Batch 1162/1173   train_loss = 3.698
Epoch   7 Batch   14/1173   train_loss = 3.737
Epoch   7 Batch   39/1173   train_loss = 3.718
Epoch   7 Batch   64/1173   train_loss = 3.737
Epoch   7 Batch   89/1173   train_loss = 3.828
Epoch   7 Batch  114/1173   train_loss = 3.674
Epoch   7 Batch  139/1173   train_loss = 3.716
Epoch   7 Batch  164/1173   train_loss = 3.703
Epoch   7 Batch  189/1173   train_loss = 3.829
Epoch   7 Batch  214/1173   train_loss = 3.668
Epoch   7 Batch  239/1173   train_loss = 3.759
Epoch   7 Batch  264/1173   train_loss = 3.483
Epoch   7 Batch  289/1173   train_loss = 3.704
Epoch   7 Batch  314/1173   train_loss = 3.827
Epoch   7 Batch  339/1173   train_loss = 3.748
Epoch   7 Batch  364/1173   train_loss = 3.788
Epoch   7 Batch  389/1173   train_loss = 3.565
Epoch   7 Batch  414/1173   train_loss = 3.711
Epoch   7 Batch  439/1173   train_loss = 3.651
Epoch   7 Batch  464/1173   train_loss = 3.774
Epoch   7 Batch  489/1173   train_loss = 3.774
Epoch   7 Batch  514/1173   train_loss = 3.809
Epoch   7 Batch  539/1173   train_loss = 3.779
Epoch   7 Batch  564/1173   train_loss = 3.628
Epoch   7 Batch  589/1173   train_loss = 3.684
Epoch   7 Batch  614/1173   train_loss = 3.637
Epoch   7 Batch  639/1173   train_loss = 3.711
Epoch   7 Batch  664/1173   train_loss = 3.651
Epoch   7 Batch  689/1173   train_loss = 3.589
Epoch   7 Batch  714/1173   train_loss = 3.571
Epoch   7 Batch  739/1173   train_loss = 3.669
Epoch   7 Batch  764/1173   train_loss = 3.630
Epoch   7 Batch  789/1173   train_loss = 3.708
Epoch   7 Batch  814/1173   train_loss = 3.578
Epoch   7 Batch  839/1173   train_loss = 3.666
Epoch   7 Batch  864/1173   train_loss = 3.635
Epoch   7 Batch  889/1173   train_loss = 3.684
Epoch   7 Batch  914/1173   train_loss = 3.640
Epoch   7 Batch  939/1173   train_loss = 3.674
Epoch   7 Batch  964/1173   train_loss = 3.658
Epoch   7 Batch  989/1173   train_loss = 3.630
Epoch   7 Batch 1014/1173   train_loss = 3.650
Epoch   7 Batch 1039/1173   train_loss = 3.680
Epoch   7 Batch 1064/1173   train_loss = 3.443
Epoch   7 Batch 1089/1173   train_loss = 3.740
Epoch   7 Batch 1114/1173   train_loss = 3.527
Epoch   7 Batch 1139/1173   train_loss = 3.632
Epoch   7 Batch 1164/1173   train_loss = 3.498
Epoch   8 Batch   16/1173   train_loss = 3.665
Epoch   8 Batch   41/1173   train_loss = 3.524
Epoch   8 Batch   66/1173   train_loss = 3.589
Epoch   8 Batch   91/1173   train_loss = 3.808
Epoch   8 Batch  116/1173   train_loss = 3.785
Epoch   8 Batch  141/1173   train_loss = 3.630
Epoch   8 Batch  166/1173   train_loss = 3.643
Epoch   8 Batch  191/1173   train_loss = 3.529
Epoch   8 Batch  216/1173   train_loss = 3.632
Epoch   8 Batch  241/1173   train_loss = 3.604
Epoch   8 Batch  266/1173   train_loss = 3.547
Epoch   8 Batch  291/1173   train_loss = 3.690
Epoch   8 Batch  316/1173   train_loss = 3.673
Epoch   8 Batch  341/1173   train_loss = 3.747
Epoch   8 Batch  366/1173   train_loss = 3.636
Epoch   8 Batch  391/1173   train_loss = 3.695
Epoch   8 Batch  416/1173   train_loss = 3.713
Epoch   8 Batch  441/1173   train_loss = 3.667
Epoch   8 Batch  466/1173   train_loss = 3.594
Epoch   8 Batch  491/1173   train_loss = 3.514
Epoch   8 Batch  516/1173   train_loss = 3.724
Epoch   8 Batch  541/1173   train_loss = 3.688
Epoch   8 Batch  566/1173   train_loss = 3.788
Epoch   8 Batch  591/1173   train_loss = 3.630
Epoch   8 Batch  616/1173   train_loss = 3.620
Epoch   8 Batch  641/1173   train_loss = 3.613
Epoch   8 Batch  666/1173   train_loss = 3.619
Epoch   8 Batch  691/1173   train_loss = 3.516
Epoch   8 Batch  716/1173   train_loss = 3.651
Epoch   8 Batch  741/1173   train_loss = 3.801
Epoch   8 Batch  766/1173   train_loss = 3.722
Epoch   8 Batch  791/1173   train_loss = 3.657
Epoch   8 Batch  816/1173   train_loss = 3.694
Epoch   8 Batch  841/1173   train_loss = 3.715
Epoch   8 Batch  866/1173   train_loss = 3.595
Epoch   8 Batch  891/1173   train_loss = 3.619
Epoch   8 Batch  916/1173   train_loss = 3.657
Epoch   8 Batch  941/1173   train_loss = 3.503
Epoch   8 Batch  966/1173   train_loss = 3.691
Epoch   8 Batch  991/1173   train_loss = 3.710
Epoch   8 Batch 1016/1173   train_loss = 3.506
Epoch   8 Batch 1041/1173   train_loss = 3.539
Epoch   8 Batch 1066/1173   train_loss = 3.626
Epoch   8 Batch 1091/1173   train_loss = 3.520
Epoch   8 Batch 1116/1173   train_loss = 3.642
Epoch   8 Batch 1141/1173   train_loss = 3.673
Epoch   8 Batch 1166/1173   train_loss = 3.647
Epoch   9 Batch   18/1173   train_loss = 3.692
Epoch   9 Batch   43/1173   train_loss = 3.571
Epoch   9 Batch   68/1173   train_loss = 3.673
Epoch   9 Batch   93/1173   train_loss = 3.715
Epoch   9 Batch  118/1173   train_loss = 3.782
Epoch   9 Batch  143/1173   train_loss = 3.603
Epoch   9 Batch  168/1173   train_loss = 3.394
Epoch   9 Batch  193/1173   train_loss = 3.538
Epoch   9 Batch  218/1173   train_loss = 3.500
Epoch   9 Batch  243/1173   train_loss = 3.620
Epoch   9 Batch  268/1173   train_loss = 3.537
Epoch   9 Batch  293/1173   train_loss = 3.727
Epoch   9 Batch  318/1173   train_loss = 3.538
Epoch   9 Batch  343/1173   train_loss = 3.662
Epoch   9 Batch  368/1173   train_loss = 3.580
Epoch   9 Batch  393/1173   train_loss = 3.691
Epoch   9 Batch  418/1173   train_loss = 3.608
Epoch   9 Batch  443/1173   train_loss = 3.692
Epoch   9 Batch  468/1173   train_loss = 3.630
Epoch   9 Batch  493/1173   train_loss = 3.710
Epoch   9 Batch  518/1173   train_loss = 3.636
Epoch   9 Batch  543/1173   train_loss = 3.650
Epoch   9 Batch  568/1173   train_loss = 3.630
Epoch   9 Batch  593/1173   train_loss = 3.583
Epoch   9 Batch  618/1173   train_loss = 3.673
Epoch   9 Batch  643/1173   train_loss = 3.646
Epoch   9 Batch  668/1173   train_loss = 3.483
Epoch   9 Batch  693/1173   train_loss = 3.577
Epoch   9 Batch  718/1173   train_loss = 3.374
Epoch   9 Batch  743/1173   train_loss = 3.567
Epoch   9 Batch  768/1173   train_loss = 3.706
Epoch   9 Batch  793/1173   train_loss = 3.578
Epoch   9 Batch  818/1173   train_loss = 3.587
Epoch   9 Batch  843/1173   train_loss = 3.542
Epoch   9 Batch  868/1173   train_loss = 3.533
Epoch   9 Batch  893/1173   train_loss = 3.497
Epoch   9 Batch  918/1173   train_loss = 3.615
Epoch   9 Batch  943/1173   train_loss = 3.558
Epoch   9 Batch  968/1173   train_loss = 3.639
Epoch   9 Batch  993/1173   train_loss = 3.514
Epoch   9 Batch 1018/1173   train_loss = 3.510
Epoch   9 Batch 1043/1173   train_loss = 3.661
Epoch   9 Batch 1068/1173   train_loss = 3.634
Epoch   9 Batch 1093/1173   train_loss = 3.576
Epoch   9 Batch 1118/1173   train_loss = 3.547
Epoch   9 Batch 1143/1173   train_loss = 3.533
Epoch   9 Batch 1168/1173   train_loss = 3.484
Epoch  10 Batch   20/1173   train_loss = 3.482
Epoch  10 Batch   45/1173   train_loss = 3.452
Epoch  10 Batch   70/1173   train_loss = 3.565
Epoch  10 Batch   95/1173   train_loss = 3.673
Epoch  10 Batch  120/1173   train_loss = 3.646
Epoch  10 Batch  145/1173   train_loss = 3.638
Epoch  10 Batch  170/1173   train_loss = 3.535
Epoch  10 Batch  195/1173   train_loss = 3.558
Epoch  10 Batch  220/1173   train_loss = 3.505
Epoch  10 Batch  245/1173   train_loss = 3.541
Epoch  10 Batch  270/1173   train_loss = 3.595
Epoch  10 Batch  295/1173   train_loss = 3.514
Epoch  10 Batch  320/1173   train_loss = 3.566
Epoch  10 Batch  345/1173   train_loss = 3.591
Epoch  10 Batch  370/1173   train_loss = 3.557
Epoch  10 Batch  395/1173   train_loss = 3.527
Epoch  10 Batch  420/1173   train_loss = 3.636
Epoch  10 Batch  445/1173   train_loss = 3.473
Epoch  10 Batch  470/1173   train_loss = 3.676
Epoch  10 Batch  495/1173   train_loss = 3.601
Epoch  10 Batch  520/1173   train_loss = 3.590
Epoch  10 Batch  545/1173   train_loss = 3.616
Epoch  10 Batch  570/1173   train_loss = 3.568
Epoch  10 Batch  595/1173   train_loss = 3.778
Epoch  10 Batch  620/1173   train_loss = 3.705
Epoch  10 Batch  645/1173   train_loss = 3.638
Epoch  10 Batch  670/1173   train_loss = 3.544
Epoch  10 Batch  695/1173   train_loss = 3.621
Epoch  10 Batch  720/1173   train_loss = 3.493
Epoch  10 Batch  745/1173   train_loss = 3.622
Epoch  10 Batch  770/1173   train_loss = 3.643
Epoch  10 Batch  795/1173   train_loss = 3.539
Epoch  10 Batch  820/1173   train_loss = 3.573
Epoch  10 Batch  845/1173   train_loss = 3.679
Epoch  10 Batch  870/1173   train_loss = 3.583
Epoch  10 Batch  895/1173   train_loss = 3.661
Epoch  10 Batch  920/1173   train_loss = 3.625
Epoch  10 Batch  945/1173   train_loss = 3.573
Epoch  10 Batch  970/1173   train_loss = 3.433
Epoch  10 Batch  995/1173   train_loss = 3.543
Epoch  10 Batch 1020/1173   train_loss = 3.482
Epoch  10 Batch 1045/1173   train_loss = 3.449
Epoch  10 Batch 1070/1173   train_loss = 3.521
Epoch  10 Batch 1095/1173   train_loss = 3.568
Epoch  10 Batch 1120/1173   train_loss = 3.461
Epoch  10 Batch 1145/1173   train_loss = 3.681
Epoch  10 Batch 1170/1173   train_loss = 3.571
Epoch  11 Batch   22/1173   train_loss = 3.563
Epoch  11 Batch   47/1173   train_loss = 3.579
Epoch  11 Batch   72/1173   train_loss = 3.573
Epoch  11 Batch   97/1173   train_loss = 3.534
Epoch  11 Batch  122/1173   train_loss = 3.626
Epoch  11 Batch  147/1173   train_loss = 3.620
Epoch  11 Batch  172/1173   train_loss = 3.423
Epoch  11 Batch  197/1173   train_loss = 3.501
Epoch  11 Batch  222/1173   train_loss = 3.646
Epoch  11 Batch  247/1173   train_loss = 3.628
Epoch  11 Batch  272/1173   train_loss = 3.637
Epoch  11 Batch  297/1173   train_loss = 3.664
Epoch  11 Batch  322/1173   train_loss = 3.639
Epoch  11 Batch  347/1173   train_loss = 3.534
Epoch  11 Batch  372/1173   train_loss = 3.665
Epoch  11 Batch  397/1173   train_loss = 3.689
Epoch  11 Batch  422/1173   train_loss = 3.518
Epoch  11 Batch  447/1173   train_loss = 3.730
Epoch  11 Batch  472/1173   train_loss = 3.537
Epoch  11 Batch  497/1173   train_loss = 3.515
Epoch  11 Batch  522/1173   train_loss = 3.553
Epoch  11 Batch  547/1173   train_loss = 3.658
Epoch  11 Batch  572/1173   train_loss = 3.590
Epoch  11 Batch  597/1173   train_loss = 3.465
Epoch  11 Batch  622/1173   train_loss = 3.501
Epoch  11 Batch  647/1173   train_loss = 3.721
Epoch  11 Batch  672/1173   train_loss = 3.521
Epoch  11 Batch  697/1173   train_loss = 3.550
Epoch  11 Batch  722/1173   train_loss = 3.605
Epoch  11 Batch  747/1173   train_loss = 3.620
Epoch  11 Batch  772/1173   train_loss = 3.439
Epoch  11 Batch  797/1173   train_loss = 3.554
Epoch  11 Batch  822/1173   train_loss = 3.568
Epoch  11 Batch  847/1173   train_loss = 3.438
Epoch  11 Batch  872/1173   train_loss = 3.524
Epoch  11 Batch  897/1173   train_loss = 3.570
Epoch  11 Batch  922/1173   train_loss = 3.610
Epoch  11 Batch  947/1173   train_loss = 3.596
Epoch  11 Batch  972/1173   train_loss = 3.484
Epoch  11 Batch  997/1173   train_loss = 3.489
Epoch  11 Batch 1022/1173   train_loss = 3.553
Epoch  11 Batch 1047/1173   train_loss = 3.559
Epoch  11 Batch 1072/1173   train_loss = 3.384
Epoch  11 Batch 1097/1173   train_loss = 3.485
Epoch  11 Batch 1122/1173   train_loss = 3.595
Epoch  11 Batch 1147/1173   train_loss = 3.654
Epoch  11 Batch 1172/1173   train_loss = 3.728
Epoch  12 Batch   24/1173   train_loss = 3.526
Epoch  12 Batch   49/1173   train_loss = 3.452
Epoch  12 Batch   74/1173   train_loss = 3.699
Epoch  12 Batch   99/1173   train_loss = 3.440
Epoch  12 Batch  124/1173   train_loss = 3.420
Epoch  12 Batch  149/1173   train_loss = 3.749
Epoch  12 Batch  174/1173   train_loss = 3.538
Epoch  12 Batch  199/1173   train_loss = 3.492
Epoch  12 Batch  224/1173   train_loss = 3.601
Epoch  12 Batch  249/1173   train_loss = 3.621
Epoch  12 Batch  274/1173   train_loss = 3.488
Epoch  12 Batch  299/1173   train_loss = 3.589
Epoch  12 Batch  324/1173   train_loss = 3.553
Epoch  12 Batch  349/1173   train_loss = 3.533
Epoch  12 Batch  374/1173   train_loss = 3.595
Epoch  12 Batch  399/1173   train_loss = 3.627
Epoch  12 Batch  424/1173   train_loss = 3.619
Epoch  12 Batch  449/1173   train_loss = 3.555
Epoch  12 Batch  474/1173   train_loss = 3.603
Epoch  12 Batch  499/1173   train_loss = 3.422
Epoch  12 Batch  524/1173   train_loss = 3.547
Epoch  12 Batch  549/1173   train_loss = 3.443
Epoch  12 Batch  574/1173   train_loss = 3.643
Epoch  12 Batch  599/1173   train_loss = 3.377
Epoch  12 Batch  624/1173   train_loss = 3.549
Epoch  12 Batch  649/1173   train_loss = 3.554
Epoch  12 Batch  674/1173   train_loss = 3.573
Epoch  12 Batch  699/1173   train_loss = 3.563
Epoch  12 Batch  724/1173   train_loss = 3.411
Epoch  12 Batch  749/1173   train_loss = 3.579
Epoch  12 Batch  774/1173   train_loss = 3.494
Epoch  12 Batch  799/1173   train_loss = 3.392
Epoch  12 Batch  824/1173   train_loss = 3.433
Epoch  12 Batch  849/1173   train_loss = 3.604
Epoch  12 Batch  874/1173   train_loss = 3.554
Epoch  12 Batch  899/1173   train_loss = 3.401
Epoch  12 Batch  924/1173   train_loss = 3.398
Epoch  12 Batch  949/1173   train_loss = 3.526
Epoch  12 Batch  974/1173   train_loss = 3.333
Epoch  12 Batch  999/1173   train_loss = 3.311
Epoch  12 Batch 1024/1173   train_loss = 3.476
Epoch  12 Batch 1049/1173   train_loss = 3.619
Epoch  12 Batch 1074/1173   train_loss = 3.576
Epoch  12 Batch 1099/1173   train_loss = 3.464
Epoch  12 Batch 1124/1173   train_loss = 3.583
Epoch  12 Batch 1149/1173   train_loss = 3.611
Epoch  13 Batch    1/1173   train_loss = 3.471
Epoch  13 Batch   26/1173   train_loss = 3.490
Epoch  13 Batch   51/1173   train_loss = 3.622
Epoch  13 Batch   76/1173   train_loss = 3.525
Epoch  13 Batch  101/1173   train_loss = 3.641
Epoch  13 Batch  126/1173   train_loss = 3.589
Epoch  13 Batch  151/1173   train_loss = 3.633
Epoch  13 Batch  176/1173   train_loss = 3.500
Epoch  13 Batch  201/1173   train_loss = 3.526
Epoch  13 Batch  226/1173   train_loss = 3.518
Epoch  13 Batch  251/1173   train_loss = 3.569
Epoch  13 Batch  276/1173   train_loss = 3.542
Epoch  13 Batch  301/1173   train_loss = 3.449
Epoch  13 Batch  326/1173   train_loss = 3.677
Epoch  13 Batch  351/1173   train_loss = 3.533
Epoch  13 Batch  376/1173   train_loss = 3.414
Epoch  13 Batch  401/1173   train_loss = 3.614
Epoch  13 Batch  426/1173   train_loss = 3.582
Epoch  13 Batch  451/1173   train_loss = 3.555
Epoch  13 Batch  476/1173   train_loss = 3.488
Epoch  13 Batch  501/1173   train_loss = 3.472
Epoch  13 Batch  526/1173   train_loss = 3.574
Epoch  13 Batch  551/1173   train_loss = 3.532
Epoch  13 Batch  576/1173   train_loss = 3.485
Epoch  13 Batch  601/1173   train_loss = 3.628
Epoch  13 Batch  626/1173   train_loss = 3.519
Epoch  13 Batch  651/1173   train_loss = 3.585
Epoch  13 Batch  676/1173   train_loss = 3.459
Epoch  13 Batch  701/1173   train_loss = 3.664
Epoch  13 Batch  726/1173   train_loss = 3.540
Epoch  13 Batch  751/1173   train_loss = 3.471
Epoch  13 Batch  776/1173   train_loss = 3.430
Epoch  13 Batch  801/1173   train_loss = 3.533
Epoch  13 Batch  826/1173   train_loss = 3.478
Epoch  13 Batch  851/1173   train_loss = 3.507
Epoch  13 Batch  876/1173   train_loss = 3.528
Epoch  13 Batch  901/1173   train_loss = 3.593
Epoch  13 Batch  926/1173   train_loss = 3.504
Epoch  13 Batch  951/1173   train_loss = 3.593
Epoch  13 Batch  976/1173   train_loss = 3.608
Epoch  13 Batch 1001/1173   train_loss = 3.481
Epoch  13 Batch 1026/1173   train_loss = 3.407
Epoch  13 Batch 1051/1173   train_loss = 3.433
Epoch  13 Batch 1076/1173   train_loss = 3.559
Epoch  13 Batch 1101/1173   train_loss = 3.459
Epoch  13 Batch 1126/1173   train_loss = 3.408
Epoch  13 Batch 1151/1173   train_loss = 3.460
Epoch  14 Batch    3/1173   train_loss = 3.454
Epoch  14 Batch   28/1173   train_loss = 3.481
Epoch  14 Batch   53/1173   train_loss = 3.474
Epoch  14 Batch   78/1173   train_loss = 3.512
Epoch  14 Batch  103/1173   train_loss = 3.541
Epoch  14 Batch  128/1173   train_loss = 3.523
Epoch  14 Batch  153/1173   train_loss = 3.435
Epoch  14 Batch  178/1173   train_loss = 3.439
Epoch  14 Batch  203/1173   train_loss = 3.418
Epoch  14 Batch  228/1173   train_loss = 3.313
Epoch  14 Batch  253/1173   train_loss = 3.503
Epoch  14 Batch  278/1173   train_loss = 3.565
Epoch  14 Batch  303/1173   train_loss = 3.434
Epoch  14 Batch  328/1173   train_loss = 3.356
Epoch  14 Batch  353/1173   train_loss = 3.567
Epoch  14 Batch  378/1173   train_loss = 3.302
Epoch  14 Batch  403/1173   train_loss = 3.581
Epoch  14 Batch  428/1173   train_loss = 3.413
Epoch  14 Batch  453/1173   train_loss = 3.599
Epoch  14 Batch  478/1173   train_loss = 3.471
Epoch  14 Batch  503/1173   train_loss = 3.393
Epoch  14 Batch  528/1173   train_loss = 3.431
Epoch  14 Batch  553/1173   train_loss = 3.440
Epoch  14 Batch  578/1173   train_loss = 3.493
Epoch  14 Batch  603/1173   train_loss = 3.502
Epoch  14 Batch  628/1173   train_loss = 3.437
Epoch  14 Batch  653/1173   train_loss = 3.506
Epoch  14 Batch  678/1173   train_loss = 3.523
Epoch  14 Batch  703/1173   train_loss = 3.537
Epoch  14 Batch  728/1173   train_loss = 3.456
Epoch  14 Batch  753/1173   train_loss = 3.436
Epoch  14 Batch  778/1173   train_loss = 3.625
Epoch  14 Batch  803/1173   train_loss = 3.306
Epoch  14 Batch  828/1173   train_loss = 3.612
Epoch  14 Batch  853/1173   train_loss = 3.519
Epoch  14 Batch  878/1173   train_loss = 3.354
Epoch  14 Batch  903/1173   train_loss = 3.452
Epoch  14 Batch  928/1173   train_loss = 3.591
Epoch  14 Batch  953/1173   train_loss = 3.515
Epoch  14 Batch  978/1173   train_loss = 3.327
Epoch  14 Batch 1003/1173   train_loss = 3.304
Epoch  14 Batch 1028/1173   train_loss = 3.454
Epoch  14 Batch 1053/1173   train_loss = 3.469
Epoch  14 Batch 1078/1173   train_loss = 3.535
Epoch  14 Batch 1103/1173   train_loss = 3.561
Epoch  14 Batch 1128/1173   train_loss = 3.454
Epoch  14 Batch 1153/1173   train_loss = 3.498
Epoch  15 Batch    5/1173   train_loss = 3.266
Epoch  15 Batch   30/1173   train_loss = 3.493
Epoch  15 Batch   55/1173   train_loss = 3.439
Epoch  15 Batch   80/1173   train_loss = 3.426
Epoch  15 Batch  105/1173   train_loss = 3.435
Epoch  15 Batch  130/1173   train_loss = 3.413
Epoch  15 Batch  155/1173   train_loss = 3.541
Epoch  15 Batch  180/1173   train_loss = 3.545
Epoch  15 Batch  205/1173   train_loss = 3.395
Epoch  15 Batch  230/1173   train_loss = 3.469
Epoch  15 Batch  255/1173   train_loss = 3.496
Epoch  15 Batch  280/1173   train_loss = 3.522
Epoch  15 Batch  305/1173   train_loss = 3.530
Epoch  15 Batch  330/1173   train_loss = 3.405
Epoch  15 Batch  355/1173   train_loss = 3.513
Epoch  15 Batch  380/1173   train_loss = 3.329
Epoch  15 Batch  405/1173   train_loss = 3.526
Epoch  15 Batch  430/1173   train_loss = 3.319
Epoch  15 Batch  455/1173   train_loss = 3.456
Epoch  15 Batch  480/1173   train_loss = 3.396
Epoch  15 Batch  505/1173   train_loss = 3.520
Epoch  15 Batch  530/1173   train_loss = 3.404
Epoch  15 Batch  555/1173   train_loss = 3.476
Epoch  15 Batch  580/1173   train_loss = 3.466
Epoch  15 Batch  605/1173   train_loss = 3.570
Epoch  15 Batch  630/1173   train_loss = 3.448
Epoch  15 Batch  655/1173   train_loss = 3.380
Epoch  15 Batch  680/1173   train_loss = 3.408
Epoch  15 Batch  705/1173   train_loss = 3.468
Epoch  15 Batch  730/1173   train_loss = 3.505
Epoch  15 Batch  755/1173   train_loss = 3.595
Epoch  15 Batch  780/1173   train_loss = 3.398
Epoch  15 Batch  805/1173   train_loss = 3.514
Epoch  15 Batch  830/1173   train_loss = 3.542
Epoch  15 Batch  855/1173   train_loss = 3.466
Epoch  15 Batch  880/1173   train_loss = 3.420
Epoch  15 Batch  905/1173   train_loss = 3.432
Epoch  15 Batch  930/1173   train_loss = 3.322
Epoch  15 Batch  955/1173   train_loss = 3.463
Epoch  15 Batch  980/1173   train_loss = 3.452
Epoch  15 Batch 1005/1173   train_loss = 3.360
Epoch  15 Batch 1030/1173   train_loss = 3.418
Epoch  15 Batch 1055/1173   train_loss = 3.456
Epoch  15 Batch 1080/1173   train_loss = 3.418
Epoch  15 Batch 1105/1173   train_loss = 3.484
Epoch  15 Batch 1130/1173   train_loss = 3.429
Epoch  15 Batch 1155/1173   train_loss = 3.411
Epoch  16 Batch    7/1173   train_loss = 3.399
Epoch  16 Batch   32/1173   train_loss = 3.497
Epoch  16 Batch   57/1173   train_loss = 3.354
Epoch  16 Batch   82/1173   train_loss = 3.401
Epoch  16 Batch  107/1173   train_loss = 3.516
Epoch  16 Batch  132/1173   train_loss = 3.373
Epoch  16 Batch  157/1173   train_loss = 3.561
Epoch  16 Batch  182/1173   train_loss = 3.477
Epoch  16 Batch  207/1173   train_loss = 3.389
Epoch  16 Batch  232/1173   train_loss = 3.537
Epoch  16 Batch  257/1173   train_loss = 3.419
Epoch  16 Batch  282/1173   train_loss = 3.481
Epoch  16 Batch  307/1173   train_loss = 3.390
Epoch  16 Batch  332/1173   train_loss = 3.388
Epoch  16 Batch  357/1173   train_loss = 3.524
Epoch  16 Batch  382/1173   train_loss = 3.472
Epoch  16 Batch  407/1173   train_loss = 3.544
Epoch  16 Batch  432/1173   train_loss = 3.588
Epoch  16 Batch  457/1173   train_loss = 3.538
Epoch  16 Batch  482/1173   train_loss = 3.493
Epoch  16 Batch  507/1173   train_loss = 3.558
Epoch  16 Batch  532/1173   train_loss = 3.563
Epoch  16 Batch  557/1173   train_loss = 3.431
Epoch  16 Batch  582/1173   train_loss = 3.527
Epoch  16 Batch  607/1173   train_loss = 3.390
Epoch  16 Batch  632/1173   train_loss = 3.489
Epoch  16 Batch  657/1173   train_loss = 3.482
Epoch  16 Batch  682/1173   train_loss = 3.389
Epoch  16 Batch  707/1173   train_loss = 3.425
Epoch  16 Batch  732/1173   train_loss = 3.413
Epoch  16 Batch  757/1173   train_loss = 3.472
Epoch  16 Batch  782/1173   train_loss = 3.370
Epoch  16 Batch  807/1173   train_loss = 3.494
Epoch  16 Batch  832/1173   train_loss = 3.505
Epoch  16 Batch  857/1173   train_loss = 3.506
Epoch  16 Batch  882/1173   train_loss = 3.485
Epoch  16 Batch  907/1173   train_loss = 3.338
Epoch  16 Batch  932/1173   train_loss = 3.333
Epoch  16 Batch  957/1173   train_loss = 3.366
Epoch  16 Batch  982/1173   train_loss = 3.172
Epoch  16 Batch 1007/1173   train_loss = 3.419
Epoch  16 Batch 1032/1173   train_loss = 3.287
Epoch  16 Batch 1057/1173   train_loss = 3.326
Epoch  16 Batch 1082/1173   train_loss = 3.409
Epoch  16 Batch 1107/1173   train_loss = 3.469
Epoch  16 Batch 1132/1173   train_loss = 3.212
Epoch  16 Batch 1157/1173   train_loss = 3.455
Epoch  17 Batch    9/1173   train_loss = 3.436
Epoch  17 Batch   34/1173   train_loss = 3.374
Epoch  17 Batch   59/1173   train_loss = 3.545
Epoch  17 Batch   84/1173   train_loss = 3.372
Epoch  17 Batch  109/1173   train_loss = 3.528
Epoch  17 Batch  134/1173   train_loss = 3.585
Epoch  17 Batch  159/1173   train_loss = 3.551
Epoch  17 Batch  184/1173   train_loss = 3.499
Epoch  17 Batch  209/1173   train_loss = 3.455
Epoch  17 Batch  234/1173   train_loss = 3.550
Epoch  17 Batch  259/1173   train_loss = 3.371
Epoch  17 Batch  284/1173   train_loss = 3.507
Epoch  17 Batch  309/1173   train_loss = 3.496
Epoch  17 Batch  334/1173   train_loss = 3.421
Epoch  17 Batch  359/1173   train_loss = 3.536
Epoch  17 Batch  384/1173   train_loss = 3.389
Epoch  17 Batch  409/1173   train_loss = 3.345
Epoch  17 Batch  434/1173   train_loss = 3.493
Epoch  17 Batch  459/1173   train_loss = 3.434
Epoch  17 Batch  484/1173   train_loss = 3.499
Epoch  17 Batch  509/1173   train_loss = 3.326
Epoch  17 Batch  534/1173   train_loss = 3.489
Epoch  17 Batch  559/1173   train_loss = 3.432
Epoch  17 Batch  584/1173   train_loss = 3.400
Epoch  17 Batch  609/1173   train_loss = 3.467
Epoch  17 Batch  634/1173   train_loss = 3.434
Epoch  17 Batch  659/1173   train_loss = 3.379
Epoch  17 Batch  684/1173   train_loss = 3.428
Epoch  17 Batch  709/1173   train_loss = 3.414
Epoch  17 Batch  734/1173   train_loss = 3.424
Epoch  17 Batch  759/1173   train_loss = 3.359
Epoch  17 Batch  784/1173   train_loss = 3.415
Epoch  17 Batch  809/1173   train_loss = 3.507
Epoch  17 Batch  834/1173   train_loss = 3.332
Epoch  17 Batch  859/1173   train_loss = 3.506
Epoch  17 Batch  884/1173   train_loss = 3.457
Epoch  17 Batch  909/1173   train_loss = 3.375
Epoch  17 Batch  934/1173   train_loss = 3.411
Epoch  17 Batch  959/1173   train_loss = 3.255
Epoch  17 Batch  984/1173   train_loss = 3.431
Epoch  17 Batch 1009/1173   train_loss = 3.393
Epoch  17 Batch 1034/1173   train_loss = 3.530
Epoch  17 Batch 1059/1173   train_loss = 3.407
Epoch  17 Batch 1084/1173   train_loss = 3.466
Epoch  17 Batch 1109/1173   train_loss = 3.318
Epoch  17 Batch 1134/1173   train_loss = 3.379
Epoch  17 Batch 1159/1173   train_loss = 3.274
Epoch  18 Batch   11/1173   train_loss = 3.544
Epoch  18 Batch   36/1173   train_loss = 3.632
Epoch  18 Batch   61/1173   train_loss = 3.421
Epoch  18 Batch   86/1173   train_loss = 3.576
Epoch  18 Batch  111/1173   train_loss = 3.320
Epoch  18 Batch  136/1173   train_loss = 3.360
Epoch  18 Batch  161/1173   train_loss = 3.361
Epoch  18 Batch  186/1173   train_loss = 3.444
Epoch  18 Batch  211/1173   train_loss = 3.493
Epoch  18 Batch  236/1173   train_loss = 3.316
Epoch  18 Batch  261/1173   train_loss = 3.391
Epoch  18 Batch  286/1173   train_loss = 3.499
Epoch  18 Batch  311/1173   train_loss = 3.467
Epoch  18 Batch  336/1173   train_loss = 3.380
Epoch  18 Batch  361/1173   train_loss = 3.423
Epoch  18 Batch  386/1173   train_loss = 3.471
Epoch  18 Batch  411/1173   train_loss = 3.372
Epoch  18 Batch  436/1173   train_loss = 3.395
Epoch  18 Batch  461/1173   train_loss = 3.517
Epoch  18 Batch  486/1173   train_loss = 3.510
Epoch  18 Batch  511/1173   train_loss = 3.391
Epoch  18 Batch  536/1173   train_loss = 3.571
Epoch  18 Batch  561/1173   train_loss = 3.379
Epoch  18 Batch  586/1173   train_loss = 3.349
Epoch  18 Batch  611/1173   train_loss = 3.305
Epoch  18 Batch  636/1173   train_loss = 3.394
Epoch  18 Batch  661/1173   train_loss = 3.293
Epoch  18 Batch  686/1173   train_loss = 3.446
Epoch  18 Batch  711/1173   train_loss = 3.387
Epoch  18 Batch  736/1173   train_loss = 3.502
Epoch  18 Batch  761/1173   train_loss = 3.432
Epoch  18 Batch  786/1173   train_loss = 3.411
Epoch  18 Batch  811/1173   train_loss = 3.477
Epoch  18 Batch  836/1173   train_loss = 3.459
Epoch  18 Batch  861/1173   train_loss = 3.376
Epoch  18 Batch  886/1173   train_loss = 3.382
Epoch  18 Batch  911/1173   train_loss = 3.460
Epoch  18 Batch  936/1173   train_loss = 3.359
Epoch  18 Batch  961/1173   train_loss = 3.384
Epoch  18 Batch  986/1173   train_loss = 3.447
Epoch  18 Batch 1011/1173   train_loss = 3.366
Epoch  18 Batch 1036/1173   train_loss = 3.346
Epoch  18 Batch 1061/1173   train_loss = 3.378
Epoch  18 Batch 1086/1173   train_loss = 3.426
Epoch  18 Batch 1111/1173   train_loss = 3.395
Epoch  18 Batch 1136/1173   train_loss = 3.419
Epoch  18 Batch 1161/1173   train_loss = 3.337
Epoch  19 Batch   13/1173   train_loss = 3.369
Epoch  19 Batch   38/1173   train_loss = 3.491
Epoch  19 Batch   63/1173   train_loss = 3.432
Epoch  19 Batch   88/1173   train_loss = 3.391
Epoch  19 Batch  113/1173   train_loss = 3.388
Epoch  19 Batch  138/1173   train_loss = 3.489
Epoch  19 Batch  163/1173   train_loss = 3.465
Epoch  19 Batch  188/1173   train_loss = 3.462
Epoch  19 Batch  213/1173   train_loss = 3.432
Epoch  19 Batch  238/1173   train_loss = 3.387
Epoch  19 Batch  263/1173   train_loss = 3.460
Epoch  19 Batch  288/1173   train_loss = 3.388
Epoch  19 Batch  313/1173   train_loss = 3.403
Epoch  19 Batch  338/1173   train_loss = 3.443
Epoch  19 Batch  363/1173   train_loss = 3.335
Epoch  19 Batch  388/1173   train_loss = 3.369
Epoch  19 Batch  413/1173   train_loss = 3.272
Epoch  19 Batch  438/1173   train_loss = 3.437
Epoch  19 Batch  463/1173   train_loss = 3.402
Epoch  19 Batch  488/1173   train_loss = 3.495
Epoch  19 Batch  513/1173   train_loss = 3.391
Epoch  19 Batch  538/1173   train_loss = 3.536
Epoch  19 Batch  563/1173   train_loss = 3.429
Epoch  19 Batch  588/1173   train_loss = 3.416
Epoch  19 Batch  613/1173   train_loss = 3.443
Epoch  19 Batch  638/1173   train_loss = 3.273
Epoch  19 Batch  663/1173   train_loss = 3.467
Epoch  19 Batch  688/1173   train_loss = 3.379
Epoch  19 Batch  713/1173   train_loss = 3.387
Epoch  19 Batch  738/1173   train_loss = 3.504
Epoch  19 Batch  763/1173   train_loss = 3.330
Epoch  19 Batch  788/1173   train_loss = 3.356
Epoch  19 Batch  813/1173   train_loss = 3.461
Epoch  19 Batch  838/1173   train_loss = 3.431
Epoch  19 Batch  863/1173   train_loss = 3.494
Epoch  19 Batch  888/1173   train_loss = 3.359
Epoch  19 Batch  913/1173   train_loss = 3.358
Epoch  19 Batch  938/1173   train_loss = 3.421
Epoch  19 Batch  963/1173   train_loss = 3.417
Epoch  19 Batch  988/1173   train_loss = 3.359
Epoch  19 Batch 1013/1173   train_loss = 3.352
Epoch  19 Batch 1038/1173   train_loss = 3.538
Epoch  19 Batch 1063/1173   train_loss = 3.240
Epoch  19 Batch 1088/1173   train_loss = 3.433
Epoch  19 Batch 1113/1173   train_loss = 3.446
Epoch  19 Batch 1138/1173   train_loss = 3.433
Epoch  19 Batch 1163/1173   train_loss = 3.292
Epoch  20 Batch   15/1173   train_loss = 3.452
Epoch  20 Batch   40/1173   train_loss = 3.309
Epoch  20 Batch   65/1173   train_loss = 3.489
Epoch  20 Batch   90/1173   train_loss = 3.442
Epoch  20 Batch  115/1173   train_loss = 3.402
Epoch  20 Batch  140/1173   train_loss = 3.356
Epoch  20 Batch  165/1173   train_loss = 3.512
Epoch  20 Batch  190/1173   train_loss = 3.422
Epoch  20 Batch  215/1173   train_loss = 3.451
Epoch  20 Batch  240/1173   train_loss = 3.428
Epoch  20 Batch  265/1173   train_loss = 3.358
Epoch  20 Batch  290/1173   train_loss = 3.319
Epoch  20 Batch  315/1173   train_loss = 3.486
Epoch  20 Batch  340/1173   train_loss = 3.476
Epoch  20 Batch  365/1173   train_loss = 3.403
Epoch  20 Batch  390/1173   train_loss = 3.347
Epoch  20 Batch  415/1173   train_loss = 3.561
Epoch  20 Batch  440/1173   train_loss = 3.384
Epoch  20 Batch  465/1173   train_loss = 3.439
Epoch  20 Batch  490/1173   train_loss = 3.405
Epoch  20 Batch  515/1173   train_loss = 3.378
Epoch  20 Batch  540/1173   train_loss = 3.505
Epoch  20 Batch  565/1173   train_loss = 3.398
Epoch  20 Batch  590/1173   train_loss = 3.390
Epoch  20 Batch  615/1173   train_loss = 3.383
Epoch  20 Batch  640/1173   train_loss = 3.487
Epoch  20 Batch  665/1173   train_loss = 3.307
Epoch  20 Batch  690/1173   train_loss = 3.356
Epoch  20 Batch  715/1173   train_loss = 3.406
Epoch  20 Batch  740/1173   train_loss = 3.357
Epoch  20 Batch  765/1173   train_loss = 3.399
Epoch  20 Batch  790/1173   train_loss = 3.443
Epoch  20 Batch  815/1173   train_loss = 3.474
Epoch  20 Batch  840/1173   train_loss = 3.409
Epoch  20 Batch  865/1173   train_loss = 3.581
Epoch  20 Batch  890/1173   train_loss = 3.423
Epoch  20 Batch  915/1173   train_loss = 3.431
Epoch  20 Batch  940/1173   train_loss = 3.441
Epoch  20 Batch  965/1173   train_loss = 3.417
Epoch  20 Batch  990/1173   train_loss = 3.324
Epoch  20 Batch 1015/1173   train_loss = 3.378
Epoch  20 Batch 1040/1173   train_loss = 3.362
Epoch  20 Batch 1065/1173   train_loss = 3.307
Epoch  20 Batch 1090/1173   train_loss = 3.408
Epoch  20 Batch 1115/1173   train_loss = 3.261
Epoch  20 Batch 1140/1173   train_loss = 3.310
Epoch  20 Batch 1165/1173   train_loss = 3.407
Epoch  21 Batch   17/1173   train_loss = 3.347
Epoch  21 Batch   42/1173   train_loss = 3.338
Epoch  21 Batch   67/1173   train_loss = 3.439
Epoch  21 Batch   92/1173   train_loss = 3.425
Epoch  21 Batch  117/1173   train_loss = 3.429
Epoch  21 Batch  142/1173   train_loss = 3.434
Epoch  21 Batch  167/1173   train_loss = 3.431
Epoch  21 Batch  192/1173   train_loss = 3.279
Epoch  21 Batch  217/1173   train_loss = 3.261
Epoch  21 Batch  242/1173   train_loss = 3.447
Epoch  21 Batch  267/1173   train_loss = 3.444
Epoch  21 Batch  292/1173   train_loss = 3.249
Epoch  21 Batch  317/1173   train_loss = 3.512
Epoch  21 Batch  342/1173   train_loss = 3.400
Epoch  21 Batch  367/1173   train_loss = 3.368
Epoch  21 Batch  392/1173   train_loss = 3.289
Epoch  21 Batch  417/1173   train_loss = 3.488
Epoch  21 Batch  442/1173   train_loss = 3.457
Epoch  21 Batch  467/1173   train_loss = 3.388
Epoch  21 Batch  492/1173   train_loss = 3.409
Epoch  21 Batch  517/1173   train_loss = 3.321
Epoch  21 Batch  542/1173   train_loss = 3.539
Epoch  21 Batch  567/1173   train_loss = 3.316
Epoch  21 Batch  592/1173   train_loss = 3.352
Epoch  21 Batch  617/1173   train_loss = 3.329
Epoch  21 Batch  642/1173   train_loss = 3.449
Epoch  21 Batch  667/1173   train_loss = 3.335
Epoch  21 Batch  692/1173   train_loss = 3.222
Epoch  21 Batch  717/1173   train_loss = 3.483
Epoch  21 Batch  742/1173   train_loss = 3.410
Epoch  21 Batch  767/1173   train_loss = 3.346
Epoch  21 Batch  792/1173   train_loss = 3.290
Epoch  21 Batch  817/1173   train_loss = 3.549
Epoch  21 Batch  842/1173   train_loss = 3.360
Epoch  21 Batch  867/1173   train_loss = 3.384
Epoch  21 Batch  892/1173   train_loss = 3.482
Epoch  21 Batch  917/1173   train_loss = 3.408
Epoch  21 Batch  942/1173   train_loss = 3.365
Epoch  21 Batch  967/1173   train_loss = 3.327
Epoch  21 Batch  992/1173   train_loss = 3.266
Epoch  21 Batch 1017/1173   train_loss = 3.360
Epoch  21 Batch 1042/1173   train_loss = 3.301
Epoch  21 Batch 1067/1173   train_loss = 3.407
Epoch  21 Batch 1092/1173   train_loss = 3.424
Epoch  21 Batch 1117/1173   train_loss = 3.348
Epoch  21 Batch 1142/1173   train_loss = 3.371
Epoch  21 Batch 1167/1173   train_loss = 3.262
Epoch  22 Batch   19/1173   train_loss = 3.322
Epoch  22 Batch   44/1173   train_loss = 3.430
Epoch  22 Batch   69/1173   train_loss = 3.312
Epoch  22 Batch   94/1173   train_loss = 3.508
Epoch  22 Batch  119/1173   train_loss = 3.399
Epoch  22 Batch  144/1173   train_loss = 3.490
Epoch  22 Batch  169/1173   train_loss = 3.408
Epoch  22 Batch  194/1173   train_loss = 3.402
Epoch  22 Batch  219/1173   train_loss = 3.378
Epoch  22 Batch  244/1173   train_loss = 3.443
Epoch  22 Batch  269/1173   train_loss = 3.468
Epoch  22 Batch  294/1173   train_loss = 3.356
Epoch  22 Batch  319/1173   train_loss = 3.399
Epoch  22 Batch  344/1173   train_loss = 3.249
Epoch  22 Batch  369/1173   train_loss = 3.242
Epoch  22 Batch  394/1173   train_loss = 3.363
Epoch  22 Batch  419/1173   train_loss = 3.402
Epoch  22 Batch  444/1173   train_loss = 3.244
Epoch  22 Batch  469/1173   train_loss = 3.458
Epoch  22 Batch  494/1173   train_loss = 3.330
Epoch  22 Batch  519/1173   train_loss = 3.296
Epoch  22 Batch  544/1173   train_loss = 3.388
Epoch  22 Batch  569/1173   train_loss = 3.306
Epoch  22 Batch  594/1173   train_loss = 3.403
Epoch  22 Batch  619/1173   train_loss = 3.489
Epoch  22 Batch  644/1173   train_loss = 3.257
Epoch  22 Batch  669/1173   train_loss = 3.301
Epoch  22 Batch  694/1173   train_loss = 3.389
Epoch  22 Batch  719/1173   train_loss = 3.219
Epoch  22 Batch  744/1173   train_loss = 3.347
Epoch  22 Batch  769/1173   train_loss = 3.405
Epoch  22 Batch  794/1173   train_loss = 3.294
Epoch  22 Batch  819/1173   train_loss = 3.406
Epoch  22 Batch  844/1173   train_loss = 3.375
Epoch  22 Batch  869/1173   train_loss = 3.267
Epoch  22 Batch  894/1173   train_loss = 3.406
Epoch  22 Batch  919/1173   train_loss = 3.352
Epoch  22 Batch  944/1173   train_loss = 3.369
Epoch  22 Batch  969/1173   train_loss = 3.269
Epoch  22 Batch  994/1173   train_loss = 3.340
Epoch  22 Batch 1019/1173   train_loss = 3.259
Epoch  22 Batch 1044/1173   train_loss = 3.531
Epoch  22 Batch 1069/1173   train_loss = 3.278
Epoch  22 Batch 1094/1173   train_loss = 3.222
Epoch  22 Batch 1119/1173   train_loss = 3.226
Epoch  22 Batch 1144/1173   train_loss = 3.330
Epoch  22 Batch 1169/1173   train_loss = 3.280
Epoch  23 Batch   21/1173   train_loss = 3.239
Epoch  23 Batch   46/1173   train_loss = 3.346
Epoch  23 Batch   71/1173   train_loss = 3.484
Epoch  23 Batch   96/1173   train_loss = 3.253
Epoch  23 Batch  121/1173   train_loss = 3.370
Epoch  23 Batch  146/1173   train_loss = 3.477
Epoch  23 Batch  171/1173   train_loss = 3.385
Epoch  23 Batch  196/1173   train_loss = 3.470
Epoch  23 Batch  221/1173   train_loss = 3.370
Epoch  23 Batch  246/1173   train_loss = 3.502
Epoch  23 Batch  271/1173   train_loss = 3.413
Epoch  23 Batch  296/1173   train_loss = 3.357
Epoch  23 Batch  321/1173   train_loss = 3.384
Epoch  23 Batch  346/1173   train_loss = 3.439
Epoch  23 Batch  371/1173   train_loss = 3.367
Epoch  23 Batch  396/1173   train_loss = 3.394
Epoch  23 Batch  421/1173   train_loss = 3.234
Epoch  23 Batch  446/1173   train_loss = 3.342
Epoch  23 Batch  471/1173   train_loss = 3.333
Epoch  23 Batch  496/1173   train_loss = 3.452
Epoch  23 Batch  521/1173   train_loss = 3.286
Epoch  23 Batch  546/1173   train_loss = 3.436
Epoch  23 Batch  571/1173   train_loss = 3.435
Epoch  23 Batch  596/1173   train_loss = 3.401
Epoch  23 Batch  621/1173   train_loss = 3.382
Epoch  23 Batch  646/1173   train_loss = 3.454
Epoch  23 Batch  671/1173   train_loss = 3.462
Epoch  23 Batch  696/1173   train_loss = 3.424
Epoch  23 Batch  721/1173   train_loss = 3.324
Epoch  23 Batch  746/1173   train_loss = 3.303
Epoch  23 Batch  771/1173   train_loss = 3.299
Epoch  23 Batch  796/1173   train_loss = 3.212
Epoch  23 Batch  821/1173   train_loss = 3.324
Epoch  23 Batch  846/1173   train_loss = 3.315
Epoch  23 Batch  871/1173   train_loss = 3.350
Epoch  23 Batch  896/1173   train_loss = 3.403
Epoch  23 Batch  921/1173   train_loss = 3.396
Epoch  23 Batch  946/1173   train_loss = 3.437
Epoch  23 Batch  971/1173   train_loss = 3.250
Epoch  23 Batch  996/1173   train_loss = 3.379
Epoch  23 Batch 1021/1173   train_loss = 3.406
Epoch  23 Batch 1046/1173   train_loss = 3.321
Epoch  23 Batch 1071/1173   train_loss = 3.346
Epoch  23 Batch 1096/1173   train_loss = 3.382
Epoch  23 Batch 1121/1173   train_loss = 3.392
Epoch  23 Batch 1146/1173   train_loss = 3.341
Epoch  23 Batch 1171/1173   train_loss = 3.330
Epoch  24 Batch   23/1173   train_loss = 3.201
Epoch  24 Batch   48/1173   train_loss = 3.345
Epoch  24 Batch   73/1173   train_loss = 3.301
Epoch  24 Batch   98/1173   train_loss = 3.370
Epoch  24 Batch  123/1173   train_loss = 3.372
Epoch  24 Batch  148/1173   train_loss = 3.334
Epoch  24 Batch  173/1173   train_loss = 3.398
Epoch  24 Batch  198/1173   train_loss = 3.326
Epoch  24 Batch  223/1173   train_loss = 3.322
Epoch  24 Batch  248/1173   train_loss = 3.348
Epoch  24 Batch  273/1173   train_loss = 3.400
Epoch  24 Batch  298/1173   train_loss = 3.376
Epoch  24 Batch  323/1173   train_loss = 3.348
Epoch  24 Batch  348/1173   train_loss = 3.424
Epoch  24 Batch  373/1173   train_loss = 3.363
Epoch  24 Batch  398/1173   train_loss = 3.400
Epoch  24 Batch  423/1173   train_loss = 3.339
Epoch  24 Batch  448/1173   train_loss = 3.396
Epoch  24 Batch  473/1173   train_loss = 3.309
Epoch  24 Batch  498/1173   train_loss = 3.444
Epoch  24 Batch  523/1173   train_loss = 3.373
Epoch  24 Batch  548/1173   train_loss = 3.353
Epoch  24 Batch  573/1173   train_loss = 3.485
Epoch  24 Batch  598/1173   train_loss = 3.374
Epoch  24 Batch  623/1173   train_loss = 3.429
Epoch  24 Batch  648/1173   train_loss = 3.411
Epoch  24 Batch  673/1173   train_loss = 3.256
Epoch  24 Batch  698/1173   train_loss = 3.440
Epoch  24 Batch  723/1173   train_loss = 3.420
Epoch  24 Batch  748/1173   train_loss = 3.263
Epoch  24 Batch  773/1173   train_loss = 3.379
Epoch  24 Batch  798/1173   train_loss = 3.342
Epoch  24 Batch  823/1173   train_loss = 3.286
Epoch  24 Batch  848/1173   train_loss = 3.358
Epoch  24 Batch  873/1173   train_loss = 3.209
Epoch  24 Batch  898/1173   train_loss = 3.528
Epoch  24 Batch  923/1173   train_loss = 3.473
Epoch  24 Batch  948/1173   train_loss = 3.428
Epoch  24 Batch  973/1173   train_loss = 3.184
Epoch  24 Batch  998/1173   train_loss = 3.323
Epoch  24 Batch 1023/1173   train_loss = 3.373
Epoch  24 Batch 1048/1173   train_loss = 3.361
Epoch  24 Batch 1073/1173   train_loss = 3.414
Epoch  24 Batch 1098/1173   train_loss = 3.373
Epoch  24 Batch 1123/1173   train_loss = 3.254
Epoch  24 Batch 1148/1173   train_loss = 3.226
Epoch  25 Batch    0/1173   train_loss = 3.250
Epoch  25 Batch   25/1173   train_loss = 3.319
Epoch  25 Batch   50/1173   train_loss = 3.318
Epoch  25 Batch   75/1173   train_loss = 3.494
Epoch  25 Batch  100/1173   train_loss = 3.275
Epoch  25 Batch  125/1173   train_loss = 3.424
Epoch  25 Batch  150/1173   train_loss = 3.387
Epoch  25 Batch  175/1173   train_loss = 3.352
Epoch  25 Batch  200/1173   train_loss = 3.439
Epoch  25 Batch  225/1173   train_loss = 3.492
Epoch  25 Batch  250/1173   train_loss = 3.326
Epoch  25 Batch  275/1173   train_loss = 3.406
Epoch  25 Batch  300/1173   train_loss = 3.268
Epoch  25 Batch  325/1173   train_loss = 3.308
Epoch  25 Batch  350/1173   train_loss = 3.522
Epoch  25 Batch  375/1173   train_loss = 3.345
Epoch  25 Batch  400/1173   train_loss = 3.421
Epoch  25 Batch  425/1173   train_loss = 3.424
Epoch  25 Batch  450/1173   train_loss = 3.446
Epoch  25 Batch  475/1173   train_loss = 3.334
Epoch  25 Batch  500/1173   train_loss = 3.384
Epoch  25 Batch  525/1173   train_loss = 3.220
Epoch  25 Batch  550/1173   train_loss = 3.371
Epoch  25 Batch  575/1173   train_loss = 3.348
Epoch  25 Batch  600/1173   train_loss = 3.318
Epoch  25 Batch  625/1173   train_loss = 3.360
Epoch  25 Batch  650/1173   train_loss = 3.365
Epoch  25 Batch  675/1173   train_loss = 3.312
Epoch  25 Batch  700/1173   train_loss = 3.362
Epoch  25 Batch  725/1173   train_loss = 3.286
Epoch  25 Batch  750/1173   train_loss = 3.350
Epoch  25 Batch  775/1173   train_loss = 3.270
Epoch  25 Batch  800/1173   train_loss = 3.244
Epoch  25 Batch  825/1173   train_loss = 3.279
Epoch  25 Batch  850/1173   train_loss = 3.439
Epoch  25 Batch  875/1173   train_loss = 3.229
Epoch  25 Batch  900/1173   train_loss = 3.395
Epoch  25 Batch  925/1173   train_loss = 3.496
Epoch  25 Batch  950/1173   train_loss = 3.356
Epoch  25 Batch  975/1173   train_loss = 3.358
Epoch  25 Batch 1000/1173   train_loss = 3.335
Epoch  25 Batch 1025/1173   train_loss = 3.347
Epoch  25 Batch 1050/1173   train_loss = 3.292
Epoch  25 Batch 1075/1173   train_loss = 3.334
Epoch  25 Batch 1100/1173   train_loss = 3.240
Epoch  25 Batch 1125/1173   train_loss = 3.331
Epoch  25 Batch 1150/1173   train_loss = 3.311
Epoch  26 Batch    2/1173   train_loss = 3.314
Epoch  26 Batch   27/1173   train_loss = 3.255
Epoch  26 Batch   52/1173   train_loss = 3.453
Epoch  26 Batch   77/1173   train_loss = 3.220
Epoch  26 Batch  102/1173   train_loss = 3.454
Epoch  26 Batch  127/1173   train_loss = 3.405
Epoch  26 Batch  152/1173   train_loss = 3.503
Epoch  26 Batch  177/1173   train_loss = 3.297
Epoch  26 Batch  202/1173   train_loss = 3.337
Epoch  26 Batch  227/1173   train_loss = 3.340
Epoch  26 Batch  252/1173   train_loss = 3.293
Epoch  26 Batch  277/1173   train_loss = 3.385
Epoch  26 Batch  302/1173   train_loss = 3.306
Epoch  26 Batch  327/1173   train_loss = 3.345
Epoch  26 Batch  352/1173   train_loss = 3.325
Epoch  26 Batch  377/1173   train_loss = 3.381
Epoch  26 Batch  402/1173   train_loss = 3.374
Epoch  26 Batch  427/1173   train_loss = 3.251
Epoch  26 Batch  452/1173   train_loss = 3.382
Epoch  26 Batch  477/1173   train_loss = 3.328
Epoch  26 Batch  502/1173   train_loss = 3.359
Epoch  26 Batch  527/1173   train_loss = 3.243
Epoch  26 Batch  552/1173   train_loss = 3.310
Epoch  26 Batch  577/1173   train_loss = 3.219
Epoch  26 Batch  602/1173   train_loss = 3.311
Epoch  26 Batch  627/1173   train_loss = 3.327
Epoch  26 Batch  652/1173   train_loss = 3.119
Epoch  26 Batch  677/1173   train_loss = 3.441
Epoch  26 Batch  702/1173   train_loss = 3.330
Epoch  26 Batch  727/1173   train_loss = 3.356
Epoch  26 Batch  752/1173   train_loss = 3.245
Epoch  26 Batch  777/1173   train_loss = 3.281
Epoch  26 Batch  802/1173   train_loss = 3.368
Epoch  26 Batch  827/1173   train_loss = 3.281
Epoch  26 Batch  852/1173   train_loss = 3.390
Epoch  26 Batch  877/1173   train_loss = 3.261
Epoch  26 Batch  902/1173   train_loss = 3.333
Epoch  26 Batch  927/1173   train_loss = 3.345
Epoch  26 Batch  952/1173   train_loss = 3.279
Epoch  26 Batch  977/1173   train_loss = 3.338
Epoch  26 Batch 1002/1173   train_loss = 3.301
Epoch  26 Batch 1027/1173   train_loss = 3.340
Epoch  26 Batch 1052/1173   train_loss = 3.234
Epoch  26 Batch 1077/1173   train_loss = 3.270
Epoch  26 Batch 1102/1173   train_loss = 3.283
Epoch  26 Batch 1127/1173   train_loss = 3.312
Epoch  26 Batch 1152/1173   train_loss = 3.160
Epoch  27 Batch    4/1173   train_loss = 3.296
Epoch  27 Batch   29/1173   train_loss = 3.331
Epoch  27 Batch   54/1173   train_loss = 3.364
Epoch  27 Batch   79/1173   train_loss = 3.297
Epoch  27 Batch  104/1173   train_loss = 3.276
Epoch  27 Batch  129/1173   train_loss = 3.284
Epoch  27 Batch  154/1173   train_loss = 3.362
Epoch  27 Batch  179/1173   train_loss = 3.412
Epoch  27 Batch  204/1173   train_loss = 3.345
Epoch  27 Batch  229/1173   train_loss = 3.306
Epoch  27 Batch  254/1173   train_loss = 3.256
Epoch  27 Batch  279/1173   train_loss = 3.221
Epoch  27 Batch  304/1173   train_loss = 3.337
Epoch  27 Batch  329/1173   train_loss = 3.464
Epoch  27 Batch  354/1173   train_loss = 3.263
Epoch  27 Batch  379/1173   train_loss = 3.307
Epoch  27 Batch  404/1173   train_loss = 3.336
Epoch  27 Batch  429/1173   train_loss = 3.232
Epoch  27 Batch  454/1173   train_loss = 3.401
Epoch  27 Batch  479/1173   train_loss = 3.320
Epoch  27 Batch  504/1173   train_loss = 3.391
Epoch  27 Batch  529/1173   train_loss = 3.408
Epoch  27 Batch  554/1173   train_loss = 3.343
Epoch  27 Batch  579/1173   train_loss = 3.411
Epoch  27 Batch  604/1173   train_loss = 3.422
Epoch  27 Batch  629/1173   train_loss = 3.354
Epoch  27 Batch  654/1173   train_loss = 3.305
Epoch  27 Batch  679/1173   train_loss = 3.471
Epoch  27 Batch  704/1173   train_loss = 3.259
Epoch  27 Batch  729/1173   train_loss = 3.225
Epoch  27 Batch  754/1173   train_loss = 3.409
Epoch  27 Batch  779/1173   train_loss = 3.257
Epoch  27 Batch  804/1173   train_loss = 3.377
Epoch  27 Batch  829/1173   train_loss = 3.425
Epoch  27 Batch  854/1173   train_loss = 3.469
Epoch  27 Batch  879/1173   train_loss = 3.207
Epoch  27 Batch  904/1173   train_loss = 3.283
Epoch  27 Batch  929/1173   train_loss = 3.320
Epoch  27 Batch  954/1173   train_loss = 3.225
Epoch  27 Batch  979/1173   train_loss = 3.322
Epoch  27 Batch 1004/1173   train_loss = 3.273
Epoch  27 Batch 1029/1173   train_loss = 3.356
Epoch  27 Batch 1054/1173   train_loss = 3.242
Epoch  27 Batch 1079/1173   train_loss = 3.353
Epoch  27 Batch 1104/1173   train_loss = 3.338
Epoch  27 Batch 1129/1173   train_loss = 3.164
Epoch  27 Batch 1154/1173   train_loss = 3.239
Epoch  28 Batch    6/1173   train_loss = 3.389
Epoch  28 Batch   31/1173   train_loss = 3.208
Epoch  28 Batch   56/1173   train_loss = 3.275
Epoch  28 Batch   81/1173   train_loss = 3.471
Epoch  28 Batch  106/1173   train_loss = 3.285
Epoch  28 Batch  131/1173   train_loss = 3.337
Epoch  28 Batch  156/1173   train_loss = 3.327
Epoch  28 Batch  181/1173   train_loss = 3.406
Epoch  28 Batch  206/1173   train_loss = 3.309
Epoch  28 Batch  231/1173   train_loss = 3.510
Epoch  28 Batch  256/1173   train_loss = 3.241
Epoch  28 Batch  281/1173   train_loss = 3.208
Epoch  28 Batch  306/1173   train_loss = 3.405
Epoch  28 Batch  331/1173   train_loss = 3.374
Epoch  28 Batch  356/1173   train_loss = 3.259
Epoch  28 Batch  381/1173   train_loss = 3.238
Epoch  28 Batch  406/1173   train_loss = 3.405
Epoch  28 Batch  431/1173   train_loss = 3.313
Epoch  28 Batch  456/1173   train_loss = 3.437
Epoch  28 Batch  481/1173   train_loss = 3.291
Epoch  28 Batch  506/1173   train_loss = 3.440
Epoch  28 Batch  531/1173   train_loss = 3.361
Epoch  28 Batch  556/1173   train_loss = 3.160
Epoch  28 Batch  581/1173   train_loss = 3.388
Epoch  28 Batch  606/1173   train_loss = 3.436
Epoch  28 Batch  631/1173   train_loss = 3.361
Epoch  28 Batch  656/1173   train_loss = 3.346
Epoch  28 Batch  681/1173   train_loss = 3.404
Epoch  28 Batch  706/1173   train_loss = 3.397
Epoch  28 Batch  731/1173   train_loss = 3.320
Epoch  28 Batch  756/1173   train_loss = 3.345
Epoch  28 Batch  781/1173   train_loss = 3.263
Epoch  28 Batch  806/1173   train_loss = 3.270
Epoch  28 Batch  831/1173   train_loss = 3.350
Epoch  28 Batch  856/1173   train_loss = 3.366
Epoch  28 Batch  881/1173   train_loss = 3.383
Epoch  28 Batch  906/1173   train_loss = 3.232
Epoch  28 Batch  931/1173   train_loss = 3.301
Epoch  28 Batch  956/1173   train_loss = 3.268
Epoch  28 Batch  981/1173   train_loss = 3.269
Epoch  28 Batch 1006/1173   train_loss = 3.272
Epoch  28 Batch 1031/1173   train_loss = 3.301
Epoch  28 Batch 1056/1173   train_loss = 3.332
Epoch  28 Batch 1081/1173   train_loss = 3.310
Epoch  28 Batch 1106/1173   train_loss = 3.354
Epoch  28 Batch 1131/1173   train_loss = 3.192
Epoch  28 Batch 1156/1173   train_loss = 3.316
Epoch  29 Batch    8/1173   train_loss = 3.292
Epoch  29 Batch   33/1173   train_loss = 3.164
Epoch  29 Batch   58/1173   train_loss = 3.316
Epoch  29 Batch   83/1173   train_loss = 3.357
Epoch  29 Batch  108/1173   train_loss = 3.340
Epoch  29 Batch  133/1173   train_loss = 3.333
Epoch  29 Batch  158/1173   train_loss = 3.262
Epoch  29 Batch  183/1173   train_loss = 3.295
Epoch  29 Batch  208/1173   train_loss = 3.386
Epoch  29 Batch  233/1173   train_loss = 3.288
Epoch  29 Batch  258/1173   train_loss = 3.272
Epoch  29 Batch  283/1173   train_loss = 3.286
Epoch  29 Batch  308/1173   train_loss = 3.331
Epoch  29 Batch  333/1173   train_loss = 3.530
Epoch  29 Batch  358/1173   train_loss = 3.259
Epoch  29 Batch  383/1173   train_loss = 3.343
Epoch  29 Batch  408/1173   train_loss = 3.377
Epoch  29 Batch  433/1173   train_loss = 3.280
Epoch  29 Batch  458/1173   train_loss = 3.351
Epoch  29 Batch  483/1173   train_loss = 3.342
Epoch  29 Batch  508/1173   train_loss = 3.326
Epoch  29 Batch  533/1173   train_loss = 3.341
Epoch  29 Batch  558/1173   train_loss = 3.161
Epoch  29 Batch  583/1173   train_loss = 3.285
Epoch  29 Batch  608/1173   train_loss = 3.400
Epoch  29 Batch  633/1173   train_loss = 3.329
Epoch  29 Batch  658/1173   train_loss = 3.237
Epoch  29 Batch  683/1173   train_loss = 3.301
Epoch  29 Batch  708/1173   train_loss = 3.356
Epoch  29 Batch  733/1173   train_loss = 3.322
Epoch  29 Batch  758/1173   train_loss = 3.326
Epoch  29 Batch  783/1173   train_loss = 3.347
Epoch  29 Batch  808/1173   train_loss = 3.215
Epoch  29 Batch  833/1173   train_loss = 3.375
Epoch  29 Batch  858/1173   train_loss = 3.362
Epoch  29 Batch  883/1173   train_loss = 3.306
Epoch  29 Batch  908/1173   train_loss = 3.252
Epoch  29 Batch  933/1173   train_loss = 3.289
Epoch  29 Batch  958/1173   train_loss = 3.159
Epoch  29 Batch  983/1173   train_loss = 3.242
Epoch  29 Batch 1008/1173   train_loss = 3.194
Epoch  29 Batch 1033/1173   train_loss = 3.271
Epoch  29 Batch 1058/1173   train_loss = 3.419
Epoch  29 Batch 1083/1173   train_loss = 3.295
Epoch  29 Batch 1108/1173   train_loss = 3.193
Epoch  29 Batch 1133/1173   train_loss = 3.305
Epoch  29 Batch 1158/1173   train_loss = 3.139
Epoch  30 Batch   10/1173   train_loss = 3.380
Epoch  30 Batch   35/1173   train_loss = 3.332
Epoch  30 Batch   60/1173   train_loss = 3.462
Epoch  30 Batch   85/1173   train_loss = 3.361
Epoch  30 Batch  110/1173   train_loss = 3.338
Epoch  30 Batch  135/1173   train_loss = 3.313
Epoch  30 Batch  160/1173   train_loss = 3.358
Epoch  30 Batch  185/1173   train_loss = 3.362
Epoch  30 Batch  210/1173   train_loss = 3.270
Epoch  30 Batch  235/1173   train_loss = 3.408
Epoch  30 Batch  260/1173   train_loss = 3.257
Epoch  30 Batch  285/1173   train_loss = 3.254
Epoch  30 Batch  310/1173   train_loss = 3.188
Epoch  30 Batch  335/1173   train_loss = 3.342
Epoch  30 Batch  360/1173   train_loss = 3.171
Epoch  30 Batch  385/1173   train_loss = 3.233
Epoch  30 Batch  410/1173   train_loss = 3.375
Epoch  30 Batch  435/1173   train_loss = 3.198
Epoch  30 Batch  460/1173   train_loss = 3.289
Epoch  30 Batch  485/1173   train_loss = 3.267
Epoch  30 Batch  510/1173   train_loss = 3.387
Epoch  30 Batch  535/1173   train_loss = 3.325
Epoch  30 Batch  560/1173   train_loss = 3.399
Epoch  30 Batch  585/1173   train_loss = 3.269
Epoch  30 Batch  610/1173   train_loss = 3.387
Epoch  30 Batch  635/1173   train_loss = 3.261
Epoch  30 Batch  660/1173   train_loss = 3.189
Epoch  30 Batch  685/1173   train_loss = 3.298
Epoch  30 Batch  710/1173   train_loss = 3.271
Epoch  30 Batch  735/1173   train_loss = 3.248
Epoch  30 Batch  760/1173   train_loss = 3.329
Epoch  30 Batch  785/1173   train_loss = 3.171
Epoch  30 Batch  810/1173   train_loss = 3.368
Epoch  30 Batch  835/1173   train_loss = 3.185
Epoch  30 Batch  860/1173   train_loss = 3.256
Epoch  30 Batch  885/1173   train_loss = 3.310
Epoch  30 Batch  910/1173   train_loss = 3.323
Epoch  30 Batch  935/1173   train_loss = 3.259
Epoch  30 Batch  960/1173   train_loss = 3.333
Epoch  30 Batch  985/1173   train_loss = 3.306
Epoch  30 Batch 1010/1173   train_loss = 3.327
Epoch  30 Batch 1035/1173   train_loss = 3.257
Epoch  30 Batch 1060/1173   train_loss = 3.121
Epoch  30 Batch 1085/1173   train_loss = 3.252
Epoch  30 Batch 1110/1173   train_loss = 3.143
Epoch  30 Batch 1135/1173   train_loss = 3.259
Epoch  30 Batch 1160/1173   train_loss = 3.336
Epoch  31 Batch   12/1173   train_loss = 3.362
Epoch  31 Batch   37/1173   train_loss = 3.206
Epoch  31 Batch   62/1173   train_loss = 3.357
Epoch  31 Batch   87/1173   train_loss = 3.311
Epoch  31 Batch  112/1173   train_loss = 3.343
Epoch  31 Batch  137/1173   train_loss = 3.227
Epoch  31 Batch  162/1173   train_loss = 3.399
Epoch  31 Batch  187/1173   train_loss = 3.410
Epoch  31 Batch  212/1173   train_loss = 3.353
Epoch  31 Batch  237/1173   train_loss = 3.279
Epoch  31 Batch  262/1173   train_loss = 3.364
Epoch  31 Batch  287/1173   train_loss = 3.276
Epoch  31 Batch  312/1173   train_loss = 3.218
Epoch  31 Batch  337/1173   train_loss = 3.329
Epoch  31 Batch  362/1173   train_loss = 3.379
Epoch  31 Batch  387/1173   train_loss = 3.213
Epoch  31 Batch  412/1173   train_loss = 3.155
Epoch  31 Batch  437/1173   train_loss = 3.297
Epoch  31 Batch  462/1173   train_loss = 3.296
Epoch  31 Batch  487/1173   train_loss = 3.314
Epoch  31 Batch  512/1173   train_loss = 3.244
Epoch  31 Batch  537/1173   train_loss = 3.240
Epoch  31 Batch  562/1173   train_loss = 3.383
Epoch  31 Batch  587/1173   train_loss = 3.295
Epoch  31 Batch  612/1173   train_loss = 3.291
Epoch  31 Batch  637/1173   train_loss = 3.304
Epoch  31 Batch  662/1173   train_loss = 3.354
Epoch  31 Batch  687/1173   train_loss = 3.266
Epoch  31 Batch  712/1173   train_loss = 3.191
Epoch  31 Batch  737/1173   train_loss = 3.443
Epoch  31 Batch  762/1173   train_loss = 3.452
Epoch  31 Batch  787/1173   train_loss = 3.353
Epoch  31 Batch  812/1173   train_loss = 3.275
Epoch  31 Batch  837/1173   train_loss = 3.307
Epoch  31 Batch  862/1173   train_loss = 3.258
Epoch  31 Batch  887/1173   train_loss = 3.291
Epoch  31 Batch  912/1173   train_loss = 3.327
Epoch  31 Batch  937/1173   train_loss = 3.329
Epoch  31 Batch  962/1173   train_loss = 3.275
Epoch  31 Batch  987/1173   train_loss = 3.202
Epoch  31 Batch 1012/1173   train_loss = 3.243
Epoch  31 Batch 1037/1173   train_loss = 3.266
Epoch  31 Batch 1062/1173   train_loss = 3.268
Epoch  31 Batch 1087/1173   train_loss = 3.275
Epoch  31 Batch 1112/1173   train_loss = 3.197
Epoch  31 Batch 1137/1173   train_loss = 3.205
Epoch  31 Batch 1162/1173   train_loss = 3.292
Epoch  32 Batch   14/1173   train_loss = 3.308
Epoch  32 Batch   39/1173   train_loss = 3.321
Epoch  32 Batch   64/1173   train_loss = 3.312
Epoch  32 Batch   89/1173   train_loss = 3.346
Epoch  32 Batch  114/1173   train_loss = 3.319
Epoch  32 Batch  139/1173   train_loss = 3.333
Epoch  32 Batch  164/1173   train_loss = 3.359
Epoch  32 Batch  189/1173   train_loss = 3.423
Epoch  32 Batch  214/1173   train_loss = 3.229
Epoch  32 Batch  239/1173   train_loss = 3.344
Epoch  32 Batch  264/1173   train_loss = 3.158
Epoch  32 Batch  289/1173   train_loss = 3.274
Epoch  32 Batch  314/1173   train_loss = 3.445
Epoch  32 Batch  339/1173   train_loss = 3.319
Epoch  32 Batch  364/1173   train_loss = 3.319
Epoch  32 Batch  389/1173   train_loss = 3.202
Epoch  32 Batch  414/1173   train_loss = 3.339
Epoch  32 Batch  439/1173   train_loss = 3.197
Epoch  32 Batch  464/1173   train_loss = 3.322
Epoch  32 Batch  489/1173   train_loss = 3.373
Epoch  32 Batch  514/1173   train_loss = 3.434
Epoch  32 Batch  539/1173   train_loss = 3.356
Epoch  32 Batch  564/1173   train_loss = 3.204
Epoch  32 Batch  589/1173   train_loss = 3.227
Epoch  32 Batch  614/1173   train_loss = 3.312
Epoch  32 Batch  639/1173   train_loss = 3.333
Epoch  32 Batch  664/1173   train_loss = 3.253
Epoch  32 Batch  689/1173   train_loss = 3.216
Epoch  32 Batch  714/1173   train_loss = 3.211
Epoch  32 Batch  739/1173   train_loss = 3.202
Epoch  32 Batch  764/1173   train_loss = 3.250
Epoch  32 Batch  789/1173   train_loss = 3.227
Epoch  32 Batch  814/1173   train_loss = 3.180
Epoch  32 Batch  839/1173   train_loss = 3.213
Epoch  32 Batch  864/1173   train_loss = 3.224
Epoch  32 Batch  889/1173   train_loss = 3.336
Epoch  32 Batch  914/1173   train_loss = 3.310
Epoch  32 Batch  939/1173   train_loss = 3.287
Epoch  32 Batch  964/1173   train_loss = 3.264
Epoch  32 Batch  989/1173   train_loss = 3.243
Epoch  32 Batch 1014/1173   train_loss = 3.210
Epoch  32 Batch 1039/1173   train_loss = 3.238
Epoch  32 Batch 1064/1173   train_loss = 3.071
Epoch  32 Batch 1089/1173   train_loss = 3.409
Epoch  32 Batch 1114/1173   train_loss = 3.126
Epoch  32 Batch 1139/1173   train_loss = 3.230
Epoch  32 Batch 1164/1173   train_loss = 3.164
Epoch  33 Batch   16/1173   train_loss = 3.350
Epoch  33 Batch   41/1173   train_loss = 3.104
Epoch  33 Batch   66/1173   train_loss = 3.260
Epoch  33 Batch   91/1173   train_loss = 3.352
Epoch  33 Batch  116/1173   train_loss = 3.352
Epoch  33 Batch  141/1173   train_loss = 3.192
Epoch  33 Batch  166/1173   train_loss = 3.259
Epoch  33 Batch  191/1173   train_loss = 3.213
Epoch  33 Batch  216/1173   train_loss = 3.253
Epoch  33 Batch  241/1173   train_loss = 3.337
Epoch  33 Batch  266/1173   train_loss = 3.196
Epoch  33 Batch  291/1173   train_loss = 3.314
Epoch  33 Batch  316/1173   train_loss = 3.340
Epoch  33 Batch  341/1173   train_loss = 3.364
Epoch  33 Batch  366/1173   train_loss = 3.263
Epoch  33 Batch  391/1173   train_loss = 3.299
Epoch  33 Batch  416/1173   train_loss = 3.289
Epoch  33 Batch  441/1173   train_loss = 3.244
Epoch  33 Batch  466/1173   train_loss = 3.165
Epoch  33 Batch  491/1173   train_loss = 3.220
Epoch  33 Batch  516/1173   train_loss = 3.310
Epoch  33 Batch  541/1173   train_loss = 3.273
Epoch  33 Batch  566/1173   train_loss = 3.448
Epoch  33 Batch  591/1173   train_loss = 3.252
Epoch  33 Batch  616/1173   train_loss = 3.293
Epoch  33 Batch  641/1173   train_loss = 3.218
Epoch  33 Batch  666/1173   train_loss = 3.282
Epoch  33 Batch  691/1173   train_loss = 3.190
Epoch  33 Batch  716/1173   train_loss = 3.379
Epoch  33 Batch  741/1173   train_loss = 3.429
Epoch  33 Batch  766/1173   train_loss = 3.340
Epoch  33 Batch  791/1173   train_loss = 3.233
Epoch  33 Batch  816/1173   train_loss = 3.324
Epoch  33 Batch  841/1173   train_loss = 3.396
Epoch  33 Batch  866/1173   train_loss = 3.260
Epoch  33 Batch  891/1173   train_loss = 3.283
Epoch  33 Batch  916/1173   train_loss = 3.323
Epoch  33 Batch  941/1173   train_loss = 3.179
Epoch  33 Batch  966/1173   train_loss = 3.355
Epoch  33 Batch  991/1173   train_loss = 3.365
Epoch  33 Batch 1016/1173   train_loss = 3.180
Epoch  33 Batch 1041/1173   train_loss = 3.206
Epoch  33 Batch 1066/1173   train_loss = 3.272
Epoch  33 Batch 1091/1173   train_loss = 3.239
Epoch  33 Batch 1116/1173   train_loss = 3.349
Epoch  33 Batch 1141/1173   train_loss = 3.254
Epoch  33 Batch 1166/1173   train_loss = 3.258
Epoch  34 Batch   18/1173   train_loss = 3.350
Epoch  34 Batch   43/1173   train_loss = 3.204
Epoch  34 Batch   68/1173   train_loss = 3.296
Epoch  34 Batch   93/1173   train_loss = 3.283
Epoch  34 Batch  118/1173   train_loss = 3.416
Epoch  34 Batch  143/1173   train_loss = 3.263
Epoch  34 Batch  168/1173   train_loss = 3.160
Epoch  34 Batch  193/1173   train_loss = 3.222
Epoch  34 Batch  218/1173   train_loss = 3.129
Epoch  34 Batch  243/1173   train_loss = 3.339
Epoch  34 Batch  268/1173   train_loss = 3.225
Epoch  34 Batch  293/1173   train_loss = 3.301
Epoch  34 Batch  318/1173   train_loss = 3.224
Epoch  34 Batch  343/1173   train_loss = 3.311
Epoch  34 Batch  368/1173   train_loss = 3.299
Epoch  34 Batch  393/1173   train_loss = 3.294
Epoch  34 Batch  418/1173   train_loss = 3.307
Epoch  34 Batch  443/1173   train_loss = 3.379
Epoch  34 Batch  468/1173   train_loss = 3.271
Epoch  34 Batch  493/1173   train_loss = 3.400
Epoch  34 Batch  518/1173   train_loss = 3.237
Epoch  34 Batch  543/1173   train_loss = 3.316
Epoch  34 Batch  568/1173   train_loss = 3.318
Epoch  34 Batch  593/1173   train_loss = 3.241
Epoch  34 Batch  618/1173   train_loss = 3.348
Epoch  34 Batch  643/1173   train_loss = 3.357
Epoch  34 Batch  668/1173   train_loss = 3.155
Epoch  34 Batch  693/1173   train_loss = 3.262
Epoch  34 Batch  718/1173   train_loss = 3.126
Epoch  34 Batch  743/1173   train_loss = 3.258
Epoch  34 Batch  768/1173   train_loss = 3.306
Epoch  34 Batch  793/1173   train_loss = 3.215
Epoch  34 Batch  818/1173   train_loss = 3.352
Epoch  34 Batch  843/1173   train_loss = 3.194
Epoch  34 Batch  868/1173   train_loss = 3.237
Epoch  34 Batch  893/1173   train_loss = 3.261
Epoch  34 Batch  918/1173   train_loss = 3.284
Epoch  34 Batch  943/1173   train_loss = 3.243
Epoch  34 Batch  968/1173   train_loss = 3.327
Epoch  34 Batch  993/1173   train_loss = 3.228
Epoch  34 Batch 1018/1173   train_loss = 3.200
Epoch  34 Batch 1043/1173   train_loss = 3.281
Epoch  34 Batch 1068/1173   train_loss = 3.344
Epoch  34 Batch 1093/1173   train_loss = 3.304
Epoch  34 Batch 1118/1173   train_loss = 3.250
Epoch  34 Batch 1143/1173   train_loss = 3.217
Epoch  34 Batch 1168/1173   train_loss = 3.189
Epoch  35 Batch   20/1173   train_loss = 3.201
Epoch  35 Batch   45/1173   train_loss = 3.182
Epoch  35 Batch   70/1173   train_loss = 3.250
Epoch  35 Batch   95/1173   train_loss = 3.314
Epoch  35 Batch  120/1173   train_loss = 3.296
Epoch  35 Batch  145/1173   train_loss = 3.303
Epoch  35 Batch  170/1173   train_loss = 3.317
Epoch  35 Batch  195/1173   train_loss = 3.282
Epoch  35 Batch  220/1173   train_loss = 3.285
Epoch  35 Batch  245/1173   train_loss = 3.271
Epoch  35 Batch  270/1173   train_loss = 3.269
Epoch  35 Batch  295/1173   train_loss = 3.202
Epoch  35 Batch  320/1173   train_loss = 3.244
Epoch  35 Batch  345/1173   train_loss = 3.243
Epoch  35 Batch  370/1173   train_loss = 3.240
Epoch  35 Batch  395/1173   train_loss = 3.228
Epoch  35 Batch  420/1173   train_loss = 3.348
Epoch  35 Batch  445/1173   train_loss = 3.127
Epoch  35 Batch  470/1173   train_loss = 3.299
Epoch  35 Batch  495/1173   train_loss = 3.322
Epoch  35 Batch  520/1173   train_loss = 3.267
Epoch  35 Batch  545/1173   train_loss = 3.287
Epoch  35 Batch  570/1173   train_loss = 3.303
Epoch  35 Batch  595/1173   train_loss = 3.438
Epoch  35 Batch  620/1173   train_loss = 3.366
Epoch  35 Batch  645/1173   train_loss = 3.328
Epoch  35 Batch  670/1173   train_loss = 3.263
Epoch  35 Batch  695/1173   train_loss = 3.380
Epoch  35 Batch  720/1173   train_loss = 3.209
Epoch  35 Batch  745/1173   train_loss = 3.352
Epoch  35 Batch  770/1173   train_loss = 3.326
Epoch  35 Batch  795/1173   train_loss = 3.241
Epoch  35 Batch  820/1173   train_loss = 3.259
Epoch  35 Batch  845/1173   train_loss = 3.344
Epoch  35 Batch  870/1173   train_loss = 3.296
Epoch  35 Batch  895/1173   train_loss = 3.337
Epoch  35 Batch  920/1173   train_loss = 3.290
Epoch  35 Batch  945/1173   train_loss = 3.306
Epoch  35 Batch  970/1173   train_loss = 3.204
Epoch  35 Batch  995/1173   train_loss = 3.254
Epoch  35 Batch 1020/1173   train_loss = 3.150
Epoch  35 Batch 1045/1173   train_loss = 3.229
Epoch  35 Batch 1070/1173   train_loss = 3.246
Epoch  35 Batch 1095/1173   train_loss = 3.279
Epoch  35 Batch 1120/1173   train_loss = 3.230
Epoch  35 Batch 1145/1173   train_loss = 3.322
Epoch  35 Batch 1170/1173   train_loss = 3.257
Epoch  36 Batch   22/1173   train_loss = 3.265
Epoch  36 Batch   47/1173   train_loss = 3.253
Epoch  36 Batch   72/1173   train_loss = 3.289
Epoch  36 Batch   97/1173   train_loss = 3.167
Epoch  36 Batch  122/1173   train_loss = 3.252
Epoch  36 Batch  147/1173   train_loss = 3.232
Epoch  36 Batch  172/1173   train_loss = 3.116
Epoch  36 Batch  197/1173   train_loss = 3.263
Epoch  36 Batch  222/1173   train_loss = 3.332
Epoch  36 Batch  247/1173   train_loss = 3.333
Epoch  36 Batch  272/1173   train_loss = 3.334
Epoch  36 Batch  297/1173   train_loss = 3.370
Epoch  36 Batch  322/1173   train_loss = 3.362
Epoch  36 Batch  347/1173   train_loss = 3.256
Epoch  36 Batch  372/1173   train_loss = 3.433
Epoch  36 Batch  397/1173   train_loss = 3.362
Epoch  36 Batch  422/1173   train_loss = 3.249
Epoch  36 Batch  447/1173   train_loss = 3.359
Epoch  36 Batch  472/1173   train_loss = 3.196
Epoch  36 Batch  497/1173   train_loss = 3.238
Epoch  36 Batch  522/1173   train_loss = 3.243
Epoch  36 Batch  547/1173   train_loss = 3.373
Epoch  36 Batch  572/1173   train_loss = 3.268
Epoch  36 Batch  597/1173   train_loss = 3.220
Epoch  36 Batch  622/1173   train_loss = 3.217
Epoch  36 Batch  647/1173   train_loss = 3.461
Epoch  36 Batch  672/1173   train_loss = 3.218
Epoch  36 Batch  697/1173   train_loss = 3.336
Epoch  36 Batch  722/1173   train_loss = 3.317
Epoch  36 Batch  747/1173   train_loss = 3.382
Epoch  36 Batch  772/1173   train_loss = 3.166
Epoch  36 Batch  797/1173   train_loss = 3.195
Epoch  36 Batch  822/1173   train_loss = 3.293
Epoch  36 Batch  847/1173   train_loss = 3.134
Epoch  36 Batch  872/1173   train_loss = 3.261
Epoch  36 Batch  897/1173   train_loss = 3.312
Epoch  36 Batch  922/1173   train_loss = 3.310
Epoch  36 Batch  947/1173   train_loss = 3.282
Epoch  36 Batch  972/1173   train_loss = 3.229
Epoch  36 Batch  997/1173   train_loss = 3.241
Epoch  36 Batch 1022/1173   train_loss = 3.247
Epoch  36 Batch 1047/1173   train_loss = 3.212
Epoch  36 Batch 1072/1173   train_loss = 3.114
Epoch  36 Batch 1097/1173   train_loss = 3.308
Epoch  36 Batch 1122/1173   train_loss = 3.353
Epoch  36 Batch 1147/1173   train_loss = 3.356
Epoch  36 Batch 1172/1173   train_loss = 3.400
Epoch  37 Batch   24/1173   train_loss = 3.281
Epoch  37 Batch   49/1173   train_loss = 3.167
Epoch  37 Batch   74/1173   train_loss = 3.391
Epoch  37 Batch   99/1173   train_loss = 3.156
Epoch  37 Batch  124/1173   train_loss = 3.109
Epoch  37 Batch  149/1173   train_loss = 3.436
Epoch  37 Batch  174/1173   train_loss = 3.228
Epoch  37 Batch  199/1173   train_loss = 3.268
Epoch  37 Batch  224/1173   train_loss = 3.319
Epoch  37 Batch  249/1173   train_loss = 3.377
Epoch  37 Batch  274/1173   train_loss = 3.272
Epoch  37 Batch  299/1173   train_loss = 3.288
Epoch  37 Batch  324/1173   train_loss = 3.365
Epoch  37 Batch  349/1173   train_loss = 3.238
Epoch  37 Batch  374/1173   train_loss = 3.389
Epoch  37 Batch  399/1173   train_loss = 3.365
Epoch  37 Batch  424/1173   train_loss = 3.367
Epoch  37 Batch  449/1173   train_loss = 3.262
Epoch  37 Batch  474/1173   train_loss = 3.250
Epoch  37 Batch  499/1173   train_loss = 3.208
Epoch  37 Batch  524/1173   train_loss = 3.270
Epoch  37 Batch  549/1173   train_loss = 3.182
Epoch  37 Batch  574/1173   train_loss = 3.377
Epoch  37 Batch  599/1173   train_loss = 3.145
Epoch  37 Batch  624/1173   train_loss = 3.234
Epoch  37 Batch  649/1173   train_loss = 3.296
Epoch  37 Batch  674/1173   train_loss = 3.301
Epoch  37 Batch  699/1173   train_loss = 3.336
Epoch  37 Batch  724/1173   train_loss = 3.198
Epoch  37 Batch  749/1173   train_loss = 3.334
Epoch  37 Batch  774/1173   train_loss = 3.246
Epoch  37 Batch  799/1173   train_loss = 3.153
Epoch  37 Batch  824/1173   train_loss = 3.169
Epoch  37 Batch  849/1173   train_loss = 3.379
Epoch  37 Batch  874/1173   train_loss = 3.274
Epoch  37 Batch  899/1173   train_loss = 3.169
Epoch  37 Batch  924/1173   train_loss = 3.215
Epoch  37 Batch  949/1173   train_loss = 3.284
Epoch  37 Batch  974/1173   train_loss = 3.154
Epoch  37 Batch  999/1173   train_loss = 3.137
Epoch  37 Batch 1024/1173   train_loss = 3.169
Epoch  37 Batch 1049/1173   train_loss = 3.332
Epoch  37 Batch 1074/1173   train_loss = 3.359
Epoch  37 Batch 1099/1173   train_loss = 3.249
Epoch  37 Batch 1124/1173   train_loss = 3.285
Epoch  37 Batch 1149/1173   train_loss = 3.293
Epoch  38 Batch    1/1173   train_loss = 3.219
Epoch  38 Batch   26/1173   train_loss = 3.199
Epoch  38 Batch   51/1173   train_loss = 3.388
Epoch  38 Batch   76/1173   train_loss = 3.261
Epoch  38 Batch  101/1173   train_loss = 3.296
Epoch  38 Batch  126/1173   train_loss = 3.348
Epoch  38 Batch  151/1173   train_loss = 3.335
Epoch  38 Batch  176/1173   train_loss = 3.233
Epoch  38 Batch  201/1173   train_loss = 3.270
Epoch  38 Batch  226/1173   train_loss = 3.310
Epoch  38 Batch  251/1173   train_loss = 3.321
Epoch  38 Batch  276/1173   train_loss = 3.290
Epoch  38 Batch  301/1173   train_loss = 3.160
Epoch  38 Batch  326/1173   train_loss = 3.436
Epoch  38 Batch  351/1173   train_loss = 3.229
Epoch  38 Batch  376/1173   train_loss = 3.235
Epoch  38 Batch  401/1173   train_loss = 3.329
Epoch  38 Batch  426/1173   train_loss = 3.385
Epoch  38 Batch  451/1173   train_loss = 3.293
Epoch  38 Batch  476/1173   train_loss = 3.204
Epoch  38 Batch  501/1173   train_loss = 3.260
Epoch  38 Batch  526/1173   train_loss = 3.270
Epoch  38 Batch  551/1173   train_loss = 3.228
Epoch  38 Batch  576/1173   train_loss = 3.247
Epoch  38 Batch  601/1173   train_loss = 3.352
Epoch  38 Batch  626/1173   train_loss = 3.358
Epoch  38 Batch  651/1173   train_loss = 3.351
Epoch  38 Batch  676/1173   train_loss = 3.258
Epoch  38 Batch  701/1173   train_loss = 3.428
Epoch  38 Batch  726/1173   train_loss = 3.314
Epoch  38 Batch  751/1173   train_loss = 3.268
Epoch  38 Batch  776/1173   train_loss = 3.212
Epoch  38 Batch  801/1173   train_loss = 3.248
Epoch  38 Batch  826/1173   train_loss = 3.231
Epoch  38 Batch  851/1173   train_loss = 3.283
Epoch  38 Batch  876/1173   train_loss = 3.334
Epoch  38 Batch  901/1173   train_loss = 3.298
Epoch  38 Batch  926/1173   train_loss = 3.271
Epoch  38 Batch  951/1173   train_loss = 3.300
Epoch  38 Batch  976/1173   train_loss = 3.430
Epoch  38 Batch 1001/1173   train_loss = 3.243
Epoch  38 Batch 1026/1173   train_loss = 3.139
Epoch  38 Batch 1051/1173   train_loss = 3.250
Epoch  38 Batch 1076/1173   train_loss = 3.330
Epoch  38 Batch 1101/1173   train_loss = 3.276
Epoch  38 Batch 1126/1173   train_loss = 3.191
Epoch  38 Batch 1151/1173   train_loss = 3.213
Epoch  39 Batch    3/1173   train_loss = 3.159
Epoch  39 Batch   28/1173   train_loss = 3.296
Epoch  39 Batch   53/1173   train_loss = 3.273
Epoch  39 Batch   78/1173   train_loss = 3.258
Epoch  39 Batch  103/1173   train_loss = 3.299
Epoch  39 Batch  128/1173   train_loss = 3.295
Epoch  39 Batch  153/1173   train_loss = 3.157
Epoch  39 Batch  178/1173   train_loss = 3.239
Epoch  39 Batch  203/1173   train_loss = 3.191
Epoch  39 Batch  228/1173   train_loss = 3.137
Epoch  39 Batch  253/1173   train_loss = 3.301
Epoch  39 Batch  278/1173   train_loss = 3.307
Epoch  39 Batch  303/1173   train_loss = 3.189
Epoch  39 Batch  328/1173   train_loss = 3.170
Epoch  39 Batch  353/1173   train_loss = 3.304
Epoch  39 Batch  378/1173   train_loss = 3.122
Epoch  39 Batch  403/1173   train_loss = 3.352
Epoch  39 Batch  428/1173   train_loss = 3.183
Epoch  39 Batch  453/1173   train_loss = 3.354
Epoch  39 Batch  478/1173   train_loss = 3.239
Epoch  39 Batch  503/1173   train_loss = 3.236
Epoch  39 Batch  528/1173   train_loss = 3.107
Epoch  39 Batch  553/1173   train_loss = 3.215
Epoch  39 Batch  578/1173   train_loss = 3.317
Epoch  39 Batch  603/1173   train_loss = 3.312
Epoch  39 Batch  628/1173   train_loss = 3.227
Epoch  39 Batch  653/1173   train_loss = 3.258
Epoch  39 Batch  678/1173   train_loss = 3.326
Epoch  39 Batch  703/1173   train_loss = 3.369
Epoch  39 Batch  728/1173   train_loss = 3.236
Epoch  39 Batch  753/1173   train_loss = 3.228
Epoch  39 Batch  778/1173   train_loss = 3.429
Epoch  39 Batch  803/1173   train_loss = 3.085
Epoch  39 Batch  828/1173   train_loss = 3.424
Epoch  39 Batch  853/1173   train_loss = 3.350
Epoch  39 Batch  878/1173   train_loss = 3.118
Epoch  39 Batch  903/1173   train_loss = 3.305
Epoch  39 Batch  928/1173   train_loss = 3.440
Epoch  39 Batch  953/1173   train_loss = 3.310
Epoch  39 Batch  978/1173   train_loss = 3.197
Epoch  39 Batch 1003/1173   train_loss = 3.093
Epoch  39 Batch 1028/1173   train_loss = 3.202
Epoch  39 Batch 1053/1173   train_loss = 3.327
Epoch  39 Batch 1078/1173   train_loss = 3.283
Epoch  39 Batch 1103/1173   train_loss = 3.284
Epoch  39 Batch 1128/1173   train_loss = 3.307
Epoch  39 Batch 1153/1173   train_loss = 3.251
Epoch  40 Batch    5/1173   train_loss = 3.037
Epoch  40 Batch   30/1173   train_loss = 3.267
Epoch  40 Batch   55/1173   train_loss = 3.235
Epoch  40 Batch   80/1173   train_loss = 3.263
Epoch  40 Batch  105/1173   train_loss = 3.264
Epoch  40 Batch  130/1173   train_loss = 3.216
Epoch  40 Batch  155/1173   train_loss = 3.263
Epoch  40 Batch  180/1173   train_loss = 3.317
Epoch  40 Batch  205/1173   train_loss = 3.177
Epoch  40 Batch  230/1173   train_loss = 3.230
Epoch  40 Batch  255/1173   train_loss = 3.338
Epoch  40 Batch  280/1173   train_loss = 3.364
Epoch  40 Batch  305/1173   train_loss = 3.290
Epoch  40 Batch  330/1173   train_loss = 3.205
Epoch  40 Batch  355/1173   train_loss = 3.314
Epoch  40 Batch  380/1173   train_loss = 3.204
Epoch  40 Batch  405/1173   train_loss = 3.335
Epoch  40 Batch  430/1173   train_loss = 3.178
Epoch  40 Batch  455/1173   train_loss = 3.283
Epoch  40 Batch  480/1173   train_loss = 3.137
Epoch  40 Batch  505/1173   train_loss = 3.366
Epoch  40 Batch  530/1173   train_loss = 3.150
Epoch  40 Batch  555/1173   train_loss = 3.219
Epoch  40 Batch  580/1173   train_loss = 3.275
Epoch  40 Batch  605/1173   train_loss = 3.363
Epoch  40 Batch  630/1173   train_loss = 3.213
Epoch  40 Batch  655/1173   train_loss = 3.181
Epoch  40 Batch  680/1173   train_loss = 3.236
Epoch  40 Batch  705/1173   train_loss = 3.277
Epoch  40 Batch  730/1173   train_loss = 3.269
Epoch  40 Batch  755/1173   train_loss = 3.345
Epoch  40 Batch  780/1173   train_loss = 3.228
Epoch  40 Batch  805/1173   train_loss = 3.207
Epoch  40 Batch  830/1173   train_loss = 3.357
Epoch  40 Batch  855/1173   train_loss = 3.246
Epoch  40 Batch  880/1173   train_loss = 3.219
Epoch  40 Batch  905/1173   train_loss = 3.245
Epoch  40 Batch  930/1173   train_loss = 3.151
Epoch  40 Batch  955/1173   train_loss = 3.298
Epoch  40 Batch  980/1173   train_loss = 3.282
Epoch  40 Batch 1005/1173   train_loss = 3.190
Epoch  40 Batch 1030/1173   train_loss = 3.163
Epoch  40 Batch 1055/1173   train_loss = 3.267
Epoch  40 Batch 1080/1173   train_loss = 3.210
Epoch  40 Batch 1105/1173   train_loss = 3.245
Epoch  40 Batch 1130/1173   train_loss = 3.188
Epoch  40 Batch 1155/1173   train_loss = 3.141
Epoch  41 Batch    7/1173   train_loss = 3.194
Epoch  41 Batch   32/1173   train_loss = 3.266
Epoch  41 Batch   57/1173   train_loss = 3.182
Epoch  41 Batch   82/1173   train_loss = 3.251
Epoch  41 Batch  107/1173   train_loss = 3.316
Epoch  41 Batch  132/1173   train_loss = 3.214
Epoch  41 Batch  157/1173   train_loss = 3.355
Epoch  41 Batch  182/1173   train_loss = 3.240
Epoch  41 Batch  207/1173   train_loss = 3.155
Epoch  41 Batch  232/1173   train_loss = 3.295
Epoch  41 Batch  257/1173   train_loss = 3.298
Epoch  41 Batch  282/1173   train_loss = 3.264
Epoch  41 Batch  307/1173   train_loss = 3.194
Epoch  41 Batch  332/1173   train_loss = 3.179
Epoch  41 Batch  357/1173   train_loss = 3.349
Epoch  41 Batch  382/1173   train_loss = 3.364
Epoch  41 Batch  407/1173   train_loss = 3.317
Epoch  41 Batch  432/1173   train_loss = 3.415
Epoch  41 Batch  457/1173   train_loss = 3.388
Epoch  41 Batch  482/1173   train_loss = 3.258
Epoch  41 Batch  507/1173   train_loss = 3.334
Epoch  41 Batch  532/1173   train_loss = 3.340
Epoch  41 Batch  557/1173   train_loss = 3.190
Epoch  41 Batch  582/1173   train_loss = 3.288
Epoch  41 Batch  607/1173   train_loss = 3.239
Epoch  41 Batch  632/1173   train_loss = 3.247
Epoch  41 Batch  657/1173   train_loss = 3.274
Epoch  41 Batch  682/1173   train_loss = 3.206
Epoch  41 Batch  707/1173   train_loss = 3.274
Epoch  41 Batch  732/1173   train_loss = 3.198
...
[training log truncated: epochs 41 through 49 continue in the same pattern, with train_loss fluctuating between roughly 3.0 and 3.5]
...
Epoch  49 Batch 1098/1173   train_loss = 3.309
Epoch  49 Batch 1123/1173   train_loss = 3.172
Epoch  49 Batch 1148/1173   train_loss = 3.142
Model Trained and Saved

Save Parameters

Save seq_length and save_dir for generating a new TV script.


In [66]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
# Save parameters for checkpoint
helper.save_params((seq_length, save_dir))

Checkpoint


In [67]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import tensorflow as tf
import numpy as np
import helper
import problem_unittests as tests

_, vocab_to_int, int_to_vocab, token_dict = helper.load_preprocess()
seq_length, load_dir = helper.load_params()

Implement Generate Functions

Get Tensors

Get tensors from loaded_graph using the function get_tensor_by_name(). Get the tensors using the following names:

  • "input:0"
  • "initial_state:0"
  • "final_state:0"
  • "probs:0"

Return the tensors in the following tuple (InputTensor, InitialStateTensor, FinalStateTensor, ProbsTensor)


In [68]:
def get_tensors(loaded_graph):
    """
    Get input, initial state, final state, and probabilities tensor from <loaded_graph>
    :param loaded_graph: TensorFlow graph loaded from file
    :return: Tuple (InputTensor, InitialStateTensor, FinalStateTensor, ProbsTensor)
    """
    input_tensor = loaded_graph.get_tensor_by_name('input:0')
    init_state_tensor = loaded_graph.get_tensor_by_name('initial_state:0')
    final_state_tensor = loaded_graph.get_tensor_by_name('final_state:0')
    probs_tensor = loaded_graph.get_tensor_by_name('probs:0')
    return input_tensor, init_state_tensor, final_state_tensor, probs_tensor


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_get_tensors(get_tensors)


Tests Passed

Choose Word

Implement the pick_word() function to select the next word using probabilities.


In [69]:
def weighted_choice(choices):
    """
    Randomly select an item from a list of (item, weight) pairs, with
    probability proportional to its weight.
    Cribbed from http://stackoverflow.com/questions/3679694/a-weighted-version-of-random-choice
    """
    total = sum(w for c, w in choices)
    r = random.uniform(0, total)
    upto = 0
    for c, w in choices:
        # r falls within this item's slice of the cumulative weight line
        if upto + w >= r:
            return c
        upto += w
    assert False, "Shouldn't get here"

def pick_word(probabilities, int_to_vocab, top_n=5):
    """
    Pick the next word in the generated text
    :param probabilities: Probabilities of the next word
    :param int_to_vocab: Dictionary of word ids as the keys and words as the values
    :param top_n: Number of most-probable words to sample among
    :return: String of the predicted word
    """
    #print('Num probs: {}'.format(len(probabilities)))

    # Grab the top_n most probable word ids; each pass zeroes out the
    # current maximum (in place) so argmax finds the next-highest.
    top_n_choices = []
    for i in range(min(len(probabilities), top_n)):
        max_idx = np.argmax(probabilities)
        top_n_choices.append((max_idx, probabilities[max_idx]))
        probabilities.itemset(max_idx, 0)

    #print('Top {} highest indexes: {}'.format(top_n, top_n_choices))

    word_idx = weighted_choice(top_n_choices)
    word = int_to_vocab[word_idx]
    #print('Chosen word: {} (idx: {})'.format(word, word_idx))
    return word

    #highest_prob_idx = np.squeeze(np.argwhere(probabilities == np.max(probabilities)))
    #word_idx = np.random.choice(highest_prob_idx)
    #word = int_to_vocab[word_idx]
    #return word


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_pick_word(pick_word)


Tests Passed
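
A common variant worth knowing about is temperature sampling, which rescales the whole distribution before drawing instead of restricting the draw to a fixed top-n: temperatures below 1.0 make the choices more conservative, temperatures above 1.0 more varied. The following is only a minimal sketch of such a drop-in alternative, not part of the project; the function name and default temperature are illustrative, and it assumes probabilities is a 1-D NumPy array like the one pick_word receives.

import numpy as np

def pick_word_temperature(probabilities, int_to_vocab, temperature=0.8):
    # Sharpen (T < 1) or flatten (T > 1) the distribution in log space
    logits = np.log(np.asarray(probabilities, dtype=np.float64) + 1e-10) / temperature
    scaled = np.exp(logits - np.max(logits))
    scaled /= scaled.sum()
    # Draw a word id from the rescaled distribution
    word_idx = np.random.choice(len(scaled), p=scaled)
    return int_to_vocab[word_idx]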

Generate TV Script

This will generate the TV script for you. Set gen_length to the length of the TV script you want to generate.


In [71]:
gen_length = 400
# homer_simpson, moe_szyslak, or barney_gumble
prime_word = 'bart_simpson'

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
loaded_graph = tf.Graph()
with tf.Session(graph=loaded_graph) as sess:
    # Load saved model
    loader = tf.train.import_meta_graph(load_dir + '.meta')
    loader.restore(sess, load_dir)

    # Get Tensors from loaded model
    input_text, initial_state, final_state, probs = get_tensors(loaded_graph)

    # Sentences generation setup
    gen_sentences = [prime_word + ':']
    prev_state = sess.run(initial_state, {input_text: np.array([[1]])})

    # Generate sentences
    for n in range(gen_length):
        # Dynamic Input
        dyn_input = [[vocab_to_int[word] for word in gen_sentences[-seq_length:]]]
        dyn_seq_length = len(dyn_input[0])

        # Get Prediction
        probabilities, prev_state = sess.run(
            [probs, final_state],
            {input_text: dyn_input, initial_state: prev_state})
        
        pred_word = pick_word(probabilities[dyn_seq_length-1], int_to_vocab)

        gen_sentences.append(pred_word)

    # Remove tokens
    tv_script = ' '.join(gen_sentences)
    for key, token in token_dict.items():
        ending = ' ' if key in ['\n', '(', '"'] else ''
        tv_script = tv_script.replace(' ' + token.lower(), key)
    tv_script = tv_script.replace('\n ', '\n')
    tv_script = tv_script.replace('( ', '(')

    print(tv_script)


bart_simpson:(sincere) i am not. i can't let you go back to the case of hers.
homer_simpson:(touched) oh, that's just a carp laundry.
homer_simpson:(horrified) baaart!
(wiggum_home: ext. video store - continuous)
marge_simpson:(into phone)
(s. a.) glands ointed...
homer_simpson: oh, my little homie-bear! you don't know what to do.
homer_simpson:(touched noise) well, i guess.
(hallway: int. hallway - continuous)
homer_simpson:(chuckles, looks at marge) i can't do anything. i can't do that.
(death_row: int. log - moments later)
lisa_simpson: i don't care what i am.(flips o. s.) a birthday service?!
(simpson_home: ext. simpson house - front steps - day)
lisa_simpson:(panting)
homer_simpson:(chuckles)
(sideshow_mel's_dressing_room: int. sideshow bob's cabin - continuous)
lisa_simpson:(blissed-out noise) oh my brother, i'm a real lot of my life, homer.
(krusty's_office: int. krusty's office - night)
homer_simpson: oh, my-diddily-i.
marge_simpson:(to lisa)" the beer"...(brief pause)
(below_bridge: ext. below.
(simpson_home: int. simpson house - living room - later)
bart_simpson:(to himself) wacky, oh, caramba!
homer_simpson:(moans)
(simpson_home: ext. simpson house - day - establishing)
homer_simpson: oh, my-diddily-i.
sheriff:(irish snarl)
marge_simpson:(gasp) oh, homie...
homer_simpson: well, i guess i'll be right to the level.
marge_simpson:(to marge) you don't know what to do.
(girder: ext. girder cab - later)
(simpson_living_room: int. simpson living room - continuous)
homer_simpson: i don't know.
homer_simpson: oh, my-diddily-i. i'm buyin'!
homer_simpson:(chuckles)
homer_simpson:(grooving sigh) oh, i know you're not a miracle! it's a miracle!
(adult_film_section:

The TV Script is Nonsensical

It's OK if the TV script doesn't make any sense. We trained on less than a megabyte of text. To get good results, you'll have to use a smaller vocabulary or get more data. Luckily, there's more data! As we mentioned at the beginning of this project, this is a subset of another dataset. We didn't have you train on all the data, because that would take too long. However, you are free to train your neural network on all the data. After you complete the project, of course. One quick way to shrink the vocabulary is sketched below.
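
As a concrete illustration of the smaller-vocabulary suggestion, one approach is to keep only the most frequent words and map everything else to a single unknown token before building the lookup tables. This is a minimal sketch under stated assumptions, not part of the project: the function name, vocab_size, and the <UNK> marker are all illustrative choices.

from collections import Counter

def truncate_vocab(words, vocab_size=10000, unk_token='<UNK>'):
    # Keep the vocab_size most common words; replace the rest with unk_token
    keep = {word for word, _ in Counter(words).most_common(vocab_size)}
    return [word if word in keep else unk_token for word in words]

# e.g. words = truncate_vocab(text.split()) before building vocab_to_int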

Submitting This Project

When submitting this project, make sure to run all the cells before saving the notebook. Save the notebook file as "dlnd_tv_script_generation.ipynb" and save it as an HTML file under "File" -> "Download as". Include the "helper.py" and "problem_unittests.py" files in your submission.