Language Translation

In this project, you're going to take a peek into the realm of neural machine translation. You'll train a sequence-to-sequence model on a dataset of English and French sentences that can translate new sentences from English to French.

Get the Data

Since translating the entire English language to French would take a very long time to train, we have provided you with a small portion of the English corpus.


In [1]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import helper
import problem_unittests as tests

source_path = 'data/small_vocab_en'
target_path = 'data/small_vocab_fr'
source_text = helper.load_data(source_path)
target_text = helper.load_data(target_path)

Explore the Data

Play around with view_sentence_range to view different parts of the data.


In [2]:
view_sentence_range = (0, 10)

"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import numpy as np

print('Dataset Stats')
print('Roughly the number of unique words: {}'.format(len({word: None for word in source_text.split()})))

sentences = source_text.split('\n')
word_counts = [len(sentence.split()) for sentence in sentences]
print('Number of sentences: {}'.format(len(sentences)))
print('Average number of words in a sentence: {}'.format(np.average(word_counts)))

print()
print('English sentences {} to {}:'.format(*view_sentence_range))
print('\n'.join(source_text.split('\n')[view_sentence_range[0]:view_sentence_range[1]]))
print()
print('French sentences {} to {}:'.format(*view_sentence_range))
print('\n'.join(target_text.split('\n')[view_sentence_range[0]:view_sentence_range[1]]))


Dataset Stats
Roughly the number of unique words: 202
Number of sentences: 137861
Average number of words in a sentence: 13.283379636010185

English sentences 0 to 10:
new jersey is sometimes quiet during autumn , and it is snowy in april .
the united states is usually chilly during july , and it is usually freezing in november .
california is usually quiet during march , and it is usually hot in june .
the united states is sometimes mild during june , and it is cold in september .
your least liked fruit is the grape , but my least liked is the apple .
his favorite fruit is the orange , but my favorite is the grape .
paris is relaxing during december , but it is usually chilly in july .
new jersey is busy during spring , and it is never hot in march .
our least liked fruit is the lemon , but my least liked is the grape .
the united states is sometimes busy during january , and it is sometimes warm in november .

French sentences 0 to 10:
new jersey est parfois calme pendant l' automne , et il est neigeux en avril .
les états-unis est généralement froid en juillet , et il gèle habituellement en novembre .
california est généralement calme en mars , et il est généralement chaud en juin .
les états-unis est parfois légère en juin , et il fait froid en septembre .
votre moins aimé fruit est le raisin , mais mon moins aimé est la pomme .
son fruit préféré est l'orange , mais mon préféré est le raisin .
paris est relaxant en décembre , mais il est généralement froid en juillet .
new jersey est occupé au printemps , et il est jamais chaude en mars .
notre fruit est moins aimé le citron , mais mon moins aimé est le raisin .
les états-unis est parfois occupé en janvier , et il est parfois chaud en novembre .

Implement Preprocessing Function

Text to Word Ids

As you did with other RNNs, you must turn the text into numbers so the computer can understand it. In the function text_to_ids(), you'll turn source_text and target_text from words to ids. However, you need to add the <EOS> word id to the end of each sentence in target_text. This will help the neural network predict when the sentence should end.

You can get the <EOS> word id by doing:

target_vocab_to_int['<EOS>']

You can get other word ids using source_vocab_to_int and target_vocab_to_int.
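As a sketch of the expected behavior, here is the same conversion applied to a toy vocabulary (the vocabularies and sentences below are made up for illustration; the real ones come from the helper module):

```python
# Hypothetical toy vocabularies, for illustration only
source_vocab_to_int = {'hello': 0, 'world': 1}
target_vocab_to_int = {'<EOS>': 0, 'bonjour': 1, 'monde': 2}

source_text = 'hello world\nworld hello'
target_text = 'bonjour monde\nmonde bonjour'

# Source sentences become lists of word ids
source_ids = [[source_vocab_to_int[w] for w in line.split()]
              for line in source_text.split('\n')]

# Target sentences get the <EOS> id appended to each sentence
eos = target_vocab_to_int['<EOS>']
target_ids = [[target_vocab_to_int[w] for w in line.split()] + [eos]
              for line in target_text.split('\n')]

print(source_ids)  # [[0, 1], [1, 0]]
print(target_ids)  # [[1, 2, 0], [2, 1, 0]]
```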


In [3]:
def text_to_ids(source_text, target_text, source_vocab_to_int, target_vocab_to_int):
    """
    Convert source and target text to proper word ids
    :param source_text: String that contains all the source text.
    :param target_text: String that contains all the target text.
    :param source_vocab_to_int: Dictionary to go from the source words to an id
    :param target_vocab_to_int: Dictionary to go from the target words to an id
    :return: A tuple of lists (source_id_text, target_id_text)
    """
    
    source_sentences = source_text.split('\n')
    # Words missing from the vocabulary are skipped rather than mapped to an id
    source_ids = [[source_vocab_to_int[word] for word in sentence.split(' ') if word in source_vocab_to_int]
                  for sentence in source_sentences]
    
    target_sentences = target_text.split('\n')
    # Append <EOS> to each target sentence so the decoder learns where to stop
    target_ids = [[target_vocab_to_int[word] for word in (sentence.split(' ') + ['<EOS>']) if word in target_vocab_to_int]
                  for sentence in target_sentences]

    return source_ids, target_ids

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_text_to_ids(text_to_ids)


Tests Passed

Preprocess all the data and save it

Running the code cell below will preprocess all the data and save it to file.


In [4]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
helper.preprocess_and_save_data(source_path, target_path, text_to_ids)

Check Point

This is your first checkpoint. If you ever decide to come back to this notebook or have to restart the notebook, you can start from here. The preprocessed data has been saved to disk.


In [5]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import numpy as np
import helper
import problem_unittests as tests

(source_int_text, target_int_text), (source_vocab_to_int, target_vocab_to_int), _ = helper.load_preprocess()

Check the Version of TensorFlow and Access to GPU

This will check to make sure you have the correct version of TensorFlow and access to a GPU.


In [6]:
max_seq_length = max(max(map(len, source_int_text)), max(map(len, target_int_text)))

In [7]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
from distutils.version import LooseVersion
import warnings
import tensorflow as tf
from tensorflow.python.layers.core import Dense

# Check TensorFlow Version
assert LooseVersion(tf.__version__) >= LooseVersion('1.1'), 'Please use TensorFlow version 1.1 or newer'
print('TensorFlow Version: {}'.format(tf.__version__))

# Check for a GPU
if not tf.test.gpu_device_name():
    warnings.warn('No GPU found. Please use a GPU to train your neural network.')
else:
    print('Default GPU Device: {}'.format(tf.test.gpu_device_name()))


TensorFlow Version: 1.2.1
Default GPU Device: /gpu:0

Build the Neural Network

You'll build the components necessary to build a Sequence-to-Sequence model by implementing the following functions below:

  • model_inputs
  • process_decoder_input
  • encoding_layer
  • decoding_layer_train
  • decoding_layer_infer
  • decoding_layer
  • seq2seq_model

Input

Implement the model_inputs() function to create TF Placeholders for the Neural Network. It should create the following placeholders:

  • Input text placeholder named "input" using the TF Placeholder name parameter with rank 2.
  • Targets placeholder with rank 2.
  • Learning rate placeholder with rank 0.
  • Keep probability placeholder named "keep_prob" using the TF Placeholder name parameter with rank 0.
  • Target sequence length placeholder named "target_sequence_length" with rank 1
  • Max target sequence length tensor named "max_target_len" getting its value from applying tf.reduce_max on the target_sequence_length placeholder. Rank 0.
  • Source sequence length placeholder named "source_sequence_length" with rank 1

Return the placeholders in the following tuple: (input, targets, learning rate, keep probability, target sequence length, max target sequence length, source sequence length).


In [8]:
import importlib
importlib.reload(tests)


Out[8]:
<module 'problem_unittests' from '/home/WORKSPACE/scott.quiring/MachineLearning/DeepLearning/Projects/newCheckout/deep-learning/language-translation/problem_unittests.py'>

In [9]:
def model_inputs():
    """
    Create TF Placeholders for input, targets, learning rate, and lengths of source and target sequences.
    :return: Tuple (input, targets, learning rate, keep probability, target sequence length,
    max target sequence length, source sequence length)
    """
    input_ = tf.placeholder(tf.int32, shape=[None, None], name="input")
    target = tf.placeholder(tf.int32, shape=[None, None], name="target")
    
    learning_rate = tf.placeholder(tf.float32, name="learning_rate")
    keep_prob = tf.placeholder(tf.float32, name="keep_prob")
    
    target_sequence_length = tf.placeholder(tf.int32, shape=[None], name="target_sequence_length")
    max_target_len = tf.reduce_max(target_sequence_length, name="max_target_len")
    
    source_sequence_length = tf.placeholder(tf.int32, shape=[None], name="source_sequence_length")
    
    return input_, target, learning_rate, keep_prob, target_sequence_length, max_target_len, source_sequence_length
"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_model_inputs(model_inputs)


Tests Passed

Process Decoder Input

Implement process_decoder_input by removing the last word id from each batch in target_data and concatenating the <GO> id to the beginning of each batch.
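The shift can be sketched without TensorFlow. Assuming a toy batch of id sequences and a made-up <GO> id of 3, this is a pure-Python stand-in for the tf.fill/tf.concat version below:

```python
# Hypothetical batch of target id sequences; 3 stands in for the <GO> id
GO = 3
target_batch = [[5, 6, 7],
                [8, 9, 7]]

# Drop the last id from each row, then prepend <GO>,
# mirroring tf.concat([go, target_data[:, :-1]], axis=1)
decoder_input = [[GO] + row[:-1] for row in target_batch]

print(decoder_input)  # [[3, 5, 6], [3, 8, 9]]
```

The last id is dropped because the decoder's input at step t is the target at step t-1: the final target id is never fed in as input, so dropping it keeps the sequence length unchanged after <GO> is prepended.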


In [10]:
def process_decoder_input(target_data, target_vocab_to_int, batch_size):
    """
    Preprocess target data for decoding
    :param target_data: Target Placeholder
    :param target_vocab_to_int: Dictionary to go from the target words to an id
    :param batch_size: Batch Size
    :return: Preprocessed target data
    """
    go = tf.fill([batch_size, 1], target_vocab_to_int['<GO>'])
    # Drop the last word id from each sequence: the decoder input at step t is
    # the target at step t-1, so the final id is never fed in, and dropping it
    # keeps the sequence length unchanged after <GO> is prepended.
    return tf.concat([go, target_data[:, :-1]], axis=1)

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_process_encoding_input(process_decoder_input)


Tests Passed

Encoding

Implement encoding_layer() to create an encoder RNN layer:


In [11]:
from imp import reload
reload(tests)

def encoding_layer(rnn_inputs, rnn_size, num_layers, keep_prob, 
                   source_sequence_length, source_vocab_size, 
                   encoding_embedding_size):
    """
    Create encoding layer
    :param rnn_inputs: Inputs for the RNN
    :param rnn_size: RNN Size
    :param num_layers: Number of layers
    :param keep_prob: Dropout keep probability
    :param source_sequence_length: a list of the lengths of each sequence in the batch
    :param source_vocab_size: vocabulary size of source data
    :param encoding_embedding_size: embedding size of source data
    :return: tuple (RNN output, RNN state)
    """
    embed = tf.contrib.layers.embed_sequence(rnn_inputs, vocab_size=source_vocab_size,
                                             embed_dim=encoding_embedding_size)
    
    def make_cell(_):
        lstm = tf.contrib.rnn.LSTMCell(rnn_size)
        return tf.contrib.rnn.DropoutWrapper(lstm, state_keep_prob=keep_prob)
    cell = tf.contrib.rnn.MultiRNNCell([make_cell(i) for i in range(num_layers)])
    # Pass sequence_length so the RNN stops at each sentence's real end
    # instead of running over the <PAD> tokens.
    output, state = tf.nn.dynamic_rnn(cell, inputs=embed,
                                      sequence_length=source_sequence_length,
                                      dtype=tf.float32)
    
    return output, state

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_encoding_layer(encoding_layer)


Tests Passed

Decoding - Training

Create a training decoding layer:


In [12]:
def decoding_layer_train(encoder_state, dec_cell, dec_embed_input, 
                         target_sequence_length, max_summary_length, 
                         output_layer, keep_prob):
    """
    Create a decoding layer for training
    :param encoder_state: Encoder State
    :param dec_cell: Decoder RNN Cell
    :param dec_embed_input: Decoder embedded input
    :param target_sequence_length: The lengths of each sequence in the target batch
    :param max_summary_length: The length of the longest sequence in the batch
    :param output_layer: Function to apply the output layer
    :param keep_prob: Dropout keep probability
    :return: BasicDecoderOutput containing training logits and sample_id
    """
    helper = tf.contrib.seq2seq.TrainingHelper(dec_embed_input, target_sequence_length,
                                               name="training_helper")
    decoder = tf.contrib.seq2seq.BasicDecoder(dec_cell, helper, encoder_state,
                                              output_layer=output_layer)
    
    final_outputs, final_state, final_sequence_lengths = \
        tf.contrib.seq2seq.dynamic_decode(decoder, maximum_iterations=max_summary_length)
    return final_outputs

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_decoding_layer_train(decoding_layer_train)


Tests Passed

Decoding - Inference

Create inference decoder:


In [13]:
def decoding_layer_infer(encoder_state, dec_cell, dec_embeddings, start_of_sequence_id,
                         end_of_sequence_id, max_target_sequence_length,
                         vocab_size, output_layer, batch_size, keep_prob):
    """
    Create a decoding layer for inference
    :param encoder_state: Encoder state
    :param dec_cell: Decoder RNN Cell
    :param dec_embeddings: Decoder embeddings
    :param start_of_sequence_id: GO ID
    :param end_of_sequence_id: EOS Id
    :param max_target_sequence_length: Maximum length of target sequences
    :param vocab_size: Size of decoder/target vocabulary
    :param output_layer: Function to apply the output layer
    :param batch_size: Batch size
    :param keep_prob: Dropout keep probability
    :return: BasicDecoderOutput containing inference logits and sample_id
    """
    start_tokens = tf.constant(start_of_sequence_id, dtype=tf.int32,
                               shape=[batch_size])
    
    helper = tf.contrib.seq2seq.GreedyEmbeddingHelper(dec_embeddings,
                                                      start_tokens,
                                                      end_of_sequence_id)
    
    decoder = tf.contrib.seq2seq.BasicDecoder(dec_cell, helper,
                                              encoder_state,
                                              output_layer)
    
    final_outputs, final_state, final_sequence_lengths = \
        tf.contrib.seq2seq.dynamic_decode(decoder, maximum_iterations=max_target_sequence_length)
    return final_outputs



"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_decoding_layer_infer(decoding_layer_infer)


Tests Passed

Build the Decoding Layer

Implement decoding_layer() to create a Decoder RNN layer.

  • Embed the target sequences
  • Construct the decoder LSTM cell (just like you constructed the encoder cell above)
  • Create an output layer to map the outputs of the decoder to the elements of our vocabulary
  • Use your decoding_layer_train(encoder_state, dec_cell, dec_embed_input, target_sequence_length, max_target_sequence_length, output_layer, keep_prob) function to get the training logits.
  • Use your decoding_layer_infer(encoder_state, dec_cell, dec_embeddings, start_of_sequence_id, end_of_sequence_id, max_target_sequence_length, vocab_size, output_layer, batch_size, keep_prob) function to get the inference logits.

Note: You'll need to use tf.variable_scope to share variables between training and inference.


In [14]:
def decoding_layer(dec_input, encoder_state,
                   target_sequence_length, max_target_sequence_length,
                   rnn_size,
                   num_layers, target_vocab_to_int, target_vocab_size,
                   batch_size, keep_prob, decoding_embedding_size):
    """
    Create decoding layer
    :param dec_input: Decoder input
    :param encoder_state: Encoder state
    :param target_sequence_length: The lengths of each sequence in the target batch
    :param max_target_sequence_length: Maximum length of target sequences
    :param rnn_size: RNN Size
    :param num_layers: Number of layers
    :param target_vocab_to_int: Dictionary to go from the target words to an id
    :param target_vocab_size: Size of target vocabulary
    :param batch_size: The size of the batch
    :param keep_prob: Dropout keep probability
    :param decoding_embedding_size: Decoding embedding size
    :return: Tuple of (Training BasicDecoderOutput, Inference BasicDecoderOutput)
    """
    
    dec_embeddings = tf.Variable(tf.random_uniform([target_vocab_size, decoding_embedding_size]))
    dec_embed_input = tf.nn.embedding_lookup(dec_embeddings, dec_input)
    
    def make_cell(_):
        lstm = tf.contrib.rnn.LSTMCell(rnn_size)
        return tf.contrib.rnn.DropoutWrapper(lstm, state_keep_prob=keep_prob)
    rnn_cell = tf.contrib.rnn.MultiRNNCell([make_cell(i) for i in range(num_layers)])
    
    output_layer = Dense(target_vocab_size,
                         kernel_initializer = tf.truncated_normal_initializer(mean = 0.0, stddev=0.1))
    
    with tf.variable_scope("decoding") as decoding_scope:
        train_output = decoding_layer_train(encoder_state, rnn_cell, dec_embed_input,
                                            target_sequence_length, max_target_sequence_length,
                                            output_layer, keep_prob)

        decoding_scope.reuse_variables()
        
        infer_output = decoding_layer_infer(encoder_state, rnn_cell, dec_embeddings,
                                            target_vocab_to_int['<GO>'], target_vocab_to_int['<EOS>'],
                                            max_target_sequence_length, target_vocab_size,
                                            output_layer, batch_size, keep_prob)
    
    return train_output, infer_output



"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_decoding_layer(decoding_layer)


Tests Passed

Build the Neural Network

Apply the functions you implemented above to:

  • Encode the input using your encoding_layer(rnn_inputs, rnn_size, num_layers, keep_prob, source_sequence_length, source_vocab_size, encoding_embedding_size).
  • Process target data using your process_decoder_input(target_data, target_vocab_to_int, batch_size) function.
  • Decode the encoded input using your decoding_layer(dec_input, enc_state, target_sequence_length, max_target_sentence_length, rnn_size, num_layers, target_vocab_to_int, target_vocab_size, batch_size, keep_prob, dec_embedding_size) function.

In [15]:
def seq2seq_model(input_data, target_data, keep_prob, batch_size,
                  source_sequence_length, target_sequence_length,
                  max_target_sentence_length,
                  source_vocab_size, target_vocab_size,
                  enc_embedding_size, dec_embedding_size,
                  rnn_size, num_layers, target_vocab_to_int):
    """
    Build the Sequence-to-Sequence part of the neural network
    :param input_data: Input placeholder
    :param target_data: Target placeholder
    :param keep_prob: Dropout keep probability placeholder
    :param batch_size: Batch Size
    :param source_sequence_length: Sequence Lengths of source sequences in the batch
    :param target_sequence_length: Sequence Lengths of target sequences in the batch
    :param source_vocab_size: Source vocabulary size
    :param target_vocab_size: Target vocabulary size
    :param enc_embedding_size: Encoder embedding size
    :param dec_embedding_size: Decoder embedding size
    :param rnn_size: RNN Size
    :param num_layers: Number of layers
    :param target_vocab_to_int: Dictionary to go from the target words to an id
    :return: Tuple of (Training BasicDecoderOutput, Inference BasicDecoderOutput)
    """
    encoding_output, encoding_state = encoding_layer(input_data, rnn_size, num_layers, keep_prob,
                                                     source_sequence_length, source_vocab_size,
                                                     enc_embedding_size)
    
    decoder_target = process_decoder_input(target_data, target_vocab_to_int, batch_size)
    
    train_output, infer_output = decoding_layer(decoder_target, encoding_state, target_sequence_length,
                                                max_target_sentence_length, rnn_size, num_layers,
                                                target_vocab_to_int, target_vocab_size,
                                                batch_size, keep_prob, dec_embedding_size)
    
    return train_output, infer_output


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_seq2seq_model(seq2seq_model)


Tests Passed

Neural Network Training

Hyperparameters

Tune the following parameters:

  • Set epochs to the number of epochs.
  • Set batch_size to the batch size.
  • Set rnn_size to the size of the RNNs.
  • Set num_layers to the number of layers.
  • Set encoding_embedding_size to the size of the embedding for the encoder.
  • Set decoding_embedding_size to the size of the embedding for the decoder.
  • Set learning_rate to the learning rate.
  • Set keep_probability to the Dropout keep probability.
  • Set display_step to the number of batches between debug output statements.

In [16]:
## Hyperparameters selected by a genetic algorithm.

# Number of Epochs
epochs = 12
# Batch Size
batch_size = 128
# RNN Size
rnn_size = 512
# Number of Layers
num_layers = 4
# Embedding Size
encoding_embedding_size = 128
decoding_embedding_size = 32
# Learning Rate
learning_rate = 0.001
# Dropout Keep Probability
keep_probability = 0.8
display_step = 10

Build the Graph

Build the graph using the neural network you implemented.


In [17]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
save_path = 'checkpoints/dev'
(source_int_text, target_int_text), (source_vocab_to_int, target_vocab_to_int), _ = helper.load_preprocess()
max_target_sentence_length = max([len(sentence) for sentence in source_int_text])

train_graph = tf.Graph()
with train_graph.as_default():
    input_data, targets, lr, keep_prob, target_sequence_length, max_target_sequence_length, source_sequence_length = model_inputs()

    #sequence_length = tf.placeholder_with_default(max_target_sentence_length, None, name='sequence_length')
    input_shape = tf.shape(input_data)

    train_logits, inference_logits = seq2seq_model(tf.reverse(input_data, [-1]),
                                                   targets,
                                                   keep_prob,
                                                   batch_size,
                                                   source_sequence_length,
                                                   target_sequence_length,
                                                   max_target_sequence_length,
                                                   len(source_vocab_to_int),
                                                   len(target_vocab_to_int),
                                                   encoding_embedding_size,
                                                   decoding_embedding_size,
                                                   rnn_size,
                                                   num_layers,
                                                   target_vocab_to_int)


    training_logits = tf.identity(train_logits.rnn_output, name='logits')
    inference_logits = tf.identity(inference_logits.sample_id, name='predictions')

    masks = tf.sequence_mask(target_sequence_length, max_target_sequence_length, dtype=tf.float32, name='masks')

    with tf.name_scope("optimization"):
        # Loss function
        cost = tf.contrib.seq2seq.sequence_loss(
            training_logits,
            targets,
            masks)

        # Optimizer
        optimizer = tf.train.AdamOptimizer(lr)

        # Gradient Clipping
        gradients = optimizer.compute_gradients(cost)
        capped_gradients = [(tf.clip_by_value(grad, -1., 1.), var) for grad, var in gradients if grad is not None]
        train_op = optimizer.apply_gradients(capped_gradients)

Batch and pad the source and target sequences


In [18]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
def pad_sentence_batch(sentence_batch, pad_int):
    """Pad sentences with <PAD> so that each sentence of a batch has the same length"""
    max_sentence = max([len(sentence) for sentence in sentence_batch])
    return [sentence + [pad_int] * (max_sentence - len(sentence)) for sentence in sentence_batch]


def get_batches(sources, targets, batch_size, source_pad_int, target_pad_int):
    """Batch targets, sources, and the lengths of their sentences together"""
    for batch_i in range(0, len(sources)//batch_size):
        start_i = batch_i * batch_size

        # Slice the right amount for the batch
        sources_batch = sources[start_i:start_i + batch_size]
        targets_batch = targets[start_i:start_i + batch_size]

        # Pad
        pad_sources_batch = np.array(pad_sentence_batch(sources_batch, source_pad_int))
        pad_targets_batch = np.array(pad_sentence_batch(targets_batch, target_pad_int))

        # Need the lengths for the _lengths parameters
        pad_targets_lengths = []
        for target in pad_targets_batch:
            pad_targets_lengths.append(len(target))

        pad_source_lengths = []
        for source in pad_sources_batch:
            pad_source_lengths.append(len(source))

        yield pad_sources_batch, pad_targets_batch, pad_source_lengths, pad_targets_lengths
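To see what pad_sentence_batch does, here is a self-contained sketch of the same padding logic on a toy batch (0 stands in for the <PAD> id; the real id comes from target_vocab_to_int['<PAD>']):

```python
# Hypothetical batch of id sequences; 0 stands in for the <PAD> id
PAD = 0
batch = [[4, 5], [6, 7, 8], [9]]

# Pad every sentence to the length of the longest one in the batch,
# mirroring pad_sentence_batch above
max_len = max(len(sentence) for sentence in batch)
padded = [sentence + [PAD] * (max_len - len(sentence)) for sentence in batch]

print(padded)  # [[4, 5, 0], [6, 7, 8], [9, 0, 0]]
```

Padding per batch (rather than to a global maximum) keeps each batch as short as possible, which is why get_batches also yields the padded lengths for the `_sequence_length` placeholders.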

In [19]:
import matplotlib.pyplot as plt
%matplotlib inline

Train

Train the neural network on the preprocessed data. If you have a hard time getting a good loss, check the forums to see if anyone is having the same problem.


In [22]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
def get_accuracy(target, logits):
    """
    Calculate accuracy
    """
    max_seq = max(target.shape[1], logits.shape[1])
    if max_seq - target.shape[1]:
        target = np.pad(
            target,
            [(0,0),(0,max_seq - target.shape[1])],
            'constant')
    if max_seq - logits.shape[1]:
        logits = np.pad(
            logits,
            [(0,0),(0,max_seq - logits.shape[1])],
            'constant')

    return np.mean(np.equal(target, logits))

# Split data to training and validation sets
train_source = source_int_text[batch_size:]
train_target = target_int_text[batch_size:]
valid_source = source_int_text[:batch_size]
valid_target = target_int_text[:batch_size]
(valid_sources_batch, valid_targets_batch, valid_sources_lengths, valid_targets_lengths ) = \
    next(get_batches(valid_source,
         valid_target,
         batch_size,
         source_vocab_to_int['<PAD>'],
         target_vocab_to_int['<PAD>']))

train_accs = []
valid_accs = []
losses = []

with tf.Session(graph=train_graph) as sess:
    sess.run(tf.global_variables_initializer())

    for epoch_i in range(epochs):
        for batch_i, (source_batch, target_batch, sources_lengths, targets_lengths) in enumerate(
                get_batches(train_source, train_target, batch_size,
                            source_vocab_to_int['<PAD>'],
                            target_vocab_to_int['<PAD>'])):

            _, loss = sess.run(
                [train_op, cost],
                {input_data: source_batch,
                 targets: target_batch,
                 lr: learning_rate,
                 target_sequence_length: targets_lengths,
                 source_sequence_length: sources_lengths,
                 keep_prob: keep_probability})
            losses.append(loss)


            if batch_i % display_step == 0 and batch_i > 0:


                batch_train_logits = sess.run(
                    inference_logits,
                    {input_data: source_batch,
                     source_sequence_length: sources_lengths,
                     target_sequence_length: targets_lengths,
                     keep_prob: 1.0})


                batch_valid_logits = sess.run(
                    inference_logits,
                    {input_data: valid_sources_batch,
                     source_sequence_length: valid_sources_lengths,
                     target_sequence_length: valid_targets_lengths,
                     keep_prob: 1.0})

                train_acc = get_accuracy(target_batch, batch_train_logits)

                valid_acc = get_accuracy(valid_targets_batch, batch_valid_logits)
                
                train_accs.append(train_acc)
                valid_accs.append(valid_acc)

                print('Epoch {:>3} Batch {:>4}/{} - Train Accuracy: {:>6.4f}, '
                      'Validation Accuracy: {:>6.4f}, Loss: {:>6.4f}'
                      .format(epoch_i, batch_i,
                              len(source_int_text) // batch_size,
                              train_acc, valid_acc, loss))
                
                if batch_i % (display_step*20) == 0:
                    fig,axes = plt.subplots(2,1, figsize=(13,10))
                    axes[0].plot(losses)
                    _, ymax = axes[0].get_ylim()
                    axes[0].set_ylim(0, ymax)
                    axes[1].plot(train_accs)
                    axes[1].plot(valid_accs)
                    axes[1].set_ylim(0,1)
                    plt.show()

    # Save Model
    saver = tf.train.Saver()
    saver.save(sess, save_path)
    print('Model Trained and Saved')


Epoch   0 Batch   10/1077 - Train Accuracy: 0.2669, Validation Accuracy: 0.3619, Loss: 3.5919
Epoch   0 Batch   20/1077 - Train Accuracy: 0.1141, Validation Accuracy: 0.1293, Loss: 2.9127
Epoch   0 Batch   30/1077 - Train Accuracy: 0.1562, Validation Accuracy: 0.1822, Loss: 2.6808
Epoch   0 Batch   40/1077 - Train Accuracy: 0.1426, Validation Accuracy: 0.1548, Loss: 2.6491
Epoch   0 Batch   50/1077 - Train Accuracy: 0.1723, Validation Accuracy: 0.2031, Loss: 2.4973
Epoch   0 Batch   60/1077 - Train Accuracy: 0.1942, Validation Accuracy: 0.2053, Loss: 2.2621
Epoch   0 Batch   70/1077 - Train Accuracy: 0.2664, Validation Accuracy: 0.3200, Loss: 2.3786
Epoch   0 Batch   80/1077 - Train Accuracy: 0.1957, Validation Accuracy: 0.2393, Loss: 2.1449
Epoch   0 Batch   90/1077 - Train Accuracy: 0.2395, Validation Accuracy: 0.3217, Loss: 2.1792
Epoch   0 Batch  100/1077 - Train Accuracy: 0.3098, Validation Accuracy: 0.3732, Loss: 2.0308
Epoch   0 Batch  110/1077 - Train Accuracy: 0.3762, Validation Accuracy: 0.4254, Loss: 1.9632
Epoch   0 Batch  120/1077 - Train Accuracy: 0.2676, Validation Accuracy: 0.3196, Loss: 2.0259
Epoch   0 Batch  130/1077 - Train Accuracy: 0.2768, Validation Accuracy: 0.3100, Loss: 1.8623
Epoch   0 Batch  140/1077 - Train Accuracy: 0.3047, Validation Accuracy: 0.3991, Loss: 1.9889
Epoch   0 Batch  150/1077 - Train Accuracy: 0.3400, Validation Accuracy: 0.3839, Loss: 1.7739
Epoch   0 Batch  160/1077 - Train Accuracy: 0.3277, Validation Accuracy: 0.4162, Loss: 1.8048
Epoch   0 Batch  170/1077 - Train Accuracy: 0.2148, Validation Accuracy: 0.2844, Loss: 1.9168
Epoch   0 Batch  180/1077 - Train Accuracy: 0.3570, Validation Accuracy: 0.4027, Loss: 1.7208
Epoch   0 Batch  190/1077 - Train Accuracy: 0.3570, Validation Accuracy: 0.4208, Loss: 1.5964
Epoch   0 Batch  200/1077 - Train Accuracy: 0.4238, Validation Accuracy: 0.4691, Loss: 1.6888
Epoch   0 Batch  210/1077 - Train Accuracy: 0.3438, Validation Accuracy: 0.3821, Loss: 1.6553
Epoch   0 Batch  220/1077 - Train Accuracy: 0.3010, Validation Accuracy: 0.3938, Loss: 1.5409
Epoch   0 Batch  230/1077 - Train Accuracy: 0.4074, Validation Accuracy: 0.4212, Loss: 1.4058
Epoch   0 Batch  240/1077 - Train Accuracy: 0.3074, Validation Accuracy: 0.3494, Loss: 1.4375
Epoch   0 Batch  250/1077 - Train Accuracy: 0.4442, Validation Accuracy: 0.4776, Loss: 1.2783
Epoch   0 Batch  260/1077 - Train Accuracy: 0.4304, Validation Accuracy: 0.4524, Loss: 1.2979
Epoch   0 Batch  270/1077 - Train Accuracy: 0.3742, Validation Accuracy: 0.4766, Loss: 1.4174
Epoch   0 Batch  280/1077 - Train Accuracy: 0.4215, Validation Accuracy: 0.4727, Loss: 1.4043
Epoch   0 Batch  290/1077 - Train Accuracy: 0.4562, Validation Accuracy: 0.5089, Loss: 1.3176
Epoch   0 Batch  300/1077 - Train Accuracy: 0.4248, Validation Accuracy: 0.4943, Loss: 1.2830
Epoch   0 Batch  310/1077 - Train Accuracy: 0.4422, Validation Accuracy: 0.5288, Loss: 1.2293
Epoch   0 Batch  320/1077 - Train Accuracy: 0.5004, Validation Accuracy: 0.5181, Loss: 1.2156
Epoch   0 Batch  330/1077 - Train Accuracy: 0.5066, Validation Accuracy: 0.5167, Loss: 1.2103
Epoch   0 Batch  340/1077 - Train Accuracy: 0.4112, Validation Accuracy: 0.4901, Loss: 1.1776
Epoch   0 Batch  350/1077 - Train Accuracy: 0.4715, Validation Accuracy: 0.4670, Loss: 1.3010
Epoch   0 Batch  360/1077 - Train Accuracy: 0.4973, Validation Accuracy: 0.4918, Loss: 1.1665
Epoch   0 Batch  370/1077 - Train Accuracy: 0.4576, Validation Accuracy: 0.5007, Loss: 1.1182
Epoch   0 Batch  380/1077 - Train Accuracy: 0.4875, Validation Accuracy: 0.4979, Loss: 1.0982
Epoch   0 Batch  390/1077 - Train Accuracy: 0.4270, Validation Accuracy: 0.5146, Loss: 1.2763
Epoch   0 Batch  400/1077 - Train Accuracy: 0.4996, Validation Accuracy: 0.5391, Loss: 1.1777
Epoch   0 Batch  410/1077 - Train Accuracy: 0.4519, Validation Accuracy: 0.5227, Loss: 1.1496
Epoch   0 Batch  420/1077 - Train Accuracy: 0.4969, Validation Accuracy: 0.5554, Loss: 1.0759
Epoch   0 Batch  430/1077 - Train Accuracy: 0.5105, Validation Accuracy: 0.5312, Loss: 1.0979
Epoch   0 Batch  440/1077 - Train Accuracy: 0.4418, Validation Accuracy: 0.4957, Loss: 1.1173
Epoch   0 Batch  450/1077 - Train Accuracy: 0.4586, Validation Accuracy: 0.5178, Loss: 1.0467
Epoch   0 Batch  460/1077 - Train Accuracy: 0.4945, Validation Accuracy: 0.5479, Loss: 1.0129
Epoch   0 Batch  470/1077 - Train Accuracy: 0.4453, Validation Accuracy: 0.4954, Loss: 1.0312
Epoch   0 Batch  480/1077 - Train Accuracy: 0.4955, Validation Accuracy: 0.5359, Loss: 0.9952
Epoch   0 Batch  490/1077 - Train Accuracy: 0.4699, Validation Accuracy: 0.5281, Loss: 1.0060
Epoch   0 Batch  500/1077 - Train Accuracy: 0.5406, Validation Accuracy: 0.5653, Loss: 1.0128
Epoch   0 Batch  510/1077 - Train Accuracy: 0.5328, Validation Accuracy: 0.5618, Loss: 1.0367
Epoch   0 Batch  520/1077 - Train Accuracy: 0.5394, Validation Accuracy: 0.5518, Loss: 0.9400
Epoch   0 Batch  530/1077 - Train Accuracy: 0.5375, Validation Accuracy: 0.5781, Loss: 0.9406
Epoch   0 Batch  540/1077 - Train Accuracy: 0.5496, Validation Accuracy: 0.5870, Loss: 0.8640
Epoch   0 Batch  550/1077 - Train Accuracy: 0.5527, Validation Accuracy: 0.5785, Loss: 0.8903
Epoch   0 Batch  560/1077 - Train Accuracy: 0.5602, Validation Accuracy: 0.5817, Loss: 0.8322
Epoch   0 Batch  570/1077 - Train Accuracy: 0.5308, Validation Accuracy: 0.5756, Loss: 0.8581
Epoch   0 Batch  580/1077 - Train Accuracy: 0.6209, Validation Accuracy: 0.5977, Loss: 0.7365
Epoch   0 Batch  590/1077 - Train Accuracy: 0.5436, Validation Accuracy: 0.5877, Loss: 0.8560
Epoch   0 Batch  600/1077 - Train Accuracy: 0.5722, Validation Accuracy: 0.5831, Loss: 0.7691
Epoch   0 Batch  610/1077 - Train Accuracy: 0.5687, Validation Accuracy: 0.5824, Loss: 0.8399
Epoch   0 Batch  620/1077 - Train Accuracy: 0.5621, Validation Accuracy: 0.5856, Loss: 0.7862
Epoch   0 Batch  630/1077 - Train Accuracy: 0.6016, Validation Accuracy: 0.5959, Loss: 0.7918
Epoch   0 Batch  640/1077 - Train Accuracy: 0.5714, Validation Accuracy: 0.6030, Loss: 0.7443
Epoch   0 Batch  650/1077 - Train Accuracy: 0.5289, Validation Accuracy: 0.5820, Loss: 0.7741
Epoch   0 Batch  660/1077 - Train Accuracy: 0.5441, Validation Accuracy: 0.6069, Loss: 0.7690
Epoch   0 Batch  670/1077 - Train Accuracy: 0.6349, Validation Accuracy: 0.6012, Loss: 0.7059
Epoch   0 Batch  680/1077 - Train Accuracy: 0.5610, Validation Accuracy: 0.5859, Loss: 0.7583
Epoch   0 Batch  690/1077 - Train Accuracy: 0.5645, Validation Accuracy: 0.5700, Loss: 0.7630
Epoch   0 Batch  700/1077 - Train Accuracy: 0.5258, Validation Accuracy: 0.5898, Loss: 0.7873
Epoch   0 Batch  710/1077 - Train Accuracy: 0.5137, Validation Accuracy: 0.5827, Loss: 0.8170
Epoch   0 Batch  720/1077 - Train Accuracy: 0.5481, Validation Accuracy: 0.5781, Loss: 0.9417
Epoch   0 Batch  730/1077 - Train Accuracy: 0.4449, Validation Accuracy: 0.5224, Loss: 0.8497
Epoch   0 Batch  740/1077 - Train Accuracy: 0.5254, Validation Accuracy: 0.5277, Loss: 0.8057
Epoch   0 Batch  750/1077 - Train Accuracy: 0.4977, Validation Accuracy: 0.5483, Loss: 0.7744
Epoch   0 Batch  760/1077 - Train Accuracy: 0.5895, Validation Accuracy: 0.6033, Loss: 0.7427
Epoch   0 Batch  770/1077 - Train Accuracy: 0.5796, Validation Accuracy: 0.5977, Loss: 0.6992
Epoch   0 Batch  780/1077 - Train Accuracy: 0.5410, Validation Accuracy: 0.5916, Loss: 0.7364
Epoch   0 Batch  790/1077 - Train Accuracy: 0.5012, Validation Accuracy: 0.5831, Loss: 0.7317
Epoch   0 Batch  800/1077 - Train Accuracy: 0.5555, Validation Accuracy: 0.5817, Loss: 0.6739
Epoch   0 Batch  810/1077 - Train Accuracy: 0.6153, Validation Accuracy: 0.5980, Loss: 0.6282
Epoch   0 Batch  820/1077 - Train Accuracy: 0.5531, Validation Accuracy: 0.6101, Loss: 0.6976
Epoch   0 Batch  830/1077 - Train Accuracy: 0.5687, Validation Accuracy: 0.6104, Loss: 0.6675
Epoch   0 Batch  840/1077 - Train Accuracy: 0.5887, Validation Accuracy: 0.5973, Loss: 0.6311
Epoch   0 Batch  850/1077 - Train Accuracy: 0.5625, Validation Accuracy: 0.6126, Loss: 0.6690
Epoch   0 Batch  860/1077 - Train Accuracy: 0.5926, Validation Accuracy: 0.6076, Loss: 0.6542
Epoch   0 Batch  870/1077 - Train Accuracy: 0.5650, Validation Accuracy: 0.6165, Loss: 0.6712
Epoch   0 Batch  880/1077 - Train Accuracy: 0.6227, Validation Accuracy: 0.6129, Loss: 0.6358
Epoch   0 Batch  890/1077 - Train Accuracy: 0.6336, Validation Accuracy: 0.6172, Loss: 0.6143
Epoch   0 Batch  900/1077 - Train Accuracy: 0.6145, Validation Accuracy: 0.6261, Loss: 0.6542
Epoch   0 Batch  910/1077 - Train Accuracy: 0.5900, Validation Accuracy: 0.6460, Loss: 0.6120
Epoch   0 Batch  920/1077 - Train Accuracy: 0.5797, Validation Accuracy: 0.6175, Loss: 0.6292
Epoch   0 Batch  930/1077 - Train Accuracy: 0.6074, Validation Accuracy: 0.6271, Loss: 0.6106
Epoch   0 Batch  940/1077 - Train Accuracy: 0.6121, Validation Accuracy: 0.6332, Loss: 0.6085
Epoch   0 Batch  950/1077 - Train Accuracy: 0.5763, Validation Accuracy: 0.6314, Loss: 0.5720
Epoch   0 Batch  960/1077 - Train Accuracy: 0.6198, Validation Accuracy: 0.6229, Loss: 0.5872
Epoch   0 Batch  970/1077 - Train Accuracy: 0.6285, Validation Accuracy: 0.6268, Loss: 0.6106
Epoch   0 Batch  980/1077 - Train Accuracy: 0.6125, Validation Accuracy: 0.6417, Loss: 0.5972
Epoch   0 Batch  990/1077 - Train Accuracy: 0.5592, Validation Accuracy: 0.6119, Loss: 0.6433
Epoch   0 Batch 1000/1077 - Train Accuracy: 0.6685, Validation Accuracy: 0.6460, Loss: 0.5550
Epoch   0 Batch 1010/1077 - Train Accuracy: 0.6211, Validation Accuracy: 0.6314, Loss: 0.5820
Epoch   0 Batch 1020/1077 - Train Accuracy: 0.6195, Validation Accuracy: 0.6190, Loss: 0.5656
Epoch   0 Batch 1030/1077 - Train Accuracy: 0.5969, Validation Accuracy: 0.6282, Loss: 0.5881
Epoch   0 Batch 1040/1077 - Train Accuracy: 0.6098, Validation Accuracy: 0.6328, Loss: 0.6084
Epoch   0 Batch 1050/1077 - Train Accuracy: 0.5430, Validation Accuracy: 0.6069, Loss: 0.5869
Epoch   0 Batch 1060/1077 - Train Accuracy: 0.6512, Validation Accuracy: 0.6605, Loss: 0.5501
Epoch   0 Batch 1070/1077 - Train Accuracy: 0.5941, Validation Accuracy: 0.6339, Loss: 0.5942
Epoch   1 Batch   10/1077 - Train Accuracy: 0.6229, Validation Accuracy: 0.6349, Loss: 0.5757
Epoch   1 Batch   20/1077 - Train Accuracy: 0.5984, Validation Accuracy: 0.6470, Loss: 0.5472
Epoch   1 Batch   30/1077 - Train Accuracy: 0.6426, Validation Accuracy: 0.6310, Loss: 0.5618
Epoch   1 Batch   40/1077 - Train Accuracy: 0.6125, Validation Accuracy: 0.6431, Loss: 0.5636
Epoch   1 Batch   50/1077 - Train Accuracy: 0.6297, Validation Accuracy: 0.6271, Loss: 0.5488
Epoch   1 Batch   60/1077 - Train Accuracy: 0.6310, Validation Accuracy: 0.6335, Loss: 0.5248
Epoch   1 Batch   70/1077 - Train Accuracy: 0.6201, Validation Accuracy: 0.6300, Loss: 0.5653
Epoch   1 Batch   80/1077 - Train Accuracy: 0.6316, Validation Accuracy: 0.6172, Loss: 0.5650
Epoch   1 Batch   90/1077 - Train Accuracy: 0.6137, Validation Accuracy: 0.6705, Loss: 0.5687
Epoch   1 Batch  100/1077 - Train Accuracy: 0.6277, Validation Accuracy: 0.6396, Loss: 0.5447
Epoch   1 Batch  110/1077 - Train Accuracy: 0.6746, Validation Accuracy: 0.6697, Loss: 0.5065
Epoch   1 Batch  120/1077 - Train Accuracy: 0.6461, Validation Accuracy: 0.6488, Loss: 0.5593
Epoch   1 Batch  130/1077 - Train Accuracy: 0.6373, Validation Accuracy: 0.6246, Loss: 0.5234
Epoch   1 Batch  140/1077 - Train Accuracy: 0.6012, Validation Accuracy: 0.6513, Loss: 0.5467
Epoch   1 Batch  150/1077 - Train Accuracy: 0.6417, Validation Accuracy: 0.6357, Loss: 0.5278
Epoch   1 Batch  160/1077 - Train Accuracy: 0.6242, Validation Accuracy: 0.6634, Loss: 0.5339
Epoch   1 Batch  170/1077 - Train Accuracy: 0.6129, Validation Accuracy: 0.6172, Loss: 0.5655
Epoch   1 Batch  180/1077 - Train Accuracy: 0.6328, Validation Accuracy: 0.6587, Loss: 0.5407
Epoch   1 Batch  190/1077 - Train Accuracy: 0.6504, Validation Accuracy: 0.6431, Loss: 0.5172
Epoch   1 Batch  200/1077 - Train Accuracy: 0.5918, Validation Accuracy: 0.6531, Loss: 0.5500
Epoch   1 Batch  210/1077 - Train Accuracy: 0.6246, Validation Accuracy: 0.6399, Loss: 0.4950
Epoch   1 Batch  220/1077 - Train Accuracy: 0.6254, Validation Accuracy: 0.6516, Loss: 0.5335
Epoch   1 Batch  230/1077 - Train Accuracy: 0.6298, Validation Accuracy: 0.6339, Loss: 0.5322
Epoch   1 Batch  240/1077 - Train Accuracy: 0.6301, Validation Accuracy: 0.6523, Loss: 0.4967
Epoch   1 Batch  250/1077 - Train Accuracy: 0.6371, Validation Accuracy: 0.6321, Loss: 0.4772
Epoch   1 Batch  260/1077 - Train Accuracy: 0.6484, Validation Accuracy: 0.6584, Loss: 0.4818
Epoch   1 Batch  270/1077 - Train Accuracy: 0.6164, Validation Accuracy: 0.6317, Loss: 0.5177
Epoch   1 Batch  280/1077 - Train Accuracy: 0.6504, Validation Accuracy: 0.6832, Loss: 0.5272
Epoch   1 Batch  290/1077 - Train Accuracy: 0.6180, Validation Accuracy: 0.6715, Loss: 0.5171
Epoch   1 Batch  300/1077 - Train Accuracy: 0.6345, Validation Accuracy: 0.6491, Loss: 0.4901
Epoch   1 Batch  310/1077 - Train Accuracy: 0.6414, Validation Accuracy: 0.6559, Loss: 0.5203
Epoch   1 Batch  320/1077 - Train Accuracy: 0.6652, Validation Accuracy: 0.6570, Loss: 0.4851
Epoch   1 Batch  330/1077 - Train Accuracy: 0.6719, Validation Accuracy: 0.6587, Loss: 0.4757
Epoch   1 Batch  340/1077 - Train Accuracy: 0.6324, Validation Accuracy: 0.6687, Loss: 0.4797
Epoch   1 Batch  350/1077 - Train Accuracy: 0.6266, Validation Accuracy: 0.6161, Loss: 0.4802
Epoch   1 Batch  360/1077 - Train Accuracy: 0.6473, Validation Accuracy: 0.6637, Loss: 0.4558
Epoch   1 Batch  370/1077 - Train Accuracy: 0.6678, Validation Accuracy: 0.6570, Loss: 0.4524
Epoch   1 Batch  380/1077 - Train Accuracy: 0.6570, Validation Accuracy: 0.6722, Loss: 0.4547
Epoch   1 Batch  390/1077 - Train Accuracy: 0.6238, Validation Accuracy: 0.6591, Loss: 0.4915
Epoch   1 Batch  400/1077 - Train Accuracy: 0.6875, Validation Accuracy: 0.6708, Loss: 0.4893
Epoch   1 Batch  410/1077 - Train Accuracy: 0.6086, Validation Accuracy: 0.6467, Loss: 0.4886
Epoch   1 Batch  420/1077 - Train Accuracy: 0.6699, Validation Accuracy: 0.6591, Loss: 0.4423
Epoch   1 Batch  430/1077 - Train Accuracy: 0.6195, Validation Accuracy: 0.6399, Loss: 0.4512
Epoch   1 Batch  440/1077 - Train Accuracy: 0.6348, Validation Accuracy: 0.6602, Loss: 0.5037
Epoch   1 Batch  450/1077 - Train Accuracy: 0.6297, Validation Accuracy: 0.6619, Loss: 0.4409
Epoch   1 Batch  460/1077 - Train Accuracy: 0.6652, Validation Accuracy: 0.6701, Loss: 0.4644
Epoch   1 Batch  470/1077 - Train Accuracy: 0.6604, Validation Accuracy: 0.6889, Loss: 0.4736
Epoch   1 Batch  480/1077 - Train Accuracy: 0.6768, Validation Accuracy: 0.6531, Loss: 0.4547
Epoch   1 Batch  490/1077 - Train Accuracy: 0.6664, Validation Accuracy: 0.6712, Loss: 0.4645
Epoch   1 Batch  500/1077 - Train Accuracy: 0.6797, Validation Accuracy: 0.6541, Loss: 0.4358
Epoch   1 Batch  510/1077 - Train Accuracy: 0.6730, Validation Accuracy: 0.6651, Loss: 0.4293
Epoch   1 Batch  520/1077 - Train Accuracy: 0.7310, Validation Accuracy: 0.6729, Loss: 0.4172
Epoch   1 Batch  530/1077 - Train Accuracy: 0.6555, Validation Accuracy: 0.6637, Loss: 0.4606
Epoch   1 Batch  540/1077 - Train Accuracy: 0.6844, Validation Accuracy: 0.6822, Loss: 0.3986
Epoch   1 Batch  550/1077 - Train Accuracy: 0.6641, Validation Accuracy: 0.6768, Loss: 0.4672
Epoch   1 Batch  560/1077 - Train Accuracy: 0.6922, Validation Accuracy: 0.6836, Loss: 0.4071
Epoch   1 Batch  570/1077 - Train Accuracy: 0.6575, Validation Accuracy: 0.6722, Loss: 0.4434
Epoch   1 Batch  580/1077 - Train Accuracy: 0.7359, Validation Accuracy: 0.6932, Loss: 0.3738
Epoch   1 Batch  590/1077 - Train Accuracy: 0.5720, Validation Accuracy: 0.6470, Loss: 0.4484
Epoch   1 Batch  600/1077 - Train Accuracy: 0.6987, Validation Accuracy: 0.6754, Loss: 0.3887
Epoch   1 Batch  610/1077 - Train Accuracy: 0.6896, Validation Accuracy: 0.6900, Loss: 0.4208
Epoch   1 Batch  620/1077 - Train Accuracy: 0.6781, Validation Accuracy: 0.6882, Loss: 0.4040
Epoch   1 Batch  630/1077 - Train Accuracy: 0.7008, Validation Accuracy: 0.6857, Loss: 0.4015
Epoch   1 Batch  640/1077 - Train Accuracy: 0.6775, Validation Accuracy: 0.6942, Loss: 0.4006
Epoch   1 Batch  650/1077 - Train Accuracy: 0.6141, Validation Accuracy: 0.7003, Loss: 0.4214
Epoch   1 Batch  660/1077 - Train Accuracy: 0.6813, Validation Accuracy: 0.6822, Loss: 0.4076
Epoch   1 Batch  670/1077 - Train Accuracy: 0.7191, Validation Accuracy: 0.7067, Loss: 0.3798
Epoch   1 Batch  680/1077 - Train Accuracy: 0.6786, Validation Accuracy: 0.6928, Loss: 0.3851
Epoch   1 Batch  690/1077 - Train Accuracy: 0.6762, Validation Accuracy: 0.6903, Loss: 0.4038
Epoch   1 Batch  700/1077 - Train Accuracy: 0.6465, Validation Accuracy: 0.7003, Loss: 0.3807
Epoch   1 Batch  710/1077 - Train Accuracy: 0.6238, Validation Accuracy: 0.7106, Loss: 0.3945
Epoch   1 Batch  720/1077 - Train Accuracy: 0.7138, Validation Accuracy: 0.7106, Loss: 0.4086
Epoch   1 Batch  730/1077 - Train Accuracy: 0.6219, Validation Accuracy: 0.6289, Loss: 0.4048
Epoch   1 Batch  740/1077 - Train Accuracy: 0.7070, Validation Accuracy: 0.6850, Loss: 0.3812
Epoch   1 Batch  750/1077 - Train Accuracy: 0.6813, Validation Accuracy: 0.6566, Loss: 0.4389
Epoch   1 Batch  760/1077 - Train Accuracy: 0.6789, Validation Accuracy: 0.6875, Loss: 0.4150
Epoch   1 Batch  770/1077 - Train Accuracy: 0.6897, Validation Accuracy: 0.6957, Loss: 0.3764
Epoch   1 Batch  780/1077 - Train Accuracy: 0.6590, Validation Accuracy: 0.6442, Loss: 0.5373
Epoch   1 Batch  790/1077 - Train Accuracy: 0.5902, Validation Accuracy: 0.6083, Loss: 0.6115
Epoch   1 Batch  800/1077 - Train Accuracy: 0.5941, Validation Accuracy: 0.6310, Loss: 0.4857
Epoch   1 Batch  810/1077 - Train Accuracy: 0.6555, Validation Accuracy: 0.6594, Loss: 0.4206
Epoch   1 Batch  820/1077 - Train Accuracy: 0.6602, Validation Accuracy: 0.6587, Loss: 0.4554
Epoch   1 Batch  830/1077 - Train Accuracy: 0.6316, Validation Accuracy: 0.6811, Loss: 0.4253
Epoch   1 Batch  840/1077 - Train Accuracy: 0.6523, Validation Accuracy: 0.6456, Loss: 0.3932
Epoch   1 Batch  850/1077 - Train Accuracy: 0.6514, Validation Accuracy: 0.6832, Loss: 0.4288
Epoch   1 Batch  860/1077 - Train Accuracy: 0.6365, Validation Accuracy: 0.6783, Loss: 0.4081
Epoch   1 Batch  870/1077 - Train Accuracy: 0.6653, Validation Accuracy: 0.6918, Loss: 0.4096
Epoch   1 Batch  880/1077 - Train Accuracy: 0.7359, Validation Accuracy: 0.7053, Loss: 0.3793
Epoch   1 Batch  890/1077 - Train Accuracy: 0.7124, Validation Accuracy: 0.6779, Loss: 0.3498
Epoch   1 Batch  900/1077 - Train Accuracy: 0.7113, Validation Accuracy: 0.6729, Loss: 0.3710
Epoch   1 Batch  910/1077 - Train Accuracy: 0.6425, Validation Accuracy: 0.6832, Loss: 0.3651
Epoch   1 Batch  920/1077 - Train Accuracy: 0.6859, Validation Accuracy: 0.6879, Loss: 0.3664
Epoch   1 Batch  930/1077 - Train Accuracy: 0.6816, Validation Accuracy: 0.6843, Loss: 0.3629
Epoch   1 Batch  940/1077 - Train Accuracy: 0.6844, Validation Accuracy: 0.6836, Loss: 0.3930
Epoch   1 Batch  950/1077 - Train Accuracy: 0.6711, Validation Accuracy: 0.6772, Loss: 0.3662
Epoch   1 Batch  960/1077 - Train Accuracy: 0.6834, Validation Accuracy: 0.6509, Loss: 0.3683
Epoch   1 Batch  970/1077 - Train Accuracy: 0.6613, Validation Accuracy: 0.6758, Loss: 0.3771
Epoch   1 Batch  980/1077 - Train Accuracy: 0.6961, Validation Accuracy: 0.6974, Loss: 0.4071
Epoch   1 Batch  990/1077 - Train Accuracy: 0.6016, Validation Accuracy: 0.6733, Loss: 0.4101
Epoch   1 Batch 1000/1077 - Train Accuracy: 0.7388, Validation Accuracy: 0.6871, Loss: 0.3258
Epoch   1 Batch 1010/1077 - Train Accuracy: 0.6941, Validation Accuracy: 0.6939, Loss: 0.3566
Epoch   1 Batch 1020/1077 - Train Accuracy: 0.6996, Validation Accuracy: 0.6957, Loss: 0.3586
Epoch   1 Batch 1030/1077 - Train Accuracy: 0.6945, Validation Accuracy: 0.7148, Loss: 0.3469
Epoch   1 Batch 1040/1077 - Train Accuracy: 0.7085, Validation Accuracy: 0.7074, Loss: 0.3604
Epoch   1 Batch 1050/1077 - Train Accuracy: 0.6598, Validation Accuracy: 0.6918, Loss: 0.3641
Epoch   1 Batch 1060/1077 - Train Accuracy: 0.7125, Validation Accuracy: 0.6783, Loss: 0.3313
Epoch   1 Batch 1070/1077 - Train Accuracy: 0.6520, Validation Accuracy: 0.6857, Loss: 0.3655
Epoch   2 Batch   10/1077 - Train Accuracy: 0.6822, Validation Accuracy: 0.6985, Loss: 0.3795
Epoch   2 Batch   20/1077 - Train Accuracy: 0.7078, Validation Accuracy: 0.6960, Loss: 0.3579
Epoch   2 Batch   30/1077 - Train Accuracy: 0.7023, Validation Accuracy: 0.7028, Loss: 0.3579
Epoch   2 Batch   40/1077 - Train Accuracy: 0.7117, Validation Accuracy: 0.6974, Loss: 0.3461
Epoch   2 Batch   50/1077 - Train Accuracy: 0.6961, Validation Accuracy: 0.7209, Loss: 0.3368
Epoch   2 Batch   60/1077 - Train Accuracy: 0.7128, Validation Accuracy: 0.7287, Loss: 0.3259
Epoch   2 Batch   70/1077 - Train Accuracy: 0.6974, Validation Accuracy: 0.7028, Loss: 0.3449
Epoch   2 Batch   80/1077 - Train Accuracy: 0.7191, Validation Accuracy: 0.7053, Loss: 0.3517
Epoch   2 Batch   90/1077 - Train Accuracy: 0.6957, Validation Accuracy: 0.7209, Loss: 0.3407
Epoch   2 Batch  100/1077 - Train Accuracy: 0.6957, Validation Accuracy: 0.7127, Loss: 0.3327
Epoch   2 Batch  110/1077 - Train Accuracy: 0.7445, Validation Accuracy: 0.7124, Loss: 0.2962
Epoch   2 Batch  120/1077 - Train Accuracy: 0.7137, Validation Accuracy: 0.7013, Loss: 0.3504
Epoch   2 Batch  130/1077 - Train Accuracy: 0.6979, Validation Accuracy: 0.6918, Loss: 0.3146
Epoch   2 Batch  140/1077 - Train Accuracy: 0.7138, Validation Accuracy: 0.7305, Loss: 0.3411
Epoch   2 Batch  150/1077 - Train Accuracy: 0.7362, Validation Accuracy: 0.6989, Loss: 0.3153
Epoch   2 Batch  160/1077 - Train Accuracy: 0.7340, Validation Accuracy: 0.7319, Loss: 0.3355
Epoch   2 Batch  170/1077 - Train Accuracy: 0.6937, Validation Accuracy: 0.7159, Loss: 0.3395
Epoch   2 Batch  180/1077 - Train Accuracy: 0.7246, Validation Accuracy: 0.7124, Loss: 0.3288
Epoch   2 Batch  190/1077 - Train Accuracy: 0.7551, Validation Accuracy: 0.6957, Loss: 0.3127
Epoch   2 Batch  200/1077 - Train Accuracy: 0.7039, Validation Accuracy: 0.7177, Loss: 0.3327
Epoch   2 Batch  210/1077 - Train Accuracy: 0.7344, Validation Accuracy: 0.7134, Loss: 0.3005
Epoch   2 Batch  220/1077 - Train Accuracy: 0.7344, Validation Accuracy: 0.7134, Loss: 0.3342
Epoch   2 Batch  230/1077 - Train Accuracy: 0.7407, Validation Accuracy: 0.7209, Loss: 0.3086
Epoch   2 Batch  240/1077 - Train Accuracy: 0.7625, Validation Accuracy: 0.7330, Loss: 0.2990
Epoch   2 Batch  250/1077 - Train Accuracy: 0.6992, Validation Accuracy: 0.7234, Loss: 0.2940
Epoch   2 Batch  260/1077 - Train Accuracy: 0.7031, Validation Accuracy: 0.7315, Loss: 0.2969
Epoch   2 Batch  270/1077 - Train Accuracy: 0.7332, Validation Accuracy: 0.7088, Loss: 0.3303
Epoch   2 Batch  280/1077 - Train Accuracy: 0.7090, Validation Accuracy: 0.7315, Loss: 0.3180
Epoch   2 Batch  290/1077 - Train Accuracy: 0.7070, Validation Accuracy: 0.7383, Loss: 0.3322
Epoch   2 Batch  300/1077 - Train Accuracy: 0.6970, Validation Accuracy: 0.7131, Loss: 0.3024
Epoch   2 Batch  310/1077 - Train Accuracy: 0.6977, Validation Accuracy: 0.7376, Loss: 0.3282
Epoch   2 Batch  320/1077 - Train Accuracy: 0.7266, Validation Accuracy: 0.7074, Loss: 0.3122
Epoch   2 Batch  330/1077 - Train Accuracy: 0.7703, Validation Accuracy: 0.7227, Loss: 0.3076
Epoch   2 Batch  340/1077 - Train Accuracy: 0.6912, Validation Accuracy: 0.7230, Loss: 0.3336
Epoch   2 Batch  350/1077 - Train Accuracy: 0.7070, Validation Accuracy: 0.6886, Loss: 0.3050
Epoch   2 Batch  360/1077 - Train Accuracy: 0.7145, Validation Accuracy: 0.7219, Loss: 0.3063
Epoch   2 Batch  370/1077 - Train Accuracy: 0.6897, Validation Accuracy: 0.7177, Loss: 0.3195
Epoch   2 Batch  380/1077 - Train Accuracy: 0.7375, Validation Accuracy: 0.7262, Loss: 0.2922
Epoch   2 Batch  390/1077 - Train Accuracy: 0.6621, Validation Accuracy: 0.7202, Loss: 0.3490
Epoch   2 Batch  400/1077 - Train Accuracy: 0.7508, Validation Accuracy: 0.7031, Loss: 0.3101
Epoch   2 Batch  410/1077 - Train Accuracy: 0.6423, Validation Accuracy: 0.6509, Loss: 0.5334
Epoch   2 Batch  420/1077 - Train Accuracy: 0.6996, Validation Accuracy: 0.6836, Loss: 0.4239
Epoch   2 Batch  430/1077 - Train Accuracy: 0.6293, Validation Accuracy: 0.6619, Loss: 0.3943
Epoch   2 Batch  440/1077 - Train Accuracy: 0.6727, Validation Accuracy: 0.7163, Loss: 0.4204
Epoch   2 Batch  450/1077 - Train Accuracy: 0.6777, Validation Accuracy: 0.6847, Loss: 0.3323
Epoch   2 Batch  460/1077 - Train Accuracy: 0.7398, Validation Accuracy: 0.7216, Loss: 0.3420
Epoch   2 Batch  470/1077 - Train Accuracy: 0.6965, Validation Accuracy: 0.7141, Loss: 0.3421
Epoch   2 Batch  480/1077 - Train Accuracy: 0.7508, Validation Accuracy: 0.7085, Loss: 0.3331
Epoch   2 Batch  490/1077 - Train Accuracy: 0.7059, Validation Accuracy: 0.7035, Loss: 0.3536
Epoch   2 Batch  500/1077 - Train Accuracy: 0.7324, Validation Accuracy: 0.7209, Loss: 0.3207
Epoch   2 Batch  510/1077 - Train Accuracy: 0.7156, Validation Accuracy: 0.7134, Loss: 0.3152
Epoch   2 Batch  520/1077 - Train Accuracy: 0.7917, Validation Accuracy: 0.7085, Loss: 0.2900
Epoch   2 Batch  530/1077 - Train Accuracy: 0.7203, Validation Accuracy: 0.7138, Loss: 0.3363
Epoch   2 Batch  540/1077 - Train Accuracy: 0.7172, Validation Accuracy: 0.7152, Loss: 0.2876
Epoch   2 Batch  550/1077 - Train Accuracy: 0.7148, Validation Accuracy: 0.7276, Loss: 0.3322
Epoch   2 Batch  560/1077 - Train Accuracy: 0.7500, Validation Accuracy: 0.7362, Loss: 0.2891
Epoch   2 Batch  570/1077 - Train Accuracy: 0.7220, Validation Accuracy: 0.7049, Loss: 0.3148
Epoch   2 Batch  580/1077 - Train Accuracy: 0.7917, Validation Accuracy: 0.7351, Loss: 0.2815
Epoch   2 Batch  590/1077 - Train Accuracy: 0.6645, Validation Accuracy: 0.6928, Loss: 0.3205
Epoch   2 Batch  600/1077 - Train Accuracy: 0.7701, Validation Accuracy: 0.7305, Loss: 0.2695
Epoch   2 Batch  610/1077 - Train Accuracy: 0.7418, Validation Accuracy: 0.7330, Loss: 0.3023
Epoch   2 Batch  620/1077 - Train Accuracy: 0.7266, Validation Accuracy: 0.7333, Loss: 0.2855
Epoch   2 Batch  630/1077 - Train Accuracy: 0.7547, Validation Accuracy: 0.7393, Loss: 0.2923
Epoch   2 Batch  640/1077 - Train Accuracy: 0.7180, Validation Accuracy: 0.7209, Loss: 0.2829
Epoch   2 Batch  650/1077 - Train Accuracy: 0.7086, Validation Accuracy: 0.7234, Loss: 0.3009
Epoch   2 Batch  660/1077 - Train Accuracy: 0.7398, Validation Accuracy: 0.7500, Loss: 0.3032
Epoch   2 Batch  670/1077 - Train Accuracy: 0.7372, Validation Accuracy: 0.7340, Loss: 0.2734
Epoch   2 Batch  680/1077 - Train Accuracy: 0.7351, Validation Accuracy: 0.7216, Loss: 0.2948
Epoch   2 Batch  690/1077 - Train Accuracy: 0.7512, Validation Accuracy: 0.7283, Loss: 0.2929
Epoch   2 Batch  700/1077 - Train Accuracy: 0.6977, Validation Accuracy: 0.7074, Loss: 0.2895
Epoch   2 Batch  710/1077 - Train Accuracy: 0.6766, Validation Accuracy: 0.7521, Loss: 0.3121
Epoch   2 Batch  720/1077 - Train Accuracy: 0.7340, Validation Accuracy: 0.7425, Loss: 0.2997
Epoch   2 Batch  730/1077 - Train Accuracy: 0.7004, Validation Accuracy: 0.7219, Loss: 0.3091
Epoch   2 Batch  740/1077 - Train Accuracy: 0.7684, Validation Accuracy: 0.7340, Loss: 0.2816
Epoch   2 Batch  750/1077 - Train Accuracy: 0.7121, Validation Accuracy: 0.7667, Loss: 0.2986
Epoch   2 Batch  760/1077 - Train Accuracy: 0.7594, Validation Accuracy: 0.7411, Loss: 0.2801
Epoch   2 Batch  770/1077 - Train Accuracy: 0.7485, Validation Accuracy: 0.7404, Loss: 0.2621
Epoch   2 Batch  780/1077 - Train Accuracy: 0.7359, Validation Accuracy: 0.7266, Loss: 0.2934
Epoch   2 Batch  790/1077 - Train Accuracy: 0.6594, Validation Accuracy: 0.7266, Loss: 0.3098
Epoch   2 Batch  800/1077 - Train Accuracy: 0.7172, Validation Accuracy: 0.7386, Loss: 0.2830
Epoch   2 Batch  810/1077 - Train Accuracy: 0.7489, Validation Accuracy: 0.7418, Loss: 0.2649
Epoch   2 Batch  820/1077 - Train Accuracy: 0.7246, Validation Accuracy: 0.7415, Loss: 0.3236
Epoch   2 Batch  830/1077 - Train Accuracy: 0.7070, Validation Accuracy: 0.7543, Loss: 0.3077
Epoch   2 Batch  840/1077 - Train Accuracy: 0.7277, Validation Accuracy: 0.7362, Loss: 0.2643
Epoch   2 Batch  850/1077 - Train Accuracy: 0.7225, Validation Accuracy: 0.7624, Loss: 0.3188
Epoch   2 Batch  860/1077 - Train Accuracy: 0.7254, Validation Accuracy: 0.7266, Loss: 0.2781
Epoch   2 Batch  870/1077 - Train Accuracy: 0.7270, Validation Accuracy: 0.7433, Loss: 0.3055
Epoch   2 Batch  880/1077 - Train Accuracy: 0.7773, Validation Accuracy: 0.7710, Loss: 0.2758
Epoch   2 Batch  890/1077 - Train Accuracy: 0.7753, Validation Accuracy: 0.7479, Loss: 0.2535
Epoch   2 Batch  900/1077 - Train Accuracy: 0.7660, Validation Accuracy: 0.7436, Loss: 0.2737
Epoch   2 Batch  910/1077 - Train Accuracy: 0.7243, Validation Accuracy: 0.7443, Loss: 0.2858
Epoch   2 Batch  920/1077 - Train Accuracy: 0.7285, Validation Accuracy: 0.7688, Loss: 0.2704
Epoch   2 Batch  930/1077 - Train Accuracy: 0.7125, Validation Accuracy: 0.7450, Loss: 0.2658
Epoch   2 Batch  940/1077 - Train Accuracy: 0.7742, Validation Accuracy: 0.7475, Loss: 0.2738
Epoch   2 Batch  950/1077 - Train Accuracy: 0.7251, Validation Accuracy: 0.7330, Loss: 0.2539
Epoch   2 Batch  960/1077 - Train Accuracy: 0.7563, Validation Accuracy: 0.7504, Loss: 0.2621
Epoch   2 Batch  970/1077 - Train Accuracy: 0.7535, Validation Accuracy: 0.7745, Loss: 0.2816
Epoch   2 Batch  980/1077 - Train Accuracy: 0.7367, Validation Accuracy: 0.7507, Loss: 0.3060
Epoch   2 Batch  990/1077 - Train Accuracy: 0.6768, Validation Accuracy: 0.7500, Loss: 0.3163
Epoch   2 Batch 1000/1077 - Train Accuracy: 0.7507, Validation Accuracy: 0.7624, Loss: 0.2586
Epoch   2 Batch 1010/1077 - Train Accuracy: 0.7582, Validation Accuracy: 0.7624, Loss: 0.2744
Epoch   2 Batch 1020/1077 - Train Accuracy: 0.7750, Validation Accuracy: 0.7450, Loss: 0.2638
Epoch   2 Batch 1030/1077 - Train Accuracy: 0.7238, Validation Accuracy: 0.7269, Loss: 0.2737
Epoch   2 Batch 1040/1077 - Train Accuracy: 0.7636, Validation Accuracy: 0.7731, Loss: 0.2593
Epoch   2 Batch 1050/1077 - Train Accuracy: 0.7652, Validation Accuracy: 0.7695, Loss: 0.2666
Epoch   2 Batch 1060/1077 - Train Accuracy: 0.7973, Validation Accuracy: 0.7901, Loss: 0.2379
Epoch   2 Batch 1070/1077 - Train Accuracy: 0.7438, Validation Accuracy: 0.7773, Loss: 0.2781
Epoch   3 Batch   10/1077 - Train Accuracy: 0.7681, Validation Accuracy: 0.7962, Loss: 0.2785
Epoch   3 Batch   20/1077 - Train Accuracy: 0.7430, Validation Accuracy: 0.7965, Loss: 0.2802
Epoch   3 Batch   30/1077 - Train Accuracy: 0.7734, Validation Accuracy: 0.7866, Loss: 0.2828
Epoch   3 Batch   40/1077 - Train Accuracy: 0.7875, Validation Accuracy: 0.7919, Loss: 0.2567
Epoch   3 Batch   50/1077 - Train Accuracy: 0.7430, Validation Accuracy: 0.7741, Loss: 0.2705
Epoch   3 Batch   60/1077 - Train Accuracy: 0.7548, Validation Accuracy: 0.7869, Loss: 0.2727
Epoch   3 Batch   70/1077 - Train Accuracy: 0.7512, Validation Accuracy: 0.7674, Loss: 0.2744
Epoch   3 Batch   80/1077 - Train Accuracy: 0.7766, Validation Accuracy: 0.7749, Loss: 0.2682
Epoch   3 Batch   90/1077 - Train Accuracy: 0.7707, Validation Accuracy: 0.7724, Loss: 0.2554
Epoch   3 Batch  100/1077 - Train Accuracy: 0.7535, Validation Accuracy: 0.7912, Loss: 0.2550
Epoch   3 Batch  110/1077 - Train Accuracy: 0.7898, Validation Accuracy: 0.7450, Loss: 0.2363
Epoch   3 Batch  120/1077 - Train Accuracy: 0.7863, Validation Accuracy: 0.7663, Loss: 0.2728
Epoch   3 Batch  130/1077 - Train Accuracy: 0.7548, Validation Accuracy: 0.7532, Loss: 0.2573
Epoch   3 Batch  140/1077 - Train Accuracy: 0.7578, Validation Accuracy: 0.7433, Loss: 0.2725
Epoch   3 Batch  150/1077 - Train Accuracy: 0.7898, Validation Accuracy: 0.7610, Loss: 0.2581
Epoch   3 Batch  160/1077 - Train Accuracy: 0.7844, Validation Accuracy: 0.7802, Loss: 0.2777
Epoch   3 Batch  170/1077 - Train Accuracy: 0.7352, Validation Accuracy: 0.7550, Loss: 0.2708
Epoch   3 Batch  180/1077 - Train Accuracy: 0.7184, Validation Accuracy: 0.7571, Loss: 0.2676
Epoch   3 Batch  190/1077 - Train Accuracy: 0.7754, Validation Accuracy: 0.7766, Loss: 0.2472
Epoch   3 Batch  200/1077 - Train Accuracy: 0.7465, Validation Accuracy: 0.7823, Loss: 0.2658
Epoch   3 Batch  210/1077 - Train Accuracy: 0.7481, Validation Accuracy: 0.7209, Loss: 0.2495
Epoch   3 Batch  220/1077 - Train Accuracy: 0.7932, Validation Accuracy: 0.7695, Loss: 0.2486
Epoch   3 Batch  230/1077 - Train Accuracy: 0.7801, Validation Accuracy: 0.7642, Loss: 0.2366
Epoch   3 Batch  240/1077 - Train Accuracy: 0.8363, Validation Accuracy: 0.7809, Loss: 0.2338
Epoch   3 Batch  250/1077 - Train Accuracy: 0.7610, Validation Accuracy: 0.7930, Loss: 0.2410
Epoch   3 Batch  260/1077 - Train Accuracy: 0.7980, Validation Accuracy: 0.7852, Loss: 0.2322
Epoch   3 Batch  270/1077 - Train Accuracy: 0.7723, Validation Accuracy: 0.7887, Loss: 0.2629
Epoch   3 Batch  280/1077 - Train Accuracy: 0.7375, Validation Accuracy: 0.8004, Loss: 0.2419
Epoch   3 Batch  290/1077 - Train Accuracy: 0.8016, Validation Accuracy: 0.8050, Loss: 0.2668
Epoch   3 Batch  300/1077 - Train Accuracy: 0.7800, Validation Accuracy: 0.7603, Loss: 0.2666
Epoch   3 Batch  310/1077 - Train Accuracy: 0.7457, Validation Accuracy: 0.7741, Loss: 0.2858
Epoch   3 Batch  320/1077 - Train Accuracy: 0.8020, Validation Accuracy: 0.7678, Loss: 0.2433
Epoch   3 Batch  330/1077 - Train Accuracy: 0.7895, Validation Accuracy: 0.7536, Loss: 0.2308
Epoch   3 Batch  340/1077 - Train Accuracy: 0.7969, Validation Accuracy: 0.7869, Loss: 0.2490
Epoch   3 Batch  350/1077 - Train Accuracy: 0.8047, Validation Accuracy: 0.7940, Loss: 0.2312
Epoch   3 Batch  360/1077 - Train Accuracy: 0.7992, Validation Accuracy: 0.7741, Loss: 0.2373
Epoch   3 Batch  370/1077 - Train Accuracy: 0.8025, Validation Accuracy: 0.7773, Loss: 0.2994
Epoch   3 Batch  380/1077 - Train Accuracy: 0.7070, Validation Accuracy: 0.6634, Loss: 0.3461
Epoch   3 Batch  390/1077 - Train Accuracy: 0.7199, Validation Accuracy: 0.7376, Loss: 0.3839
Epoch   3 Batch  400/1077 - Train Accuracy: 0.7820, Validation Accuracy: 0.7475, Loss: 0.3244
Epoch   3 Batch  410/1077 - Train Accuracy: 0.7767, Validation Accuracy: 0.7607, Loss: 0.3113
Epoch   3 Batch  420/1077 - Train Accuracy: 0.8227, Validation Accuracy: 0.7663, Loss: 0.2547
Epoch   3 Batch  430/1077 - Train Accuracy: 0.7531, Validation Accuracy: 0.7699, Loss: 0.2697
Epoch   3 Batch  440/1077 - Train Accuracy: 0.7453, Validation Accuracy: 0.7511, Loss: 0.2923
Epoch   3 Batch  450/1077 - Train Accuracy: 0.7863, Validation Accuracy: 0.7997, Loss: 0.2559
Epoch   3 Batch  460/1077 - Train Accuracy: 0.8109, Validation Accuracy: 0.7777, Loss: 0.2592
Epoch   3 Batch  470/1077 - Train Accuracy: 0.8088, Validation Accuracy: 0.7873, Loss: 0.2581
Epoch   3 Batch  480/1077 - Train Accuracy: 0.8285, Validation Accuracy: 0.8143, Loss: 0.2451
Epoch   3 Batch  490/1077 - Train Accuracy: 0.7688, Validation Accuracy: 0.7884, Loss: 0.2725
Epoch   3 Batch  500/1077 - Train Accuracy: 0.8004, Validation Accuracy: 0.7905, Loss: 0.2546
Epoch   3 Batch  510/1077 - Train Accuracy: 0.7980, Validation Accuracy: 0.8011, Loss: 0.2398
Epoch   3 Batch  520/1077 - Train Accuracy: 0.8638, Validation Accuracy: 0.8036, Loss: 0.2263
Epoch   3 Batch  530/1077 - Train Accuracy: 0.7496, Validation Accuracy: 0.7798, Loss: 0.2747
Epoch   3 Batch  540/1077 - Train Accuracy: 0.8230, Validation Accuracy: 0.7830, Loss: 0.2152
Epoch   3 Batch  550/1077 - Train Accuracy: 0.7246, Validation Accuracy: 0.8104, Loss: 0.2917
Epoch   3 Batch  560/1077 - Train Accuracy: 0.7684, Validation Accuracy: 0.8022, Loss: 0.2317
Epoch   3 Batch  570/1077 - Train Accuracy: 0.7895, Validation Accuracy: 0.8033, Loss: 0.2644
Epoch   3 Batch  580/1077 - Train Accuracy: 0.8337, Validation Accuracy: 0.7905, Loss: 0.2242
Epoch   3 Batch  590/1077 - Train Accuracy: 0.8047, Validation Accuracy: 0.7898, Loss: 0.2473
Epoch   3 Batch  600/1077 - Train Accuracy: 0.8285, Validation Accuracy: 0.8317, Loss: 0.2216
Epoch   3 Batch  610/1077 - Train Accuracy: 0.8030, Validation Accuracy: 0.8178, Loss: 0.2423
Epoch   3 Batch  620/1077 - Train Accuracy: 0.8203, Validation Accuracy: 0.8203, Loss: 0.2169
Epoch   3 Batch  630/1077 - Train Accuracy: 0.7691, Validation Accuracy: 0.8043, Loss: 0.2326
Epoch   3 Batch  640/1077 - Train Accuracy: 0.7857, Validation Accuracy: 0.8072, Loss: 0.2348
Epoch   3 Batch  650/1077 - Train Accuracy: 0.7941, Validation Accuracy: 0.8029, Loss: 0.2561
Epoch   3 Batch  660/1077 - Train Accuracy: 0.8035, Validation Accuracy: 0.7745, Loss: 0.2425
Epoch   3 Batch  670/1077 - Train Accuracy: 0.8161, Validation Accuracy: 0.7844, Loss: 0.2262
Epoch   3 Batch  680/1077 - Train Accuracy: 0.7924, Validation Accuracy: 0.7908, Loss: 0.2250
Epoch   3 Batch  690/1077 - Train Accuracy: 0.8344, Validation Accuracy: 0.8182, Loss: 0.2380
Epoch   3 Batch  700/1077 - Train Accuracy: 0.7758, Validation Accuracy: 0.8011, Loss: 0.2446
Epoch   3 Batch  710/1077 - Train Accuracy: 0.8004, Validation Accuracy: 0.8004, Loss: 0.2524
Epoch   3 Batch  720/1077 - Train Accuracy: 0.8006, Validation Accuracy: 0.8267, Loss: 0.2424
Epoch   3 Batch  730/1077 - Train Accuracy: 0.8227, Validation Accuracy: 0.8050, Loss: 0.2447
Epoch   3 Batch  740/1077 - Train Accuracy: 0.8395, Validation Accuracy: 0.8139, Loss: 0.2151
Epoch   3 Batch  750/1077 - Train Accuracy: 0.8145, Validation Accuracy: 0.7759, Loss: 0.2399
Epoch   3 Batch  760/1077 - Train Accuracy: 0.8578, Validation Accuracy: 0.8217, Loss: 0.2128
Epoch   3 Batch  770/1077 - Train Accuracy: 0.8281, Validation Accuracy: 0.8082, Loss: 0.2140
Epoch   3 Batch  780/1077 - Train Accuracy: 0.7926, Validation Accuracy: 0.8214, Loss: 0.2304
Epoch   3 Batch  790/1077 - Train Accuracy: 0.7812, Validation Accuracy: 0.8271, Loss: 0.2642
Epoch   3 Batch  800/1077 - Train Accuracy: 0.7961, Validation Accuracy: 0.8409, Loss: 0.2233
Epoch   3 Batch  810/1077 - Train Accuracy: 0.8278, Validation Accuracy: 0.8469, Loss: 0.2126
Epoch   3 Batch  820/1077 - Train Accuracy: 0.7555, Validation Accuracy: 0.8466, Loss: 0.2559
Epoch   3 Batch  830/1077 - Train Accuracy: 0.7824, Validation Accuracy: 0.8125, Loss: 0.2483
Epoch   3 Batch  840/1077 - Train Accuracy: 0.8246, Validation Accuracy: 0.8164, Loss: 0.2186
Epoch   3 Batch  850/1077 - Train Accuracy: 0.8013, Validation Accuracy: 0.8512, Loss: 0.2658
Epoch   3 Batch  860/1077 - Train Accuracy: 0.8051, Validation Accuracy: 0.8466, Loss: 0.2328
Epoch   3 Batch  870/1077 - Train Accuracy: 0.7714, Validation Accuracy: 0.8423, Loss: 0.2400
Epoch   3 Batch  880/1077 - Train Accuracy: 0.8520, Validation Accuracy: 0.8363, Loss: 0.2240
Epoch   3 Batch  890/1077 - Train Accuracy: 0.8512, Validation Accuracy: 0.8260, Loss: 0.2093
Epoch   3 Batch  900/1077 - Train Accuracy: 0.8348, Validation Accuracy: 0.8107, Loss: 0.2151
Epoch   3 Batch  910/1077 - Train Accuracy: 0.8192, Validation Accuracy: 0.8335, Loss: 0.2237
Epoch   3 Batch  920/1077 - Train Accuracy: 0.8055, Validation Accuracy: 0.8150, Loss: 0.2220
Epoch   3 Batch  930/1077 - Train Accuracy: 0.8262, Validation Accuracy: 0.8100, Loss: 0.2126
Epoch   3 Batch  940/1077 - Train Accuracy: 0.8406, Validation Accuracy: 0.8317, Loss: 0.2067
Epoch   3 Batch  950/1077 - Train Accuracy: 0.8121, Validation Accuracy: 0.8129, Loss: 0.1968
Epoch   3 Batch  960/1077 - Train Accuracy: 0.8196, Validation Accuracy: 0.8104, Loss: 0.2046
Epoch   3 Batch  970/1077 - Train Accuracy: 0.8520, Validation Accuracy: 0.7915, Loss: 0.2197
Epoch   3 Batch  980/1077 - Train Accuracy: 0.8152, Validation Accuracy: 0.8239, Loss: 0.2059
Epoch   3 Batch  990/1077 - Train Accuracy: 0.8088, Validation Accuracy: 0.8473, Loss: 0.2375
Epoch   3 Batch 1000/1077 - Train Accuracy: 0.8259, Validation Accuracy: 0.8338, Loss: 0.1895
Epoch   3 Batch 1010/1077 - Train Accuracy: 0.8629, Validation Accuracy: 0.8278, Loss: 0.2032
Epoch   3 Batch 1020/1077 - Train Accuracy: 0.8465, Validation Accuracy: 0.8480, Loss: 0.2240
Epoch   3 Batch 1030/1077 - Train Accuracy: 0.8344, Validation Accuracy: 0.8072, Loss: 0.2111
Epoch   3 Batch 1040/1077 - Train Accuracy: 0.8104, Validation Accuracy: 0.8224, Loss: 0.2143
Epoch   3 Batch 1050/1077 - Train Accuracy: 0.8359, Validation Accuracy: 0.8207, Loss: 0.2262
Epoch   3 Batch 1060/1077 - Train Accuracy: 0.8754, Validation Accuracy: 0.8097, Loss: 0.1911
Epoch   3 Batch 1070/1077 - Train Accuracy: 0.8152, Validation Accuracy: 0.8079, Loss: 0.2346
Epoch   4 Batch   10/1077 - Train Accuracy: 0.8462, Validation Accuracy: 0.8303, Loss: 0.2149
Epoch   4 Batch   20/1077 - Train Accuracy: 0.8195, Validation Accuracy: 0.8189, Loss: 0.2223
Epoch   4 Batch   30/1077 - Train Accuracy: 0.8344, Validation Accuracy: 0.8455, Loss: 0.2258
Epoch   4 Batch   40/1077 - Train Accuracy: 0.8430, Validation Accuracy: 0.8278, Loss: 0.2127
Epoch   4 Batch   50/1077 - Train Accuracy: 0.8242, Validation Accuracy: 0.8551, Loss: 0.2113
Epoch   4 Batch   60/1077 - Train Accuracy: 0.8482, Validation Accuracy: 0.7923, Loss: 0.2045
Epoch   4 Batch   70/1077 - Train Accuracy: 0.8318, Validation Accuracy: 0.8349, Loss: 0.2191
Epoch   4 Batch   80/1077 - Train Accuracy: 0.8449, Validation Accuracy: 0.8377, Loss: 0.2074
Epoch   4 Batch   90/1077 - Train Accuracy: 0.8324, Validation Accuracy: 0.8299, Loss: 0.1994
Epoch   4 Batch  100/1077 - Train Accuracy: 0.8430, Validation Accuracy: 0.8253, Loss: 0.1911
Epoch   4 Batch  110/1077 - Train Accuracy: 0.8586, Validation Accuracy: 0.8320, Loss: 0.1711
Epoch   4 Batch  120/1077 - Train Accuracy: 0.8434, Validation Accuracy: 0.8452, Loss: 0.2215
Epoch   4 Batch  130/1077 - Train Accuracy: 0.8326, Validation Accuracy: 0.8388, Loss: 0.2036
Epoch   4 Batch  140/1077 - Train Accuracy: 0.8627, Validation Accuracy: 0.8129, Loss: 0.2065
Epoch   4 Batch  150/1077 - Train Accuracy: 0.8534, Validation Accuracy: 0.8427, Loss: 0.1878
Epoch   4 Batch  160/1077 - Train Accuracy: 0.8348, Validation Accuracy: 0.8324, Loss: 0.2017
Epoch   4 Batch  170/1077 - Train Accuracy: 0.8371, Validation Accuracy: 0.8331, Loss: 0.2092
Epoch   4 Batch  180/1077 - Train Accuracy: 0.8328, Validation Accuracy: 0.8377, Loss: 0.2119
Epoch   4 Batch  190/1077 - Train Accuracy: 0.8508, Validation Accuracy: 0.8370, Loss: 0.1706
Epoch   4 Batch  200/1077 - Train Accuracy: 0.8371, Validation Accuracy: 0.8445, Loss: 0.2112
Epoch   4 Batch  210/1077 - Train Accuracy: 0.8538, Validation Accuracy: 0.8288, Loss: 0.1899
Epoch   4 Batch  220/1077 - Train Accuracy: 0.8384, Validation Accuracy: 0.8445, Loss: 0.1851
Epoch   4 Batch  230/1077 - Train Accuracy: 0.8155, Validation Accuracy: 0.8526, Loss: 0.1908
Epoch   4 Batch  240/1077 - Train Accuracy: 0.8758, Validation Accuracy: 0.8544, Loss: 0.1851
Epoch   4 Batch  250/1077 - Train Accuracy: 0.8427, Validation Accuracy: 0.8533, Loss: 0.1885
Epoch   4 Batch  260/1077 - Train Accuracy: 0.8609, Validation Accuracy: 0.8306, Loss: 0.2030
Epoch   4 Batch  270/1077 - Train Accuracy: 0.7758, Validation Accuracy: 0.8065, Loss: 0.2666
Epoch   4 Batch  280/1077 - Train Accuracy: 0.7836, Validation Accuracy: 0.8338, Loss: 0.2353
Epoch   4 Batch  290/1077 - Train Accuracy: 0.8340, Validation Accuracy: 0.8477, Loss: 0.2077
Epoch   4 Batch  300/1077 - Train Accuracy: 0.8442, Validation Accuracy: 0.8189, Loss: 0.1891
Epoch   4 Batch  310/1077 - Train Accuracy: 0.8367, Validation Accuracy: 0.8533, Loss: 0.2117
Epoch   4 Batch  320/1077 - Train Accuracy: 0.8527, Validation Accuracy: 0.8604, Loss: 0.1904
Epoch   4 Batch  330/1077 - Train Accuracy: 0.8570, Validation Accuracy: 0.8423, Loss: 0.1738
Epoch   4 Batch  340/1077 - Train Accuracy: 0.8643, Validation Accuracy: 0.8274, Loss: 0.1914
Epoch   4 Batch  350/1077 - Train Accuracy: 0.8535, Validation Accuracy: 0.8537, Loss: 0.1795
Epoch   4 Batch  360/1077 - Train Accuracy: 0.8762, Validation Accuracy: 0.8604, Loss: 0.1689
Epoch   4 Batch  370/1077 - Train Accuracy: 0.8754, Validation Accuracy: 0.8565, Loss: 0.1942
Epoch   4 Batch  380/1077 - Train Accuracy: 0.8664, Validation Accuracy: 0.8647, Loss: 0.1769
Epoch   4 Batch  390/1077 - Train Accuracy: 0.8043, Validation Accuracy: 0.8693, Loss: 0.2266
Epoch   4 Batch  400/1077 - Train Accuracy: 0.8535, Validation Accuracy: 0.8462, Loss: 0.1905
Epoch   4 Batch  410/1077 - Train Accuracy: 0.8298, Validation Accuracy: 0.8452, Loss: 0.2050
Epoch   4 Batch  420/1077 - Train Accuracy: 0.8680, Validation Accuracy: 0.8764, Loss: 0.1740
Epoch   4 Batch  430/1077 - Train Accuracy: 0.8434, Validation Accuracy: 0.8473, Loss: 0.1967
Epoch   4 Batch  440/1077 - Train Accuracy: 0.8262, Validation Accuracy: 0.8569, Loss: 0.2061
Epoch   4 Batch  450/1077 - Train Accuracy: 0.8559, Validation Accuracy: 0.8690, Loss: 0.1718
Epoch   4 Batch  460/1077 - Train Accuracy: 0.8707, Validation Accuracy: 0.8544, Loss: 0.1858
Epoch   4 Batch  470/1077 - Train Accuracy: 0.8581, Validation Accuracy: 0.8594, Loss: 0.1828
Epoch   4 Batch  480/1077 - Train Accuracy: 0.8787, Validation Accuracy: 0.8768, Loss: 0.1750
Epoch   4 Batch  490/1077 - Train Accuracy: 0.8590, Validation Accuracy: 0.8654, Loss: 0.1961
Epoch   4 Batch  500/1077 - Train Accuracy: 0.8609, Validation Accuracy: 0.8782, Loss: 0.1751
Epoch   4 Batch  510/1077 - Train Accuracy: 0.8434, Validation Accuracy: 0.8761, Loss: 0.1762
Epoch   4 Batch  520/1077 - Train Accuracy: 0.9007, Validation Accuracy: 0.8647, Loss: 0.1528
Epoch   4 Batch  530/1077 - Train Accuracy: 0.8313, Validation Accuracy: 0.8594, Loss: 0.2080
Epoch   4 Batch  540/1077 - Train Accuracy: 0.8637, Validation Accuracy: 0.8665, Loss: 0.1562
Epoch   4 Batch  550/1077 - Train Accuracy: 0.7941, Validation Accuracy: 0.8718, Loss: 0.2013
Epoch   4 Batch  560/1077 - Train Accuracy: 0.8602, Validation Accuracy: 0.8455, Loss: 0.1645
Epoch   4 Batch  570/1077 - Train Accuracy: 0.8651, Validation Accuracy: 0.8477, Loss: 0.1953
Epoch   4 Batch  580/1077 - Train Accuracy: 0.8679, Validation Accuracy: 0.8509, Loss: 0.1646
Epoch   4 Batch  590/1077 - Train Accuracy: 0.8470, Validation Accuracy: 0.8643, Loss: 0.1860
Epoch   4 Batch  600/1077 - Train Accuracy: 0.8705, Validation Accuracy: 0.8661, Loss: 0.1611
Epoch   4 Batch  610/1077 - Train Accuracy: 0.8429, Validation Accuracy: 0.8576, Loss: 0.1841
Epoch   4 Batch  620/1077 - Train Accuracy: 0.8895, Validation Accuracy: 0.8778, Loss: 0.1681
Epoch   4 Batch  630/1077 - Train Accuracy: 0.8820, Validation Accuracy: 0.8729, Loss: 0.1723
Epoch   4 Batch  640/1077 - Train Accuracy: 0.8296, Validation Accuracy: 0.8675, Loss: 0.1696
Epoch   4 Batch  650/1077 - Train Accuracy: 0.8664, Validation Accuracy: 0.8679, Loss: 0.1771
Epoch   4 Batch  660/1077 - Train Accuracy: 0.8883, Validation Accuracy: 0.8526, Loss: 0.1721
Epoch   4 Batch  670/1077 - Train Accuracy: 0.8722, Validation Accuracy: 0.8597, Loss: 0.1626
Epoch   4 Batch  680/1077 - Train Accuracy: 0.8646, Validation Accuracy: 0.8739, Loss: 0.1843
Epoch   4 Batch  690/1077 - Train Accuracy: 0.8617, Validation Accuracy: 0.8651, Loss: 0.1709
Epoch   4 Batch  700/1077 - Train Accuracy: 0.8387, Validation Accuracy: 0.8640, Loss: 0.1682
Epoch   4 Batch  710/1077 - Train Accuracy: 0.8418, Validation Accuracy: 0.8793, Loss: 0.1802
Epoch   4 Batch  720/1077 - Train Accuracy: 0.8803, Validation Accuracy: 0.8807, Loss: 0.1919
Epoch   4 Batch  730/1077 - Train Accuracy: 0.8574, Validation Accuracy: 0.8739, Loss: 0.1798
Epoch   4 Batch  740/1077 - Train Accuracy: 0.8551, Validation Accuracy: 0.8597, Loss: 0.1755
Epoch   4 Batch  750/1077 - Train Accuracy: 0.8613, Validation Accuracy: 0.8743, Loss: 0.1803
Epoch   4 Batch  760/1077 - Train Accuracy: 0.8801, Validation Accuracy: 0.8548, Loss: 0.1492
Epoch   4 Batch  770/1077 - Train Accuracy: 0.8668, Validation Accuracy: 0.8732, Loss: 0.1604
Epoch   4 Batch  780/1077 - Train Accuracy: 0.8387, Validation Accuracy: 0.8672, Loss: 0.1778
Epoch   4 Batch  790/1077 - Train Accuracy: 0.8078, Validation Accuracy: 0.8523, Loss: 0.1976
Epoch   4 Batch  800/1077 - Train Accuracy: 0.8758, Validation Accuracy: 0.8839, Loss: 0.1707
Epoch   4 Batch  810/1077 - Train Accuracy: 0.8757, Validation Accuracy: 0.8888, Loss: 0.1600
Epoch   4 Batch  820/1077 - Train Accuracy: 0.8633, Validation Accuracy: 0.9027, Loss: 0.2130
Epoch   4 Batch  830/1077 - Train Accuracy: 0.8355, Validation Accuracy: 0.8817, Loss: 0.1838
Epoch   4 Batch  840/1077 - Train Accuracy: 0.8504, Validation Accuracy: 0.8817, Loss: 0.1670
Epoch   4 Batch  850/1077 - Train Accuracy: 0.8218, Validation Accuracy: 0.8668, Loss: 0.1966
Epoch   4 Batch  860/1077 - Train Accuracy: 0.8549, Validation Accuracy: 0.8736, Loss: 0.1787
Epoch   4 Batch  870/1077 - Train Accuracy: 0.8331, Validation Accuracy: 0.8661, Loss: 0.1792
Epoch   4 Batch  880/1077 - Train Accuracy: 0.9008, Validation Accuracy: 0.8803, Loss: 0.1914
Epoch   4 Batch  890/1077 - Train Accuracy: 0.8754, Validation Accuracy: 0.8764, Loss: 0.1541
Epoch   4 Batch  900/1077 - Train Accuracy: 0.8879, Validation Accuracy: 0.8643, Loss: 0.1663
Epoch   4 Batch  910/1077 - Train Accuracy: 0.8356, Validation Accuracy: 0.8636, Loss: 0.1664
Epoch   4 Batch  920/1077 - Train Accuracy: 0.8582, Validation Accuracy: 0.8817, Loss: 0.1580
Epoch   4 Batch  930/1077 - Train Accuracy: 0.8816, Validation Accuracy: 0.8874, Loss: 0.1612
Epoch   4 Batch  940/1077 - Train Accuracy: 0.8656, Validation Accuracy: 0.8853, Loss: 0.1757
Epoch   4 Batch  950/1077 - Train Accuracy: 0.8620, Validation Accuracy: 0.8690, Loss: 0.1423
Epoch   4 Batch  960/1077 - Train Accuracy: 0.8906, Validation Accuracy: 0.8604, Loss: 0.1487
Epoch   4 Batch  970/1077 - Train Accuracy: 0.8910, Validation Accuracy: 0.8608, Loss: 0.1594
Epoch   4 Batch  980/1077 - Train Accuracy: 0.8500, Validation Accuracy: 0.8839, Loss: 0.1614
Epoch   4 Batch  990/1077 - Train Accuracy: 0.8561, Validation Accuracy: 0.8888, Loss: 0.1729
Epoch   4 Batch 1000/1077 - Train Accuracy: 0.8739, Validation Accuracy: 0.8739, Loss: 0.1484
Epoch   4 Batch 1010/1077 - Train Accuracy: 0.8844, Validation Accuracy: 0.8768, Loss: 0.1547
Epoch   4 Batch 1020/1077 - Train Accuracy: 0.8777, Validation Accuracy: 0.8793, Loss: 0.1643
Epoch   4 Batch 1030/1077 - Train Accuracy: 0.8891, Validation Accuracy: 0.8754, Loss: 0.1680
Epoch   4 Batch 1040/1077 - Train Accuracy: 0.8717, Validation Accuracy: 0.8817, Loss: 0.1655
Epoch   4 Batch 1050/1077 - Train Accuracy: 0.8844, Validation Accuracy: 0.8821, Loss: 0.1655
Epoch   4 Batch 1060/1077 - Train Accuracy: 0.9090, Validation Accuracy: 0.8707, Loss: 0.1345
Epoch   4 Batch 1070/1077 - Train Accuracy: 0.8785, Validation Accuracy: 0.8810, Loss: 0.1715
Epoch   5 Batch   10/1077 - Train Accuracy: 0.8828, Validation Accuracy: 0.8832, Loss: 0.1693
Epoch   5 Batch   20/1077 - Train Accuracy: 0.8855, Validation Accuracy: 0.9222, Loss: 0.1663
Epoch   5 Batch   30/1077 - Train Accuracy: 0.8996, Validation Accuracy: 0.8707, Loss: 0.1630
Epoch   5 Batch   40/1077 - Train Accuracy: 0.8922, Validation Accuracy: 0.9023, Loss: 0.1552
Epoch   5 Batch   50/1077 - Train Accuracy: 0.8750, Validation Accuracy: 0.8874, Loss: 0.1641
Epoch   5 Batch   60/1077 - Train Accuracy: 0.9081, Validation Accuracy: 0.8814, Loss: 0.1533
Epoch   5 Batch   70/1077 - Train Accuracy: 0.8569, Validation Accuracy: 0.9130, Loss: 0.1636
Epoch   5 Batch   80/1077 - Train Accuracy: 0.8719, Validation Accuracy: 0.8871, Loss: 0.1571
Epoch   5 Batch   90/1077 - Train Accuracy: 0.8613, Validation Accuracy: 0.8960, Loss: 0.1601
Epoch   5 Batch  100/1077 - Train Accuracy: 0.8797, Validation Accuracy: 0.8761, Loss: 0.1558
Epoch   5 Batch  110/1077 - Train Accuracy: 0.9004, Validation Accuracy: 0.8892, Loss: 0.1455
Epoch   5 Batch  120/1077 - Train Accuracy: 0.8969, Validation Accuracy: 0.9027, Loss: 0.1497
Epoch   5 Batch  130/1077 - Train Accuracy: 0.8806, Validation Accuracy: 0.8810, Loss: 0.1480
Epoch   5 Batch  140/1077 - Train Accuracy: 0.8956, Validation Accuracy: 0.8892, Loss: 0.1404
Epoch   5 Batch  150/1077 - Train Accuracy: 0.8676, Validation Accuracy: 0.8391, Loss: 0.1900
Epoch   5 Batch  160/1077 - Train Accuracy: 0.8516, Validation Accuracy: 0.8860, Loss: 0.1934
Epoch   5 Batch  170/1077 - Train Accuracy: 0.8691, Validation Accuracy: 0.8803, Loss: 0.1931
Epoch   5 Batch  180/1077 - Train Accuracy: 0.8766, Validation Accuracy: 0.8878, Loss: 0.1700
Epoch   5 Batch  190/1077 - Train Accuracy: 0.9187, Validation Accuracy: 0.8974, Loss: 0.1480
Epoch   5 Batch  200/1077 - Train Accuracy: 0.8957, Validation Accuracy: 0.8960, Loss: 0.1593
Epoch   5 Batch  210/1077 - Train Accuracy: 0.8839, Validation Accuracy: 0.8892, Loss: 0.1463
Epoch   5 Batch  220/1077 - Train Accuracy: 0.8894, Validation Accuracy: 0.8967, Loss: 0.1428
Epoch   5 Batch  230/1077 - Train Accuracy: 0.8478, Validation Accuracy: 0.8846, Loss: 0.1484
Epoch   5 Batch  240/1077 - Train Accuracy: 0.9180, Validation Accuracy: 0.8878, Loss: 0.1417
Epoch   5 Batch  250/1077 - Train Accuracy: 0.8849, Validation Accuracy: 0.8892, Loss: 0.1452
Epoch   5 Batch  260/1077 - Train Accuracy: 0.8917, Validation Accuracy: 0.8867, Loss: 0.1309
Epoch   5 Batch  270/1077 - Train Accuracy: 0.8324, Validation Accuracy: 0.8928, Loss: 0.1610
Epoch   5 Batch  280/1077 - Train Accuracy: 0.8766, Validation Accuracy: 0.8974, Loss: 0.1545
Epoch   5 Batch  290/1077 - Train Accuracy: 0.8711, Validation Accuracy: 0.8690, Loss: 0.1685
Epoch   5 Batch  300/1077 - Train Accuracy: 0.9079, Validation Accuracy: 0.8803, Loss: 0.1324
Epoch   5 Batch  310/1077 - Train Accuracy: 0.8859, Validation Accuracy: 0.8917, Loss: 0.1574
Epoch   5 Batch  320/1077 - Train Accuracy: 0.9105, Validation Accuracy: 0.9016, Loss: 0.1385
Epoch   5 Batch  330/1077 - Train Accuracy: 0.9148, Validation Accuracy: 0.9102, Loss: 0.1325
Epoch   5 Batch  340/1077 - Train Accuracy: 0.9050, Validation Accuracy: 0.8842, Loss: 0.1463
Epoch   5 Batch  350/1077 - Train Accuracy: 0.8812, Validation Accuracy: 0.8899, Loss: 0.1237
Epoch   5 Batch  360/1077 - Train Accuracy: 0.9164, Validation Accuracy: 0.9013, Loss: 0.1438
Epoch   5 Batch  370/1077 - Train Accuracy: 0.9115, Validation Accuracy: 0.8949, Loss: 0.1400
Epoch   5 Batch  380/1077 - Train Accuracy: 0.9090, Validation Accuracy: 0.9038, Loss: 0.1366
Epoch   5 Batch  390/1077 - Train Accuracy: 0.8590, Validation Accuracy: 0.8899, Loss: 0.1636
Epoch   5 Batch  400/1077 - Train Accuracy: 0.8801, Validation Accuracy: 0.9070, Loss: 0.1393
Epoch   5 Batch  410/1077 - Train Accuracy: 0.8405, Validation Accuracy: 0.9013, Loss: 0.1593
Epoch   5 Batch  420/1077 - Train Accuracy: 0.9070, Validation Accuracy: 0.8800, Loss: 0.1408
Epoch   5 Batch  430/1077 - Train Accuracy: 0.8812, Validation Accuracy: 0.9123, Loss: 0.1464
Epoch   5 Batch  440/1077 - Train Accuracy: 0.8746, Validation Accuracy: 0.9045, Loss: 0.1628
Epoch   5 Batch  450/1077 - Train Accuracy: 0.9160, Validation Accuracy: 0.8878, Loss: 0.1395
Epoch   5 Batch  460/1077 - Train Accuracy: 0.8797, Validation Accuracy: 0.8864, Loss: 0.1510
Epoch   5 Batch  470/1077 - Train Accuracy: 0.8960, Validation Accuracy: 0.8910, Loss: 0.1407
Epoch   5 Batch  480/1077 - Train Accuracy: 0.8771, Validation Accuracy: 0.8935, Loss: 0.1227
Epoch   5 Batch  490/1077 - Train Accuracy: 0.8836, Validation Accuracy: 0.8920, Loss: 0.1519
Epoch   5 Batch  500/1077 - Train Accuracy: 0.8938, Validation Accuracy: 0.8718, Loss: 0.1244
Epoch   5 Batch  510/1077 - Train Accuracy: 0.8633, Validation Accuracy: 0.8757, Loss: 0.1372
Epoch   5 Batch  520/1077 - Train Accuracy: 0.9286, Validation Accuracy: 0.8832, Loss: 0.1220
Epoch   5 Batch  530/1077 - Train Accuracy: 0.8520, Validation Accuracy: 0.8786, Loss: 0.1595
Epoch   5 Batch  540/1077 - Train Accuracy: 0.8914, Validation Accuracy: 0.8835, Loss: 0.1267
Epoch   5 Batch  550/1077 - Train Accuracy: 0.8629, Validation Accuracy: 0.8807, Loss: 0.1586
Epoch   5 Batch  560/1077 - Train Accuracy: 0.8895, Validation Accuracy: 0.8750, Loss: 0.1257
Epoch   5 Batch  570/1077 - Train Accuracy: 0.8877, Validation Accuracy: 0.8775, Loss: 0.1601
Epoch   5 Batch  580/1077 - Train Accuracy: 0.9271, Validation Accuracy: 0.9027, Loss: 0.1281
Epoch   5 Batch  590/1077 - Train Accuracy: 0.8886, Validation Accuracy: 0.8974, Loss: 0.1503
Epoch   5 Batch  600/1077 - Train Accuracy: 0.9014, Validation Accuracy: 0.8853, Loss: 0.1314
Epoch   5 Batch  610/1077 - Train Accuracy: 0.8701, Validation Accuracy: 0.8920, Loss: 0.1398
Epoch   5 Batch  620/1077 - Train Accuracy: 0.9082, Validation Accuracy: 0.9073, Loss: 0.1285
Epoch   5 Batch  630/1077 - Train Accuracy: 0.9125, Validation Accuracy: 0.8942, Loss: 0.1304
Epoch   5 Batch  640/1077 - Train Accuracy: 0.8750, Validation Accuracy: 0.9087, Loss: 0.1352
Epoch   5 Batch  650/1077 - Train Accuracy: 0.8926, Validation Accuracy: 0.8970, Loss: 0.1387
Epoch   5 Batch  660/1077 - Train Accuracy: 0.9348, Validation Accuracy: 0.8974, Loss: 0.1304
Epoch   5 Batch  670/1077 - Train Accuracy: 0.8949, Validation Accuracy: 0.8803, Loss: 0.1302
Epoch   5 Batch  680/1077 - Train Accuracy: 0.8828, Validation Accuracy: 0.8945, Loss: 0.1326
Epoch   5 Batch  690/1077 - Train Accuracy: 0.8871, Validation Accuracy: 0.8935, Loss: 0.1294
Epoch   5 Batch  700/1077 - Train Accuracy: 0.9016, Validation Accuracy: 0.9002, Loss: 0.1332
Epoch   5 Batch  710/1077 - Train Accuracy: 0.8980, Validation Accuracy: 0.9087, Loss: 0.1363
Epoch   5 Batch  720/1077 - Train Accuracy: 0.9062, Validation Accuracy: 0.8938, Loss: 0.1369
Epoch   5 Batch  730/1077 - Train Accuracy: 0.9000, Validation Accuracy: 0.9055, Loss: 0.1437
Epoch   5 Batch  740/1077 - Train Accuracy: 0.9051, Validation Accuracy: 0.8828, Loss: 0.1325
Epoch   5 Batch  750/1077 - Train Accuracy: 0.9090, Validation Accuracy: 0.9034, Loss: 0.1336
Epoch   5 Batch  760/1077 - Train Accuracy: 0.9078, Validation Accuracy: 0.9070, Loss: 0.1291
Epoch   5 Batch  770/1077 - Train Accuracy: 0.8958, Validation Accuracy: 0.9151, Loss: 0.1219
Epoch   5 Batch  780/1077 - Train Accuracy: 0.8586, Validation Accuracy: 0.9102, Loss: 0.1417
Epoch   5 Batch  790/1077 - Train Accuracy: 0.8488, Validation Accuracy: 0.9062, Loss: 0.1603
Epoch   5 Batch  800/1077 - Train Accuracy: 0.9027, Validation Accuracy: 0.9045, Loss: 0.1326
Epoch   5 Batch  810/1077 - Train Accuracy: 0.8984, Validation Accuracy: 0.9041, Loss: 0.1184
Epoch   5 Batch  820/1077 - Train Accuracy: 0.8809, Validation Accuracy: 0.8967, Loss: 0.1617
Epoch   5 Batch  830/1077 - Train Accuracy: 0.8535, Validation Accuracy: 0.9038, Loss: 0.1488
Epoch   5 Batch  840/1077 - Train Accuracy: 0.8906, Validation Accuracy: 0.9102, Loss: 0.1337
Epoch   5 Batch  850/1077 - Train Accuracy: 0.8583, Validation Accuracy: 0.9155, Loss: 0.1634
Epoch   5 Batch  860/1077 - Train Accuracy: 0.8996, Validation Accuracy: 0.8977, Loss: 0.1541
Epoch   5 Batch  870/1077 - Train Accuracy: 0.8569, Validation Accuracy: 0.8807, Loss: 0.1760
Epoch   5 Batch  880/1077 - Train Accuracy: 0.9152, Validation Accuracy: 0.9151, Loss: 0.1494
Epoch   5 Batch  890/1077 - Train Accuracy: 0.9036, Validation Accuracy: 0.8828, Loss: 0.1414
Epoch   5 Batch  900/1077 - Train Accuracy: 0.8438, Validation Accuracy: 0.8281, Loss: 0.4399
Epoch   5 Batch  910/1077 - Train Accuracy: 0.8445, Validation Accuracy: 0.8672, Loss: 0.2659
Epoch   5 Batch  920/1077 - Train Accuracy: 0.8395, Validation Accuracy: 0.8597, Loss: 0.2064
Epoch   5 Batch  930/1077 - Train Accuracy: 0.8691, Validation Accuracy: 0.8825, Loss: 0.1929
Epoch   5 Batch  940/1077 - Train Accuracy: 0.8930, Validation Accuracy: 0.8647, Loss: 0.1591
Epoch   5 Batch  950/1077 - Train Accuracy: 0.8817, Validation Accuracy: 0.8853, Loss: 0.1361
Epoch   5 Batch  960/1077 - Train Accuracy: 0.8977, Validation Accuracy: 0.8924, Loss: 0.1338
Epoch   5 Batch  970/1077 - Train Accuracy: 0.9105, Validation Accuracy: 0.9006, Loss: 0.1517
Epoch   5 Batch  980/1077 - Train Accuracy: 0.8578, Validation Accuracy: 0.8956, Loss: 0.1447
Epoch   5 Batch  990/1077 - Train Accuracy: 0.8836, Validation Accuracy: 0.9087, Loss: 0.1550
Epoch   5 Batch 1000/1077 - Train Accuracy: 0.8750, Validation Accuracy: 0.8977, Loss: 0.1240
Epoch   5 Batch 1010/1077 - Train Accuracy: 0.8973, Validation Accuracy: 0.8814, Loss: 0.1426
Epoch   5 Batch 1020/1077 - Train Accuracy: 0.9152, Validation Accuracy: 0.9080, Loss: 0.1380
Epoch   5 Batch 1030/1077 - Train Accuracy: 0.8859, Validation Accuracy: 0.8793, Loss: 0.1301
Epoch   5 Batch 1040/1077 - Train Accuracy: 0.8849, Validation Accuracy: 0.8853, Loss: 0.1403
Epoch   5 Batch 1050/1077 - Train Accuracy: 0.9082, Validation Accuracy: 0.8920, Loss: 0.1390
Epoch   5 Batch 1060/1077 - Train Accuracy: 0.9156, Validation Accuracy: 0.9141, Loss: 0.1203
Epoch   5 Batch 1070/1077 - Train Accuracy: 0.8883, Validation Accuracy: 0.8920, Loss: 0.1488
Epoch   6 Batch   10/1077 - Train Accuracy: 0.9079, Validation Accuracy: 0.9130, Loss: 0.1335
Epoch   6 Batch   20/1077 - Train Accuracy: 0.9008, Validation Accuracy: 0.9020, Loss: 0.1423
Epoch   6 Batch   30/1077 - Train Accuracy: 0.9090, Validation Accuracy: 0.8970, Loss: 0.1378
Epoch   6 Batch   40/1077 - Train Accuracy: 0.9148, Validation Accuracy: 0.9144, Loss: 0.1358
Epoch   6 Batch   50/1077 - Train Accuracy: 0.8902, Validation Accuracy: 0.9070, Loss: 0.1366
Epoch   6 Batch   60/1077 - Train Accuracy: 0.9033, Validation Accuracy: 0.8995, Loss: 0.1250
Epoch   6 Batch   70/1077 - Train Accuracy: 0.8861, Validation Accuracy: 0.9013, Loss: 0.1364
Epoch   6 Batch   80/1077 - Train Accuracy: 0.8883, Validation Accuracy: 0.8917, Loss: 0.1284
Epoch   6 Batch   90/1077 - Train Accuracy: 0.8840, Validation Accuracy: 0.8945, Loss: 0.1307
Epoch   6 Batch  100/1077 - Train Accuracy: 0.8992, Validation Accuracy: 0.8825, Loss: 0.1250
Epoch   6 Batch  110/1077 - Train Accuracy: 0.9043, Validation Accuracy: 0.9034, Loss: 0.1121
Epoch   6 Batch  120/1077 - Train Accuracy: 0.9035, Validation Accuracy: 0.9055, Loss: 0.1420
Epoch   6 Batch  130/1077 - Train Accuracy: 0.8973, Validation Accuracy: 0.8970, Loss: 0.1321
Epoch   6 Batch  140/1077 - Train Accuracy: 0.8980, Validation Accuracy: 0.9041, Loss: 0.1423
Epoch   6 Batch  150/1077 - Train Accuracy: 0.9010, Validation Accuracy: 0.8878, Loss: 0.1336
Epoch   6 Batch  160/1077 - Train Accuracy: 0.9094, Validation Accuracy: 0.9077, Loss: 0.1425
Epoch   6 Batch  170/1077 - Train Accuracy: 0.8742, Validation Accuracy: 0.8867, Loss: 0.1428
Epoch   6 Batch  180/1077 - Train Accuracy: 0.8934, Validation Accuracy: 0.8888, Loss: 0.1359
Epoch   6 Batch  190/1077 - Train Accuracy: 0.9148, Validation Accuracy: 0.8974, Loss: 0.1191
Epoch   6 Batch  200/1077 - Train Accuracy: 0.8898, Validation Accuracy: 0.9048, Loss: 0.1393
Epoch   6 Batch  210/1077 - Train Accuracy: 0.8932, Validation Accuracy: 0.8906, Loss: 0.1323
Epoch   6 Batch  220/1077 - Train Accuracy: 0.9071, Validation Accuracy: 0.9009, Loss: 0.1228
Epoch   6 Batch  230/1077 - Train Accuracy: 0.8813, Validation Accuracy: 0.9137, Loss: 0.1274
Epoch   6 Batch  240/1077 - Train Accuracy: 0.9316, Validation Accuracy: 0.8896, Loss: 0.1119
Epoch   6 Batch  250/1077 - Train Accuracy: 0.8935, Validation Accuracy: 0.9165, Loss: 0.1219
Epoch   6 Batch  260/1077 - Train Accuracy: 0.9129, Validation Accuracy: 0.9119, Loss: 0.1218
Epoch   6 Batch  270/1077 - Train Accuracy: 0.8832, Validation Accuracy: 0.9180, Loss: 0.1516
Epoch   6 Batch  280/1077 - Train Accuracy: 0.8781, Validation Accuracy: 0.9268, Loss: 0.1250
Epoch   6 Batch  290/1077 - Train Accuracy: 0.8984, Validation Accuracy: 0.9016, Loss: 0.1434
Epoch   6 Batch  300/1077 - Train Accuracy: 0.9174, Validation Accuracy: 0.8945, Loss: 0.1213
Epoch   6 Batch  310/1077 - Train Accuracy: 0.9117, Validation Accuracy: 0.8981, Loss: 0.1445
Epoch   6 Batch  320/1077 - Train Accuracy: 0.9031, Validation Accuracy: 0.8981, Loss: 0.1158
Epoch   6 Batch  330/1077 - Train Accuracy: 0.9148, Validation Accuracy: 0.8991, Loss: 0.1180
Epoch   6 Batch  340/1077 - Train Accuracy: 0.9260, Validation Accuracy: 0.9134, Loss: 0.1326
Epoch   6 Batch  350/1077 - Train Accuracy: 0.8840, Validation Accuracy: 0.9027, Loss: 0.1238
Epoch   6 Batch  360/1077 - Train Accuracy: 0.9207, Validation Accuracy: 0.9038, Loss: 0.1311
Epoch   6 Batch  370/1077 - Train Accuracy: 0.9126, Validation Accuracy: 0.9141, Loss: 0.1321
Epoch   6 Batch  380/1077 - Train Accuracy: 0.9406, Validation Accuracy: 0.9141, Loss: 0.1229
Epoch   6 Batch  390/1077 - Train Accuracy: 0.8758, Validation Accuracy: 0.9222, Loss: 0.1441
Epoch   6 Batch  400/1077 - Train Accuracy: 0.9051, Validation Accuracy: 0.9080, Loss: 0.1263
Epoch   6 Batch  410/1077 - Train Accuracy: 0.8717, Validation Accuracy: 0.9194, Loss: 0.1422
Epoch   6 Batch  420/1077 - Train Accuracy: 0.9262, Validation Accuracy: 0.9187, Loss: 0.1182
Epoch   6 Batch  430/1077 - Train Accuracy: 0.8887, Validation Accuracy: 0.8949, Loss: 0.1530
Epoch   6 Batch  440/1077 - Train Accuracy: 0.8719, Validation Accuracy: 0.8924, Loss: 0.1474
Epoch   6 Batch  450/1077 - Train Accuracy: 0.9121, Validation Accuracy: 0.8800, Loss: 0.1191
Epoch   6 Batch  460/1077 - Train Accuracy: 0.9039, Validation Accuracy: 0.9055, Loss: 0.1237
Epoch   6 Batch  470/1077 - Train Accuracy: 0.9137, Validation Accuracy: 0.8981, Loss: 0.1220
Epoch   6 Batch  480/1077 - Train Accuracy: 0.9239, Validation Accuracy: 0.9031, Loss: 0.1084
Epoch   6 Batch  490/1077 - Train Accuracy: 0.8965, Validation Accuracy: 0.9080, Loss: 0.1295
Epoch   6 Batch  500/1077 - Train Accuracy: 0.9008, Validation Accuracy: 0.9016, Loss: 0.1260
Epoch   6 Batch  510/1077 - Train Accuracy: 0.8816, Validation Accuracy: 0.9055, Loss: 0.1358
Epoch   6 Batch  520/1077 - Train Accuracy: 0.9449, Validation Accuracy: 0.8991, Loss: 0.1098
Epoch   6 Batch  530/1077 - Train Accuracy: 0.8773, Validation Accuracy: 0.8924, Loss: 0.1408
Epoch   6 Batch  540/1077 - Train Accuracy: 0.9141, Validation Accuracy: 0.8732, Loss: 0.1105
Epoch   6 Batch  550/1077 - Train Accuracy: 0.8574, Validation Accuracy: 0.8807, Loss: 0.1354
Epoch   6 Batch  560/1077 - Train Accuracy: 0.9168, Validation Accuracy: 0.9212, Loss: 0.1143
Epoch   6 Batch  570/1077 - Train Accuracy: 0.8927, Validation Accuracy: 0.9066, Loss: 0.1417
Epoch   6 Batch  580/1077 - Train Accuracy: 0.9345, Validation Accuracy: 0.8888, Loss: 0.1114
Epoch   6 Batch  590/1077 - Train Accuracy: 0.9013, Validation Accuracy: 0.9080, Loss: 0.1214
Epoch   6 Batch  600/1077 - Train Accuracy: 0.9230, Validation Accuracy: 0.9023, Loss: 0.1121
Epoch   6 Batch  610/1077 - Train Accuracy: 0.8799, Validation Accuracy: 0.9141, Loss: 0.1366
Epoch   6 Batch  620/1077 - Train Accuracy: 0.9027, Validation Accuracy: 0.9006, Loss: 0.1155
Epoch   6 Batch  630/1077 - Train Accuracy: 0.9141, Validation Accuracy: 0.9119, Loss: 0.1254
Epoch   6 Batch  640/1077 - Train Accuracy: 0.8921, Validation Accuracy: 0.9098, Loss: 0.1234
Epoch   6 Batch  650/1077 - Train Accuracy: 0.9035, Validation Accuracy: 0.9102, Loss: 0.1282
Epoch   6 Batch  660/1077 - Train Accuracy: 0.9398, Validation Accuracy: 0.9016, Loss: 0.1191
Epoch   6 Batch  670/1077 - Train Accuracy: 0.9020, Validation Accuracy: 0.8949, Loss: 0.1162
Epoch   6 Batch  680/1077 - Train Accuracy: 0.8843, Validation Accuracy: 0.9112, Loss: 0.1207
Epoch   6 Batch  690/1077 - Train Accuracy: 0.8992, Validation Accuracy: 0.9009, Loss: 0.1179
Epoch   6 Batch  700/1077 - Train Accuracy: 0.9137, Validation Accuracy: 0.9190, Loss: 0.1182
Epoch   6 Batch  710/1077 - Train Accuracy: 0.8922, Validation Accuracy: 0.9144, Loss: 0.1297
Epoch   6 Batch  720/1077 - Train Accuracy: 0.9132, Validation Accuracy: 0.9137, Loss: 0.1289
Epoch   6 Batch  730/1077 - Train Accuracy: 0.8891, Validation Accuracy: 0.9016, Loss: 0.1465
Epoch   6 Batch  740/1077 - Train Accuracy: 0.9137, Validation Accuracy: 0.9013, Loss: 0.1227
Epoch   6 Batch  750/1077 - Train Accuracy: 0.9133, Validation Accuracy: 0.8999, Loss: 0.1259
Epoch   6 Batch  760/1077 - Train Accuracy: 0.9211, Validation Accuracy: 0.9105, Loss: 0.1113
Epoch   6 Batch  770/1077 - Train Accuracy: 0.9152, Validation Accuracy: 0.9080, Loss: 0.1103
Epoch   6 Batch  780/1077 - Train Accuracy: 0.8594, Validation Accuracy: 0.9126, Loss: 0.1244
Epoch   6 Batch  790/1077 - Train Accuracy: 0.8652, Validation Accuracy: 0.9283, Loss: 0.1650
Epoch   6 Batch  800/1077 - Train Accuracy: 0.8969, Validation Accuracy: 0.9226, Loss: 0.1309
Epoch   6 Batch  810/1077 - Train Accuracy: 0.8955, Validation Accuracy: 0.9084, Loss: 0.1237
Epoch   6 Batch  820/1077 - Train Accuracy: 0.8922, Validation Accuracy: 0.9134, Loss: 0.1517
Epoch   6 Batch  830/1077 - Train Accuracy: 0.8688, Validation Accuracy: 0.9251, Loss: 0.1358
Epoch   6 Batch  840/1077 - Train Accuracy: 0.8949, Validation Accuracy: 0.9226, Loss: 0.1230
Epoch   6 Batch  850/1077 - Train Accuracy: 0.8910, Validation Accuracy: 0.9027, Loss: 0.1398
Epoch   6 Batch  860/1077 - Train Accuracy: 0.9263, Validation Accuracy: 0.9087, Loss: 0.1263
Epoch   6 Batch  870/1077 - Train Accuracy: 0.8943, Validation Accuracy: 0.9169, Loss: 0.1322
Epoch   6 Batch  880/1077 - Train Accuracy: 0.9191, Validation Accuracy: 0.9279, Loss: 0.1224
Epoch   6 Batch  890/1077 - Train Accuracy: 0.9200, Validation Accuracy: 0.8885, Loss: 0.1165
Epoch   6 Batch  900/1077 - Train Accuracy: 0.9102, Validation Accuracy: 0.9077, Loss: 0.1187
Epoch   6 Batch  910/1077 - Train Accuracy: 0.8988, Validation Accuracy: 0.9173, Loss: 0.1293
Epoch   6 Batch  920/1077 - Train Accuracy: 0.8879, Validation Accuracy: 0.9066, Loss: 0.1301
Epoch   6 Batch  930/1077 - Train Accuracy: 0.9137, Validation Accuracy: 0.8913, Loss: 0.1185
Epoch   6 Batch  940/1077 - Train Accuracy: 0.9043, Validation Accuracy: 0.9023, Loss: 0.1197
Epoch   6 Batch  950/1077 - Train Accuracy: 0.9074, Validation Accuracy: 0.8977, Loss: 0.1136
Epoch   6 Batch  960/1077 - Train Accuracy: 0.9260, Validation Accuracy: 0.8977, Loss: 0.1172
Epoch   6 Batch  970/1077 - Train Accuracy: 0.9281, Validation Accuracy: 0.9098, Loss: 0.1173
Epoch   6 Batch  980/1077 - Train Accuracy: 0.8762, Validation Accuracy: 0.8952, Loss: 0.1299
Epoch   6 Batch  990/1077 - Train Accuracy: 0.8828, Validation Accuracy: 0.8956, Loss: 0.1285
Epoch   6 Batch 1000/1077 - Train Accuracy: 0.8895, Validation Accuracy: 0.8974, Loss: 0.1005
Epoch   6 Batch 1010/1077 - Train Accuracy: 0.9336, Validation Accuracy: 0.9141, Loss: 0.1225
Epoch   6 Batch 1020/1077 - Train Accuracy: 0.9270, Validation Accuracy: 0.9233, Loss: 0.1103
Epoch   6 Batch 1030/1077 - Train Accuracy: 0.9145, Validation Accuracy: 0.9102, Loss: 0.1143
Epoch   6 Batch 1040/1077 - Train Accuracy: 0.9124, Validation Accuracy: 0.8995, Loss: 0.1096
Epoch   6 Batch 1050/1077 - Train Accuracy: 0.9113, Validation Accuracy: 0.9048, Loss: 0.1237
Epoch   6 Batch 1060/1077 - Train Accuracy: 0.9363, Validation Accuracy: 0.9048, Loss: 0.1126
Epoch   6 Batch 1070/1077 - Train Accuracy: 0.8816, Validation Accuracy: 0.9073, Loss: 0.1302
Epoch   7 Batch   10/1077 - Train Accuracy: 0.9009, Validation Accuracy: 0.8874, Loss: 0.1317
Epoch   7 Batch   20/1077 - Train Accuracy: 0.8902, Validation Accuracy: 0.9105, Loss: 0.1367
Epoch   7 Batch   30/1077 - Train Accuracy: 0.9184, Validation Accuracy: 0.9251, Loss: 0.1274
Epoch   7 Batch   40/1077 - Train Accuracy: 0.9109, Validation Accuracy: 0.8974, Loss: 0.1358
Epoch   7 Batch   50/1077 - Train Accuracy: 0.9121, Validation Accuracy: 0.9052, Loss: 0.1234
Epoch   7 Batch   60/1077 - Train Accuracy: 0.9103, Validation Accuracy: 0.8906, Loss: 0.1298
Epoch   7 Batch   70/1077 - Train Accuracy: 0.8976, Validation Accuracy: 0.9020, Loss: 0.1391
Epoch   7 Batch   80/1077 - Train Accuracy: 0.8980, Validation Accuracy: 0.9020, Loss: 0.1292
Epoch   7 Batch   90/1077 - Train Accuracy: 0.8922, Validation Accuracy: 0.8963, Loss: 0.1319
Epoch   7 Batch  100/1077 - Train Accuracy: 0.9184, Validation Accuracy: 0.9055, Loss: 0.1242
Epoch   7 Batch  110/1077 - Train Accuracy: 0.9055, Validation Accuracy: 0.9080, Loss: 0.1060
Epoch   7 Batch  120/1077 - Train Accuracy: 0.9215, Validation Accuracy: 0.9052, Loss: 0.1391
Epoch   7 Batch  130/1077 - Train Accuracy: 0.9003, Validation Accuracy: 0.9055, Loss: 0.1174
Epoch   7 Batch  140/1077 - Train Accuracy: 0.9161, Validation Accuracy: 0.9066, Loss: 0.1172
Epoch   7 Batch  150/1077 - Train Accuracy: 0.9074, Validation Accuracy: 0.9094, Loss: 0.1163
Epoch   7 Batch  160/1077 - Train Accuracy: 0.9035, Validation Accuracy: 0.9055, Loss: 0.1336
Epoch   7 Batch  170/1077 - Train Accuracy: 0.9121, Validation Accuracy: 0.9098, Loss: 0.1336
Epoch   7 Batch  180/1077 - Train Accuracy: 0.9121, Validation Accuracy: 0.9212, Loss: 0.1163
Epoch   7 Batch  190/1077 - Train Accuracy: 0.9262, Validation Accuracy: 0.9048, Loss: 0.1068
Epoch   7 Batch  200/1077 - Train Accuracy: 0.9066, Validation Accuracy: 0.9169, Loss: 0.1210
Epoch   7 Batch  210/1077 - Train Accuracy: 0.8996, Validation Accuracy: 0.9169, Loss: 0.1080
Epoch   7 Batch  220/1077 - Train Accuracy: 0.9211, Validation Accuracy: 0.9137, Loss: 0.1096
Epoch   7 Batch  230/1077 - Train Accuracy: 0.9018, Validation Accuracy: 0.9212, Loss: 0.1211
Epoch   7 Batch  240/1077 - Train Accuracy: 0.9281, Validation Accuracy: 0.9180, Loss: 0.1102
Epoch   7 Batch  250/1077 - Train Accuracy: 0.8924, Validation Accuracy: 0.9034, Loss: 0.1173
Epoch   7 Batch  260/1077 - Train Accuracy: 0.9189, Validation Accuracy: 0.8995, Loss: 0.1137
Epoch   7 Batch  270/1077 - Train Accuracy: 0.8953, Validation Accuracy: 0.9087, Loss: 0.1318
Epoch   7 Batch  280/1077 - Train Accuracy: 0.8848, Validation Accuracy: 0.9109, Loss: 0.1239
Epoch   7 Batch  290/1077 - Train Accuracy: 0.9055, Validation Accuracy: 0.9102, Loss: 0.1232
Epoch   7 Batch  300/1077 - Train Accuracy: 0.9264, Validation Accuracy: 0.9116, Loss: 0.1142
Epoch   7 Batch  310/1077 - Train Accuracy: 0.8996, Validation Accuracy: 0.9141, Loss: 0.1207
Epoch   7 Batch  320/1077 - Train Accuracy: 0.9187, Validation Accuracy: 0.8917, Loss: 0.1086
Epoch   7 Batch  330/1077 - Train Accuracy: 0.9207, Validation Accuracy: 0.9173, Loss: 0.1071
Epoch   7 Batch  340/1077 - Train Accuracy: 0.9112, Validation Accuracy: 0.9052, Loss: 0.1196
Epoch   7 Batch  350/1077 - Train Accuracy: 0.9359, Validation Accuracy: 0.9091, Loss: 0.1074
Epoch   7 Batch  360/1077 - Train Accuracy: 0.9191, Validation Accuracy: 0.9059, Loss: 0.1105
Epoch   7 Batch  370/1077 - Train Accuracy: 0.9304, Validation Accuracy: 0.9155, Loss: 0.1101
Epoch   7 Batch  380/1077 - Train Accuracy: 0.9492, Validation Accuracy: 0.9112, Loss: 0.1039
Epoch   7 Batch  390/1077 - Train Accuracy: 0.8918, Validation Accuracy: 0.9300, Loss: 0.1340
Epoch   7 Batch  400/1077 - Train Accuracy: 0.9145, Validation Accuracy: 0.9109, Loss: 0.1109
Epoch   7 Batch  410/1077 - Train Accuracy: 0.8787, Validation Accuracy: 0.9126, Loss: 0.1309
Epoch   7 Batch  420/1077 - Train Accuracy: 0.8996, Validation Accuracy: 0.9087, Loss: 0.1155
Epoch   7 Batch  430/1077 - Train Accuracy: 0.8844, Validation Accuracy: 0.9112, Loss: 0.1444
Epoch   7 Batch  440/1077 - Train Accuracy: 0.8918, Validation Accuracy: 0.8945, Loss: 0.1522
Epoch   7 Batch  450/1077 - Train Accuracy: 0.9145, Validation Accuracy: 0.8892, Loss: 0.1315
Epoch   7 Batch  460/1077 - Train Accuracy: 0.9035, Validation Accuracy: 0.9048, Loss: 0.1171
Epoch   7 Batch  470/1077 - Train Accuracy: 0.9359, Validation Accuracy: 0.8988, Loss: 0.1131
Epoch   7 Batch  480/1077 - Train Accuracy: 0.9095, Validation Accuracy: 0.9023, Loss: 0.1140
Epoch   7 Batch  490/1077 - Train Accuracy: 0.8961, Validation Accuracy: 0.8981, Loss: 0.1197
Epoch   7 Batch  500/1077 - Train Accuracy: 0.9148, Validation Accuracy: 0.8928, Loss: 0.1143
Epoch   7 Batch  510/1077 - Train Accuracy: 0.8762, Validation Accuracy: 0.8910, Loss: 0.1423
Epoch   7 Batch  520/1077 - Train Accuracy: 0.9349, Validation Accuracy: 0.8754, Loss: 0.1049
Epoch   7 Batch  530/1077 - Train Accuracy: 0.8711, Validation Accuracy: 0.8839, Loss: 0.1556
Epoch   7 Batch  540/1077 - Train Accuracy: 0.8746, Validation Accuracy: 0.8817, Loss: 0.1246
Epoch   7 Batch  550/1077 - Train Accuracy: 0.8594, Validation Accuracy: 0.8970, Loss: 0.1415
Epoch   7 Batch  560/1077 - Train Accuracy: 0.9020, Validation Accuracy: 0.9123, Loss: 0.1107
Epoch   7 Batch  570/1077 - Train Accuracy: 0.9108, Validation Accuracy: 0.8956, Loss: 0.1455
Epoch   7 Batch  580/1077 - Train Accuracy: 0.9308, Validation Accuracy: 0.8988, Loss: 0.1021
Epoch   7 Batch  590/1077 - Train Accuracy: 0.9021, Validation Accuracy: 0.9055, Loss: 0.1255
Epoch   7 Batch  600/1077 - Train Accuracy: 0.9200, Validation Accuracy: 0.9080, Loss: 0.1044
Epoch   7 Batch  610/1077 - Train Accuracy: 0.8853, Validation Accuracy: 0.9080, Loss: 0.1160
Epoch   7 Batch  620/1077 - Train Accuracy: 0.9117, Validation Accuracy: 0.9098, Loss: 0.1064
Epoch   7 Batch  630/1077 - Train Accuracy: 0.9168, Validation Accuracy: 0.9027, Loss: 0.1118
Epoch   7 Batch  640/1077 - Train Accuracy: 0.8947, Validation Accuracy: 0.8999, Loss: 0.1138
Epoch   7 Batch  650/1077 - Train Accuracy: 0.9176, Validation Accuracy: 0.9041, Loss: 0.1147
Epoch   7 Batch  660/1077 - Train Accuracy: 0.9375, Validation Accuracy: 0.9073, Loss: 0.1153
Epoch   7 Batch  670/1077 - Train Accuracy: 0.9208, Validation Accuracy: 0.9205, Loss: 0.1104
Epoch   7 Batch  680/1077 - Train Accuracy: 0.8917, Validation Accuracy: 0.8906, Loss: 0.1417
Epoch   7 Batch  690/1077 - Train Accuracy: 0.8984, Validation Accuracy: 0.8835, Loss: 0.1381
Epoch   7 Batch  700/1077 - Train Accuracy: 0.9187, Validation Accuracy: 0.9016, Loss: 0.1241
Epoch   7 Batch  710/1077 - Train Accuracy: 0.8816, Validation Accuracy: 0.9105, Loss: 0.1273
Epoch   7 Batch  720/1077 - Train Accuracy: 0.9034, Validation Accuracy: 0.9155, Loss: 0.1324
Epoch   7 Batch  730/1077 - Train Accuracy: 0.9004, Validation Accuracy: 0.9094, Loss: 0.1331
Epoch   7 Batch  740/1077 - Train Accuracy: 0.8910, Validation Accuracy: 0.8956, Loss: 0.1167
Epoch   7 Batch  750/1077 - Train Accuracy: 0.8992, Validation Accuracy: 0.8952, Loss: 0.1227
Epoch   7 Batch  760/1077 - Train Accuracy: 0.9141, Validation Accuracy: 0.9098, Loss: 0.1138
Epoch   7 Batch  770/1077 - Train Accuracy: 0.9152, Validation Accuracy: 0.9183, Loss: 0.1125
Epoch   7 Batch  780/1077 - Train Accuracy: 0.8711, Validation Accuracy: 0.9194, Loss: 0.1180
Epoch   7 Batch  790/1077 - Train Accuracy: 0.8773, Validation Accuracy: 0.9041, Loss: 0.1306
Epoch   7 Batch  800/1077 - Train Accuracy: 0.9078, Validation Accuracy: 0.9126, Loss: 0.1183
Epoch   7 Batch  810/1077 - Train Accuracy: 0.9263, Validation Accuracy: 0.9052, Loss: 0.1145
Epoch   7 Batch  820/1077 - Train Accuracy: 0.8875, Validation Accuracy: 0.9130, Loss: 0.1393
Epoch   7 Batch  830/1077 - Train Accuracy: 0.8699, Validation Accuracy: 0.9013, Loss: 0.1280
Epoch   7 Batch  840/1077 - Train Accuracy: 0.8867, Validation Accuracy: 0.9116, Loss: 0.1170
Epoch   7 Batch  850/1077 - Train Accuracy: 0.8880, Validation Accuracy: 0.9183, Loss: 0.1407
Epoch   7 Batch  860/1077 - Train Accuracy: 0.9252, Validation Accuracy: 0.9031, Loss: 0.1144
Epoch   7 Batch  870/1077 - Train Accuracy: 0.9021, Validation Accuracy: 0.9087, Loss: 0.1190
Epoch   7 Batch  880/1077 - Train Accuracy: 0.9141, Validation Accuracy: 0.9087, Loss: 0.1280
Epoch   7 Batch  890/1077 - Train Accuracy: 0.9312, Validation Accuracy: 0.9155, Loss: 0.1068
Epoch   7 Batch  900/1077 - Train Accuracy: 0.9145, Validation Accuracy: 0.9038, Loss: 0.1125
Epoch   7 Batch  910/1077 - Train Accuracy: 0.8988, Validation Accuracy: 0.9077, Loss: 0.1206
Epoch   7 Batch  920/1077 - Train Accuracy: 0.9133, Validation Accuracy: 0.9009, Loss: 0.1145
Epoch   7 Batch  930/1077 - Train Accuracy: 0.9336, Validation Accuracy: 0.8991, Loss: 0.1021
Epoch   7 Batch  940/1077 - Train Accuracy: 0.9113, Validation Accuracy: 0.8984, Loss: 0.1060
Epoch   7 Batch  950/1077 - Train Accuracy: 0.9103, Validation Accuracy: 0.8949, Loss: 0.1004
Epoch   7 Batch  960/1077 - Train Accuracy: 0.9193, Validation Accuracy: 0.9084, Loss: 0.1063
Epoch   7 Batch  970/1077 - Train Accuracy: 0.9223, Validation Accuracy: 0.9197, Loss: 0.1093
Epoch   7 Batch  980/1077 - Train Accuracy: 0.8973, Validation Accuracy: 0.9087, Loss: 0.1175
Epoch   7 Batch  990/1077 - Train Accuracy: 0.9145, Validation Accuracy: 0.9080, Loss: 0.1146
Epoch   7 Batch 1000/1077 - Train Accuracy: 0.8802, Validation Accuracy: 0.9038, Loss: 0.1037
Epoch   7 Batch 1010/1077 - Train Accuracy: 0.9324, Validation Accuracy: 0.9034, Loss: 0.1158
Epoch   7 Batch 1020/1077 - Train Accuracy: 0.9184, Validation Accuracy: 0.8970, Loss: 0.1072
Epoch   7 Batch 1030/1077 - Train Accuracy: 0.9219, Validation Accuracy: 0.9002, Loss: 0.1186
Epoch   7 Batch 1040/1077 - Train Accuracy: 0.9211, Validation Accuracy: 0.8977, Loss: 0.1209
Epoch   7 Batch 1050/1077 - Train Accuracy: 0.9180, Validation Accuracy: 0.9034, Loss: 0.1277
Epoch   7 Batch 1060/1077 - Train Accuracy: 0.9383, Validation Accuracy: 0.9002, Loss: 0.1181
Epoch   7 Batch 1070/1077 - Train Accuracy: 0.8922, Validation Accuracy: 0.9290, Loss: 0.1370
Epoch   8 Batch   10/1077 - Train Accuracy: 0.9050, Validation Accuracy: 0.9194, Loss: 0.1307
Epoch   8 Batch   20/1077 - Train Accuracy: 0.9082, Validation Accuracy: 0.9169, Loss: 0.1351
Epoch   8 Batch   30/1077 - Train Accuracy: 0.9258, Validation Accuracy: 0.9091, Loss: 0.1327
Epoch   8 Batch   40/1077 - Train Accuracy: 0.9336, Validation Accuracy: 0.9052, Loss: 0.1163
Epoch   8 Batch   50/1077 - Train Accuracy: 0.9246, Validation Accuracy: 0.9080, Loss: 0.1180
Epoch   8 Batch   60/1077 - Train Accuracy: 0.9040, Validation Accuracy: 0.8903, Loss: 0.1154
Epoch   8 Batch   70/1077 - Train Accuracy: 0.9112, Validation Accuracy: 0.8963, Loss: 0.1225
Epoch   8 Batch   80/1077 - Train Accuracy: 0.8953, Validation Accuracy: 0.8995, Loss: 0.1217
Epoch   8 Batch   90/1077 - Train Accuracy: 0.9004, Validation Accuracy: 0.9169, Loss: 0.1145
Epoch   8 Batch  100/1077 - Train Accuracy: 0.9129, Validation Accuracy: 0.9059, Loss: 0.1275
Epoch   8 Batch  110/1077 - Train Accuracy: 0.9164, Validation Accuracy: 0.9066, Loss: 0.0974
Epoch   8 Batch  120/1077 - Train Accuracy: 0.9242, Validation Accuracy: 0.9208, Loss: 0.1118
Epoch   8 Batch  130/1077 - Train Accuracy: 0.9096, Validation Accuracy: 0.9165, Loss: 0.1128
Epoch   8 Batch  140/1077 - Train Accuracy: 0.9293, Validation Accuracy: 0.9162, Loss: 0.1193
Epoch   8 Batch  150/1077 - Train Accuracy: 0.9222, Validation Accuracy: 0.9137, Loss: 0.1025
Epoch   8 Batch  160/1077 - Train Accuracy: 0.8984, Validation Accuracy: 0.9162, Loss: 0.1187
Epoch   8 Batch  170/1077 - Train Accuracy: 0.9191, Validation Accuracy: 0.9134, Loss: 0.1213
Epoch   8 Batch  180/1077 - Train Accuracy: 0.9187, Validation Accuracy: 0.9048, Loss: 0.1134
Epoch   8 Batch  190/1077 - Train Accuracy: 0.9191, Validation Accuracy: 0.9187, Loss: 0.1002
Epoch   8 Batch  200/1077 - Train Accuracy: 0.9184, Validation Accuracy: 0.9247, Loss: 0.1101
Epoch   8 Batch  210/1077 - Train Accuracy: 0.9048, Validation Accuracy: 0.9077, Loss: 0.1044
Epoch   8 Batch  220/1077 - Train Accuracy: 0.9280, Validation Accuracy: 0.9155, Loss: 0.1029
Epoch   8 Batch  230/1077 - Train Accuracy: 0.9092, Validation Accuracy: 0.9062, Loss: 0.1106
Epoch   8 Batch  240/1077 - Train Accuracy: 0.9367, Validation Accuracy: 0.8956, Loss: 0.1099
Epoch   8 Batch  250/1077 - Train Accuracy: 0.8931, Validation Accuracy: 0.9062, Loss: 0.1153
Epoch   8 Batch  260/1077 - Train Accuracy: 0.9286, Validation Accuracy: 0.9212, Loss: 0.1046
Epoch   8 Batch  270/1077 - Train Accuracy: 0.9102, Validation Accuracy: 0.9151, Loss: 0.1254
Epoch   8 Batch  280/1077 - Train Accuracy: 0.8816, Validation Accuracy: 0.9151, Loss: 0.1257
Epoch   8 Batch  290/1077 - Train Accuracy: 0.9281, Validation Accuracy: 0.9087, Loss: 0.1171
Epoch   8 Batch  300/1077 - Train Accuracy: 0.9391, Validation Accuracy: 0.9031, Loss: 0.1130
Epoch   8 Batch  310/1077 - Train Accuracy: 0.8984, Validation Accuracy: 0.9162, Loss: 0.1335
Epoch   8 Batch  320/1077 - Train Accuracy: 0.9129, Validation Accuracy: 0.9055, Loss: 0.1067
Epoch   8 Batch  330/1077 - Train Accuracy: 0.9207, Validation Accuracy: 0.9208, Loss: 0.1009
Epoch   8 Batch  340/1077 - Train Accuracy: 0.9248, Validation Accuracy: 0.9155, Loss: 0.1261
Epoch   8 Batch  350/1077 - Train Accuracy: 0.9219, Validation Accuracy: 0.9165, Loss: 0.1126
Epoch   8 Batch  360/1077 - Train Accuracy: 0.9184, Validation Accuracy: 0.9126, Loss: 0.1061
Epoch   8 Batch  370/1077 - Train Accuracy: 0.9342, Validation Accuracy: 0.9141, Loss: 0.1173
Epoch   8 Batch  380/1077 - Train Accuracy: 0.9367, Validation Accuracy: 0.9137, Loss: 0.1167
Epoch   8 Batch  390/1077 - Train Accuracy: 0.8898, Validation Accuracy: 0.9194, Loss: 0.1330
Epoch   8 Batch  400/1077 - Train Accuracy: 0.9266, Validation Accuracy: 0.9197, Loss: 0.1089
Epoch   8 Batch  410/1077 - Train Accuracy: 0.8775, Validation Accuracy: 0.9126, Loss: 0.1379
Epoch   8 Batch  420/1077 - Train Accuracy: 0.9355, Validation Accuracy: 0.9151, Loss: 0.1029
Epoch   8 Batch  430/1077 - Train Accuracy: 0.9039, Validation Accuracy: 0.9169, Loss: 0.1206
Epoch   8 Batch  440/1077 - Train Accuracy: 0.8926, Validation Accuracy: 0.9176, Loss: 0.1267
Epoch   8 Batch  450/1077 - Train Accuracy: 0.9293, Validation Accuracy: 0.9197, Loss: 0.1073
Epoch   8 Batch  460/1077 - Train Accuracy: 0.9023, Validation Accuracy: 0.9102, Loss: 0.1069
Epoch   8 Batch  470/1077 - Train Accuracy: 0.9305, Validation Accuracy: 0.9109, Loss: 0.1107
Epoch   8 Batch  480/1077 - Train Accuracy: 0.9264, Validation Accuracy: 0.9240, Loss: 0.0891
Epoch   8 Batch  490/1077 - Train Accuracy: 0.9059, Validation Accuracy: 0.9084, Loss: 0.1135
Epoch   8 Batch  500/1077 - Train Accuracy: 0.9230, Validation Accuracy: 0.9190, Loss: 0.1046
Epoch   8 Batch  510/1077 - Train Accuracy: 0.9055, Validation Accuracy: 0.9205, Loss: 0.1073
Epoch   8 Batch  520/1077 - Train Accuracy: 0.9446, Validation Accuracy: 0.9084, Loss: 0.0987
Epoch   8 Batch  530/1077 - Train Accuracy: 0.8883, Validation Accuracy: 0.9034, Loss: 0.1309
Epoch   8 Batch  540/1077 - Train Accuracy: 0.9156, Validation Accuracy: 0.8899, Loss: 0.1030
Epoch   8 Batch  550/1077 - Train Accuracy: 0.8906, Validation Accuracy: 0.8786, Loss: 0.1231
Epoch   8 Batch  560/1077 - Train Accuracy: 0.9172, Validation Accuracy: 0.9016, Loss: 0.0965
Epoch   8 Batch  570/1077 - Train Accuracy: 0.9025, Validation Accuracy: 0.9027, Loss: 0.1291
Epoch   8 Batch  580/1077 - Train Accuracy: 0.9416, Validation Accuracy: 0.8913, Loss: 0.1019
Epoch   8 Batch  590/1077 - Train Accuracy: 0.9194, Validation Accuracy: 0.8928, Loss: 0.1186
Epoch   8 Batch  600/1077 - Train Accuracy: 0.9167, Validation Accuracy: 0.8942, Loss: 0.1039
Epoch   8 Batch  610/1077 - Train Accuracy: 0.9013, Validation Accuracy: 0.9176, Loss: 0.1077
Epoch   8 Batch  620/1077 - Train Accuracy: 0.9117, Validation Accuracy: 0.9055, Loss: 0.1140
Epoch   8 Batch  630/1077 - Train Accuracy: 0.9160, Validation Accuracy: 0.9205, Loss: 0.1125
Epoch   8 Batch  640/1077 - Train Accuracy: 0.9189, Validation Accuracy: 0.9244, Loss: 0.0998
Epoch   8 Batch  650/1077 - Train Accuracy: 0.9105, Validation Accuracy: 0.9173, Loss: 0.1146
Epoch   8 Batch  660/1077 - Train Accuracy: 0.9406, Validation Accuracy: 0.9158, Loss: 0.1059
Epoch   8 Batch  670/1077 - Train Accuracy: 0.9325, Validation Accuracy: 0.9151, Loss: 0.1039
Epoch   8 Batch  680/1077 - Train Accuracy: 0.9103, Validation Accuracy: 0.9162, Loss: 0.1071
Epoch   8 Batch  690/1077 - Train Accuracy: 0.8797, Validation Accuracy: 0.9237, Loss: 0.1175
Epoch   8 Batch  700/1077 - Train Accuracy: 0.9328, Validation Accuracy: 0.9205, Loss: 0.1047
Epoch   8 Batch  710/1077 - Train Accuracy: 0.9137, Validation Accuracy: 0.9201, Loss: 0.1202
Epoch   8 Batch  720/1077 - Train Accuracy: 0.8840, Validation Accuracy: 0.9126, Loss: 0.1210
Epoch   8 Batch  730/1077 - Train Accuracy: 0.9074, Validation Accuracy: 0.9048, Loss: 0.1314
Epoch   8 Batch  740/1077 - Train Accuracy: 0.9238, Validation Accuracy: 0.9169, Loss: 0.1098
Epoch   8 Batch  750/1077 - Train Accuracy: 0.8988, Validation Accuracy: 0.9148, Loss: 0.1170
Epoch   8 Batch  760/1077 - Train Accuracy: 0.9277, Validation Accuracy: 0.9336, Loss: 0.0957
Epoch   8 Batch  770/1077 - Train Accuracy: 0.9122, Validation Accuracy: 0.9318, Loss: 0.1014
Epoch   8 Batch  780/1077 - Train Accuracy: 0.8902, Validation Accuracy: 0.9237, Loss: 0.1085
Epoch   8 Batch  790/1077 - Train Accuracy: 0.8797, Validation Accuracy: 0.9190, Loss: 0.1418
Epoch   8 Batch  800/1077 - Train Accuracy: 0.9340, Validation Accuracy: 0.9013, Loss: 0.0981
Epoch   8 Batch  810/1077 - Train Accuracy: 0.9185, Validation Accuracy: 0.9123, Loss: 0.1083
Epoch   8 Batch  820/1077 - Train Accuracy: 0.9074, Validation Accuracy: 0.9091, Loss: 0.1270
Epoch   8 Batch  830/1077 - Train Accuracy: 0.8641, Validation Accuracy: 0.9219, Loss: 0.1258
Epoch   8 Batch  840/1077 - Train Accuracy: 0.9016, Validation Accuracy: 0.9180, Loss: 0.1065
Epoch   8 Batch  850/1077 - Train Accuracy: 0.9062, Validation Accuracy: 0.9119, Loss: 0.1295
Epoch   8 Batch  860/1077 - Train Accuracy: 0.9156, Validation Accuracy: 0.9066, Loss: 0.1241
Epoch   8 Batch  870/1077 - Train Accuracy: 0.9182, Validation Accuracy: 0.9112, Loss: 0.1194
Epoch   8 Batch  880/1077 - Train Accuracy: 0.9215, Validation Accuracy: 0.9134, Loss: 0.1130
Epoch   8 Batch  890/1077 - Train Accuracy: 0.9189, Validation Accuracy: 0.9290, Loss: 0.1010
Epoch   8 Batch  900/1077 - Train Accuracy: 0.9121, Validation Accuracy: 0.9190, Loss: 0.1141
Epoch   8 Batch  910/1077 - Train Accuracy: 0.9003, Validation Accuracy: 0.9183, Loss: 0.1124
Epoch   8 Batch  920/1077 - Train Accuracy: 0.9164, Validation Accuracy: 0.9119, Loss: 0.1000
Epoch   8 Batch  930/1077 - Train Accuracy: 0.9305, Validation Accuracy: 0.9155, Loss: 0.1084
Epoch   8 Batch  940/1077 - Train Accuracy: 0.9051, Validation Accuracy: 0.9169, Loss: 0.1020
Epoch   8 Batch  950/1077 - Train Accuracy: 0.9159, Validation Accuracy: 0.9080, Loss: 0.0937
Epoch   8 Batch  960/1077 - Train Accuracy: 0.9394, Validation Accuracy: 0.9183, Loss: 0.1072
Epoch   8 Batch  970/1077 - Train Accuracy: 0.9301, Validation Accuracy: 0.9112, Loss: 0.1004
Epoch   8 Batch  980/1077 - Train Accuracy: 0.9008, Validation Accuracy: 0.9205, Loss: 0.1045
Epoch   8 Batch  990/1077 - Train Accuracy: 0.9149, Validation Accuracy: 0.9180, Loss: 0.1160
Epoch   8 Batch 1000/1077 - Train Accuracy: 0.9148, Validation Accuracy: 0.9119, Loss: 0.0944
Epoch   8 Batch 1010/1077 - Train Accuracy: 0.9383, Validation Accuracy: 0.9052, Loss: 0.1094
Epoch   8 Batch 1020/1077 - Train Accuracy: 0.9195, Validation Accuracy: 0.9237, Loss: 0.1083
Epoch   8 Batch 1030/1077 - Train Accuracy: 0.9348, Validation Accuracy: 0.9130, Loss: 0.0940
Epoch   8 Batch 1040/1077 - Train Accuracy: 0.9137, Validation Accuracy: 0.9052, Loss: 0.1105
Epoch   8 Batch 1050/1077 - Train Accuracy: 0.9176, Validation Accuracy: 0.9197, Loss: 0.1091
Epoch   8 Batch 1060/1077 - Train Accuracy: 0.9305, Validation Accuracy: 0.8967, Loss: 0.1094
Epoch   8 Batch 1070/1077 - Train Accuracy: 0.8871, Validation Accuracy: 0.9158, Loss: 0.1280
Epoch   9 Batch   10/1077 - Train Accuracy: 0.9005, Validation Accuracy: 0.9268, Loss: 0.1308
Epoch   9 Batch   20/1077 - Train Accuracy: 0.9141, Validation Accuracy: 0.9169, Loss: 0.1308
Epoch   9 Batch   30/1077 - Train Accuracy: 0.9227, Validation Accuracy: 0.9158, Loss: 0.1182
Epoch   9 Batch   40/1077 - Train Accuracy: 0.9344, Validation Accuracy: 0.9073, Loss: 0.1140
Epoch   9 Batch   50/1077 - Train Accuracy: 0.9277, Validation Accuracy: 0.9109, Loss: 0.1180
Epoch   9 Batch   60/1077 - Train Accuracy: 0.9070, Validation Accuracy: 0.9020, Loss: 0.1066
Epoch   9 Batch   70/1077 - Train Accuracy: 0.8939, Validation Accuracy: 0.9073, Loss: 0.1192
Epoch   9 Batch   80/1077 - Train Accuracy: 0.8922, Validation Accuracy: 0.8899, Loss: 0.1355
Epoch   9 Batch   90/1077 - Train Accuracy: 0.9070, Validation Accuracy: 0.9116, Loss: 0.1161
Epoch   9 Batch  100/1077 - Train Accuracy: 0.9145, Validation Accuracy: 0.8945, Loss: 0.1191
Epoch   9 Batch  110/1077 - Train Accuracy: 0.9281, Validation Accuracy: 0.9070, Loss: 0.0951
Epoch   9 Batch  120/1077 - Train Accuracy: 0.9313, Validation Accuracy: 0.9126, Loss: 0.1089
Epoch   9 Batch  130/1077 - Train Accuracy: 0.8962, Validation Accuracy: 0.9055, Loss: 0.1089
Epoch   9 Batch  140/1077 - Train Accuracy: 0.9346, Validation Accuracy: 0.8928, Loss: 0.1098
Epoch   9 Batch  150/1077 - Train Accuracy: 0.9267, Validation Accuracy: 0.9102, Loss: 0.1064
Epoch   9 Batch  160/1077 - Train Accuracy: 0.9055, Validation Accuracy: 0.9059, Loss: 0.1165
Epoch   9 Batch  170/1077 - Train Accuracy: 0.8977, Validation Accuracy: 0.9251, Loss: 0.1181
Epoch   9 Batch  180/1077 - Train Accuracy: 0.9270, Validation Accuracy: 0.9148, Loss: 0.1106
Epoch   9 Batch  190/1077 - Train Accuracy: 0.9383, Validation Accuracy: 0.9197, Loss: 0.1004
Epoch   9 Batch  200/1077 - Train Accuracy: 0.9195, Validation Accuracy: 0.9116, Loss: 0.1134
Epoch   9 Batch  210/1077 - Train Accuracy: 0.9159, Validation Accuracy: 0.9251, Loss: 0.1239
Epoch   9 Batch  220/1077 - Train Accuracy: 0.9272, Validation Accuracy: 0.9201, Loss: 0.0954
Epoch   9 Batch  230/1077 - Train Accuracy: 0.9107, Validation Accuracy: 0.9130, Loss: 0.1062
Epoch   9 Batch  240/1077 - Train Accuracy: 0.9445, Validation Accuracy: 0.9165, Loss: 0.1020
Epoch   9 Batch  250/1077 - Train Accuracy: 0.9094, Validation Accuracy: 0.9094, Loss: 0.1022
Epoch   9 Batch  260/1077 - Train Accuracy: 0.9330, Validation Accuracy: 0.9205, Loss: 0.1053
Epoch   9 Batch  270/1077 - Train Accuracy: 0.9035, Validation Accuracy: 0.8974, Loss: 0.1166
Epoch   9 Batch  280/1077 - Train Accuracy: 0.8797, Validation Accuracy: 0.9016, Loss: 0.1115
Epoch   9 Batch  290/1077 - Train Accuracy: 0.9086, Validation Accuracy: 0.8899, Loss: 0.1176
Epoch   9 Batch  300/1077 - Train Accuracy: 0.9223, Validation Accuracy: 0.8999, Loss: 0.1015
Epoch   9 Batch  310/1077 - Train Accuracy: 0.9051, Validation Accuracy: 0.9180, Loss: 0.1144
Epoch   9 Batch  320/1077 - Train Accuracy: 0.9375, Validation Accuracy: 0.9116, Loss: 0.0985
Epoch   9 Batch  330/1077 - Train Accuracy: 0.9187, Validation Accuracy: 0.9237, Loss: 0.0875
Epoch   9 Batch  340/1077 - Train Accuracy: 0.9350, Validation Accuracy: 0.9237, Loss: 0.1068
Epoch   9 Batch  350/1077 - Train Accuracy: 0.9395, Validation Accuracy: 0.9222, Loss: 0.0932
Epoch   9 Batch  360/1077 - Train Accuracy: 0.9297, Validation Accuracy: 0.9180, Loss: 0.0958
Epoch   9 Batch  370/1077 - Train Accuracy: 0.9360, Validation Accuracy: 0.9094, Loss: 0.1053
Epoch   9 Batch  380/1077 - Train Accuracy: 0.9359, Validation Accuracy: 0.9222, Loss: 0.0944
Epoch   9 Batch  390/1077 - Train Accuracy: 0.8957, Validation Accuracy: 0.9237, Loss: 0.1172
Epoch   9 Batch  400/1077 - Train Accuracy: 0.9191, Validation Accuracy: 0.9311, Loss: 0.1010
Epoch   9 Batch  410/1077 - Train Accuracy: 0.9021, Validation Accuracy: 0.9237, Loss: 0.1152
Epoch   9 Batch  420/1077 - Train Accuracy: 0.9320, Validation Accuracy: 0.9226, Loss: 0.0884
Epoch   9 Batch  430/1077 - Train Accuracy: 0.9187, Validation Accuracy: 0.9240, Loss: 0.1110
Epoch   9 Batch  440/1077 - Train Accuracy: 0.8914, Validation Accuracy: 0.9126, Loss: 0.1210
Epoch   9 Batch  450/1077 - Train Accuracy: 0.9203, Validation Accuracy: 0.8952, Loss: 0.0957
Epoch   9 Batch  460/1077 - Train Accuracy: 0.9031, Validation Accuracy: 0.8977, Loss: 0.1103
Epoch   9 Batch  470/1077 - Train Accuracy: 0.9116, Validation Accuracy: 0.8984, Loss: 0.1183
Epoch   9 Batch  480/1077 - Train Accuracy: 0.9112, Validation Accuracy: 0.9134, Loss: 0.1212
Epoch   9 Batch  490/1077 - Train Accuracy: 0.8930, Validation Accuracy: 0.9219, Loss: 0.1319
Epoch   9 Batch  500/1077 - Train Accuracy: 0.9281, Validation Accuracy: 0.9102, Loss: 0.0997
Epoch   9 Batch  510/1077 - Train Accuracy: 0.8977, Validation Accuracy: 0.9261, Loss: 0.1122
Epoch   9 Batch  520/1077 - Train Accuracy: 0.9382, Validation Accuracy: 0.9048, Loss: 0.0963
Epoch   9 Batch  530/1077 - Train Accuracy: 0.9121, Validation Accuracy: 0.9240, Loss: 0.1213
Epoch   9 Batch  540/1077 - Train Accuracy: 0.9148, Validation Accuracy: 0.9119, Loss: 0.0935
Epoch   9 Batch  550/1077 - Train Accuracy: 0.9000, Validation Accuracy: 0.9034, Loss: 0.1194
Epoch   9 Batch  560/1077 - Train Accuracy: 0.9363, Validation Accuracy: 0.9055, Loss: 0.0873
Epoch   9 Batch  570/1077 - Train Accuracy: 0.9071, Validation Accuracy: 0.9052, Loss: 0.1288
Epoch   9 Batch  580/1077 - Train Accuracy: 0.9282, Validation Accuracy: 0.9247, Loss: 0.0960
Epoch   9 Batch  590/1077 - Train Accuracy: 0.9062, Validation Accuracy: 0.9265, Loss: 0.1219
Epoch   9 Batch  600/1077 - Train Accuracy: 0.9170, Validation Accuracy: 0.9151, Loss: 0.0987
Epoch   9 Batch  610/1077 - Train Accuracy: 0.9087, Validation Accuracy: 0.9265, Loss: 0.1039
Epoch   9 Batch  620/1077 - Train Accuracy: 0.9199, Validation Accuracy: 0.9059, Loss: 0.1060
Epoch   9 Batch  630/1077 - Train Accuracy: 0.9113, Validation Accuracy: 0.9190, Loss: 0.1089
Epoch   9 Batch  640/1077 - Train Accuracy: 0.9040, Validation Accuracy: 0.9290, Loss: 0.1113
Epoch   9 Batch  650/1077 - Train Accuracy: 0.9078, Validation Accuracy: 0.9283, Loss: 0.1218
Epoch   9 Batch  660/1077 - Train Accuracy: 0.9379, Validation Accuracy: 0.9219, Loss: 0.1113
Epoch   9 Batch  670/1077 - Train Accuracy: 0.9229, Validation Accuracy: 0.9052, Loss: 0.1047
Epoch   9 Batch  680/1077 - Train Accuracy: 0.9196, Validation Accuracy: 0.9162, Loss: 0.1074
Epoch   9 Batch  690/1077 - Train Accuracy: 0.8984, Validation Accuracy: 0.9201, Loss: 0.1021
Epoch   9 Batch  700/1077 - Train Accuracy: 0.9418, Validation Accuracy: 0.9300, Loss: 0.0971
Epoch   9 Batch  710/1077 - Train Accuracy: 0.9113, Validation Accuracy: 0.9304, Loss: 0.1084
Epoch   9 Batch  720/1077 - Train Accuracy: 0.9124, Validation Accuracy: 0.9251, Loss: 0.1111
Epoch   9 Batch  730/1077 - Train Accuracy: 0.8977, Validation Accuracy: 0.9293, Loss: 0.1113
Epoch   9 Batch  740/1077 - Train Accuracy: 0.9148, Validation Accuracy: 0.9258, Loss: 0.1153
Epoch   9 Batch  750/1077 - Train Accuracy: 0.9066, Validation Accuracy: 0.9212, Loss: 0.1283
Epoch   9 Batch  760/1077 - Train Accuracy: 0.9258, Validation Accuracy: 0.9322, Loss: 0.0933
Epoch   9 Batch  770/1077 - Train Accuracy: 0.9241, Validation Accuracy: 0.9162, Loss: 0.0904
Epoch   9 Batch  780/1077 - Train Accuracy: 0.8871, Validation Accuracy: 0.9062, Loss: 0.1022
Epoch   9 Batch  790/1077 - Train Accuracy: 0.8746, Validation Accuracy: 0.9112, Loss: 0.1241
Epoch   9 Batch  800/1077 - Train Accuracy: 0.9180, Validation Accuracy: 0.9197, Loss: 0.1120
Epoch   9 Batch  810/1077 - Train Accuracy: 0.9334, Validation Accuracy: 0.9173, Loss: 0.0989
Epoch   9 Batch  820/1077 - Train Accuracy: 0.9145, Validation Accuracy: 0.9283, Loss: 0.1138
Epoch   9 Batch  830/1077 - Train Accuracy: 0.8629, Validation Accuracy: 0.9364, Loss: 0.1081
Epoch   9 Batch  840/1077 - Train Accuracy: 0.9117, Validation Accuracy: 0.9212, Loss: 0.0923
Epoch   9 Batch  850/1077 - Train Accuracy: 0.9215, Validation Accuracy: 0.9190, Loss: 0.1272
Epoch   9 Batch  860/1077 - Train Accuracy: 0.9126, Validation Accuracy: 0.9411, Loss: 0.1063
Epoch   9 Batch  870/1077 - Train Accuracy: 0.9104, Validation Accuracy: 0.9240, Loss: 0.1043
Epoch   9 Batch  880/1077 - Train Accuracy: 0.9320, Validation Accuracy: 0.9329, Loss: 0.1128
Epoch   9 Batch  890/1077 - Train Accuracy: 0.9289, Validation Accuracy: 0.9290, Loss: 0.0978
Epoch   9 Batch  900/1077 - Train Accuracy: 0.9230, Validation Accuracy: 0.9141, Loss: 0.0988
Epoch   9 Batch  910/1077 - Train Accuracy: 0.9014, Validation Accuracy: 0.9126, Loss: 0.1126
Epoch   9 Batch  920/1077 - Train Accuracy: 0.9266, Validation Accuracy: 0.9325, Loss: 0.0992
Epoch   9 Batch  930/1077 - Train Accuracy: 0.9316, Validation Accuracy: 0.9059, Loss: 0.0904
Epoch   9 Batch  940/1077 - Train Accuracy: 0.9215, Validation Accuracy: 0.9105, Loss: 0.0954
Epoch   9 Batch  950/1077 - Train Accuracy: 0.9163, Validation Accuracy: 0.9105, Loss: 0.0930
Epoch   9 Batch  960/1077 - Train Accuracy: 0.9289, Validation Accuracy: 0.9066, Loss: 0.0985
Epoch   9 Batch  970/1077 - Train Accuracy: 0.9332, Validation Accuracy: 0.9286, Loss: 0.0972
Epoch   9 Batch  980/1077 - Train Accuracy: 0.9066, Validation Accuracy: 0.9251, Loss: 0.0977
Epoch   9 Batch  990/1077 - Train Accuracy: 0.9128, Validation Accuracy: 0.9144, Loss: 0.1058
Epoch   9 Batch 1000/1077 - Train Accuracy: 0.9156, Validation Accuracy: 0.9144, Loss: 0.0876
Epoch   9 Batch 1010/1077 - Train Accuracy: 0.9469, Validation Accuracy: 0.9180, Loss: 0.1028
Epoch   9 Batch 1020/1077 - Train Accuracy: 0.9227, Validation Accuracy: 0.9190, Loss: 0.0979
Epoch   9 Batch 1030/1077 - Train Accuracy: 0.9285, Validation Accuracy: 0.9116, Loss: 0.0906
Epoch   9 Batch 1040/1077 - Train Accuracy: 0.9330, Validation Accuracy: 0.9194, Loss: 0.0979
Epoch   9 Batch 1050/1077 - Train Accuracy: 0.9344, Validation Accuracy: 0.9130, Loss: 0.1017
Epoch   9 Batch 1060/1077 - Train Accuracy: 0.9484, Validation Accuracy: 0.9123, Loss: 0.0852
Epoch   9 Batch 1070/1077 - Train Accuracy: 0.9059, Validation Accuracy: 0.9194, Loss: 0.1026
Epoch  10 Batch   10/1077 - Train Accuracy: 0.9256, Validation Accuracy: 0.9219, Loss: 0.1015
Epoch  10 Batch   20/1077 - Train Accuracy: 0.9074, Validation Accuracy: 0.9247, Loss: 0.1073
Epoch  10 Batch   30/1077 - Train Accuracy: 0.9227, Validation Accuracy: 0.9286, Loss: 0.1041
Epoch  10 Batch   40/1077 - Train Accuracy: 0.9395, Validation Accuracy: 0.9311, Loss: 0.1008
Epoch  10 Batch   50/1077 - Train Accuracy: 0.9145, Validation Accuracy: 0.9052, Loss: 0.1130
Epoch  10 Batch   60/1077 - Train Accuracy: 0.9234, Validation Accuracy: 0.9134, Loss: 0.0998
Epoch  10 Batch   70/1077 - Train Accuracy: 0.9190, Validation Accuracy: 0.9151, Loss: 0.1072
Epoch  10 Batch   80/1077 - Train Accuracy: 0.9035, Validation Accuracy: 0.9173, Loss: 0.1007
Epoch  10 Batch   90/1077 - Train Accuracy: 0.9152, Validation Accuracy: 0.9276, Loss: 0.0969
Epoch  10 Batch  100/1077 - Train Accuracy: 0.9219, Validation Accuracy: 0.9162, Loss: 0.0979
Epoch  10 Batch  110/1077 - Train Accuracy: 0.9313, Validation Accuracy: 0.9226, Loss: 0.0849
Epoch  10 Batch  120/1077 - Train Accuracy: 0.9406, Validation Accuracy: 0.9244, Loss: 0.1007
Epoch  10 Batch  130/1077 - Train Accuracy: 0.9089, Validation Accuracy: 0.9194, Loss: 0.0934
Epoch  10 Batch  140/1077 - Train Accuracy: 0.9112, Validation Accuracy: 0.8999, Loss: 0.0940
Epoch  10 Batch  150/1077 - Train Accuracy: 0.9263, Validation Accuracy: 0.9162, Loss: 0.0945
Epoch  10 Batch  160/1077 - Train Accuracy: 0.9332, Validation Accuracy: 0.9290, Loss: 0.1083
Epoch  10 Batch  170/1077 - Train Accuracy: 0.9148, Validation Accuracy: 0.9176, Loss: 0.1074
Epoch  10 Batch  180/1077 - Train Accuracy: 0.9090, Validation Accuracy: 0.9158, Loss: 0.0959
Epoch  10 Batch  190/1077 - Train Accuracy: 0.9355, Validation Accuracy: 0.9109, Loss: 0.0829
Epoch  10 Batch  200/1077 - Train Accuracy: 0.9238, Validation Accuracy: 0.9226, Loss: 0.0899
Epoch  10 Batch  210/1077 - Train Accuracy: 0.9208, Validation Accuracy: 0.9173, Loss: 0.0912
Epoch  10 Batch  220/1077 - Train Accuracy: 0.9157, Validation Accuracy: 0.9237, Loss: 0.0926
Epoch  10 Batch  230/1077 - Train Accuracy: 0.9196, Validation Accuracy: 0.9187, Loss: 0.0901
Epoch  10 Batch  240/1077 - Train Accuracy: 0.9504, Validation Accuracy: 0.9116, Loss: 0.0842
Epoch  10 Batch  250/1077 - Train Accuracy: 0.9045, Validation Accuracy: 0.9134, Loss: 0.0929
Epoch  10 Batch  260/1077 - Train Accuracy: 0.9226, Validation Accuracy: 0.9119, Loss: 0.0984
Epoch  10 Batch  270/1077 - Train Accuracy: 0.9187, Validation Accuracy: 0.9197, Loss: 0.1002
Epoch  10 Batch  280/1077 - Train Accuracy: 0.9035, Validation Accuracy: 0.9201, Loss: 0.1002
Epoch  10 Batch  290/1077 - Train Accuracy: 0.9332, Validation Accuracy: 0.9109, Loss: 0.1019
Epoch  10 Batch  300/1077 - Train Accuracy: 0.9211, Validation Accuracy: 0.9141, Loss: 0.0831
Epoch  10 Batch  310/1077 - Train Accuracy: 0.9125, Validation Accuracy: 0.8995, Loss: 0.1071
Epoch  10 Batch  320/1077 - Train Accuracy: 0.9266, Validation Accuracy: 0.9059, Loss: 0.0906
Epoch  10 Batch  330/1077 - Train Accuracy: 0.9098, Validation Accuracy: 0.9126, Loss: 0.0850
Epoch  10 Batch  340/1077 - Train Accuracy: 0.9437, Validation Accuracy: 0.9155, Loss: 0.0975
Epoch  10 Batch  350/1077 - Train Accuracy: 0.9254, Validation Accuracy: 0.9339, Loss: 0.0864
Epoch  10 Batch  360/1077 - Train Accuracy: 0.9316, Validation Accuracy: 0.9130, Loss: 0.1002
Epoch  10 Batch  370/1077 - Train Accuracy: 0.9375, Validation Accuracy: 0.9105, Loss: 0.1005
Epoch  10 Batch  380/1077 - Train Accuracy: 0.9437, Validation Accuracy: 0.9286, Loss: 0.0858
Epoch  10 Batch  390/1077 - Train Accuracy: 0.8938, Validation Accuracy: 0.9244, Loss: 0.1070
Epoch  10 Batch  400/1077 - Train Accuracy: 0.9559, Validation Accuracy: 0.9155, Loss: 0.0952
Epoch  10 Batch  410/1077 - Train Accuracy: 0.9042, Validation Accuracy: 0.9116, Loss: 0.1057
Epoch  10 Batch  420/1077 - Train Accuracy: 0.9332, Validation Accuracy: 0.9165, Loss: 0.0884
Epoch  10 Batch  430/1077 - Train Accuracy: 0.9230, Validation Accuracy: 0.9194, Loss: 0.0986
Epoch  10 Batch  440/1077 - Train Accuracy: 0.9074, Validation Accuracy: 0.9272, Loss: 0.1118
Epoch  10 Batch  450/1077 - Train Accuracy: 0.9289, Validation Accuracy: 0.9173, Loss: 0.0843
Epoch  10 Batch  460/1077 - Train Accuracy: 0.8980, Validation Accuracy: 0.9137, Loss: 0.0907
Epoch  10 Batch  470/1077 - Train Accuracy: 0.9276, Validation Accuracy: 0.9009, Loss: 0.0972
Epoch  10 Batch  480/1077 - Train Accuracy: 0.9268, Validation Accuracy: 0.9187, Loss: 0.0947
Epoch  10 Batch  490/1077 - Train Accuracy: 0.9027, Validation Accuracy: 0.9176, Loss: 0.0959
Epoch  10 Batch  500/1077 - Train Accuracy: 0.9309, Validation Accuracy: 0.9176, Loss: 0.0835
Epoch  10 Batch  510/1077 - Train Accuracy: 0.8734, Validation Accuracy: 0.8700, Loss: 0.1506
Epoch  10 Batch  520/1077 - Train Accuracy: 0.9100, Validation Accuracy: 0.8903, Loss: 0.1246
Epoch  10 Batch  530/1077 - Train Accuracy: 0.8938, Validation Accuracy: 0.9016, Loss: 0.1222
Epoch  10 Batch  540/1077 - Train Accuracy: 0.9398, Validation Accuracy: 0.9073, Loss: 0.0937
Epoch  10 Batch  550/1077 - Train Accuracy: 0.9031, Validation Accuracy: 0.9087, Loss: 0.1221
Epoch  10 Batch  560/1077 - Train Accuracy: 0.9125, Validation Accuracy: 0.9102, Loss: 0.0915
Epoch  10 Batch  570/1077 - Train Accuracy: 0.9169, Validation Accuracy: 0.9087, Loss: 0.1243
Epoch  10 Batch  580/1077 - Train Accuracy: 0.9368, Validation Accuracy: 0.8864, Loss: 0.0963
Epoch  10 Batch  590/1077 - Train Accuracy: 0.9001, Validation Accuracy: 0.9148, Loss: 0.1045
Epoch  10 Batch  600/1077 - Train Accuracy: 0.9267, Validation Accuracy: 0.9261, Loss: 0.0930
Epoch  10 Batch  610/1077 - Train Accuracy: 0.9017, Validation Accuracy: 0.9158, Loss: 0.1062
Epoch  10 Batch  620/1077 - Train Accuracy: 0.9230, Validation Accuracy: 0.9308, Loss: 0.0879
Epoch  10 Batch  630/1077 - Train Accuracy: 0.9242, Validation Accuracy: 0.9215, Loss: 0.0886
Epoch  10 Batch  640/1077 - Train Accuracy: 0.9237, Validation Accuracy: 0.9091, Loss: 0.1033
Epoch  10 Batch  650/1077 - Train Accuracy: 0.9047, Validation Accuracy: 0.9062, Loss: 0.1020
Epoch  10 Batch  660/1077 - Train Accuracy: 0.9375, Validation Accuracy: 0.9173, Loss: 0.0952
Epoch  10 Batch  670/1077 - Train Accuracy: 0.9364, Validation Accuracy: 0.9144, Loss: 0.0824
Epoch  10 Batch  680/1077 - Train Accuracy: 0.9278, Validation Accuracy: 0.9283, Loss: 0.0923
Epoch  10 Batch  690/1077 - Train Accuracy: 0.9000, Validation Accuracy: 0.9084, Loss: 0.0911
Epoch  10 Batch  700/1077 - Train Accuracy: 0.9187, Validation Accuracy: 0.8952, Loss: 0.0968
Epoch  10 Batch  710/1077 - Train Accuracy: 0.9074, Validation Accuracy: 0.9304, Loss: 0.1023
Epoch  10 Batch  720/1077 - Train Accuracy: 0.9153, Validation Accuracy: 0.9183, Loss: 0.1104
Epoch  10 Batch  730/1077 - Train Accuracy: 0.9203, Validation Accuracy: 0.9112, Loss: 0.1099
Epoch  10 Batch  740/1077 - Train Accuracy: 0.9320, Validation Accuracy: 0.9158, Loss: 0.1031
Epoch  10 Batch  750/1077 - Train Accuracy: 0.9133, Validation Accuracy: 0.9084, Loss: 0.0989
Epoch  10 Batch  760/1077 - Train Accuracy: 0.9277, Validation Accuracy: 0.9208, Loss: 0.0873
Epoch  10 Batch  770/1077 - Train Accuracy: 0.9334, Validation Accuracy: 0.9201, Loss: 0.0805
Epoch  10 Batch  780/1077 - Train Accuracy: 0.9102, Validation Accuracy: 0.9318, Loss: 0.0832
Epoch  10 Batch  790/1077 - Train Accuracy: 0.8895, Validation Accuracy: 0.9183, Loss: 0.0980
Epoch  10 Batch  800/1077 - Train Accuracy: 0.9332, Validation Accuracy: 0.9116, Loss: 0.0779
Epoch  10 Batch  810/1077 - Train Accuracy: 0.9282, Validation Accuracy: 0.9084, Loss: 0.0825
Epoch  10 Batch  820/1077 - Train Accuracy: 0.9199, Validation Accuracy: 0.9286, Loss: 0.0966
Epoch  10 Batch  830/1077 - Train Accuracy: 0.8855, Validation Accuracy: 0.9130, Loss: 0.0993
Epoch  10 Batch  840/1077 - Train Accuracy: 0.9191, Validation Accuracy: 0.9194, Loss: 0.0943
Epoch  10 Batch  850/1077 - Train Accuracy: 0.9204, Validation Accuracy: 0.9293, Loss: 0.1055
Epoch  10 Batch  860/1077 - Train Accuracy: 0.9330, Validation Accuracy: 0.9155, Loss: 0.0838
Epoch  10 Batch  870/1077 - Train Accuracy: 0.9301, Validation Accuracy: 0.9297, Loss: 0.0811
Epoch  10 Batch  880/1077 - Train Accuracy: 0.9324, Validation Accuracy: 0.9205, Loss: 0.0813
Epoch  10 Batch  890/1077 - Train Accuracy: 0.9330, Validation Accuracy: 0.9141, Loss: 0.0769
Epoch  10 Batch  900/1077 - Train Accuracy: 0.9254, Validation Accuracy: 0.9279, Loss: 0.0854
Epoch  10 Batch  910/1077 - Train Accuracy: 0.9356, Validation Accuracy: 0.9311, Loss: 0.0811
Epoch  10 Batch  920/1077 - Train Accuracy: 0.9336, Validation Accuracy: 0.9233, Loss: 0.0866
Epoch  10 Batch  930/1077 - Train Accuracy: 0.9434, Validation Accuracy: 0.9233, Loss: 0.0730
Epoch  10 Batch  940/1077 - Train Accuracy: 0.9313, Validation Accuracy: 0.9173, Loss: 0.0817
Epoch  10 Batch  950/1077 - Train Accuracy: 0.9193, Validation Accuracy: 0.9141, Loss: 0.0659
Epoch  10 Batch  960/1077 - Train Accuracy: 0.9297, Validation Accuracy: 0.9244, Loss: 0.0796
Epoch  10 Batch  970/1077 - Train Accuracy: 0.9426, Validation Accuracy: 0.9251, Loss: 0.0743
Epoch  10 Batch  980/1077 - Train Accuracy: 0.9176, Validation Accuracy: 0.8995, Loss: 0.0737
Epoch  10 Batch  990/1077 - Train Accuracy: 0.9305, Validation Accuracy: 0.9126, Loss: 0.0848
Epoch  10 Batch 1000/1077 - Train Accuracy: 0.9342, Validation Accuracy: 0.9222, Loss: 0.0714
Epoch  10 Batch 1010/1077 - Train Accuracy: 0.9441, Validation Accuracy: 0.9148, Loss: 0.0799
Epoch  10 Batch 1020/1077 - Train Accuracy: 0.9418, Validation Accuracy: 0.9279, Loss: 0.0717
Epoch  10 Batch 1030/1077 - Train Accuracy: 0.9172, Validation Accuracy: 0.9006, Loss: 0.0844
Epoch  10 Batch 1040/1077 - Train Accuracy: 0.9297, Validation Accuracy: 0.9084, Loss: 0.0876
Epoch  10 Batch 1050/1077 - Train Accuracy: 0.9281, Validation Accuracy: 0.9116, Loss: 0.0980
Epoch  10 Batch 1060/1077 - Train Accuracy: 0.9563, Validation Accuracy: 0.9215, Loss: 0.0693
Epoch  10 Batch 1070/1077 - Train Accuracy: 0.9141, Validation Accuracy: 0.9347, Loss: 0.0880
Epoch  11 Batch   10/1077 - Train Accuracy: 0.9071, Validation Accuracy: 0.9297, Loss: 0.0844
Epoch  11 Batch   20/1077 - Train Accuracy: 0.9117, Validation Accuracy: 0.9169, Loss: 0.0973
Epoch  11 Batch   30/1077 - Train Accuracy: 0.9355, Validation Accuracy: 0.9222, Loss: 0.0975
Epoch  11 Batch   40/1077 - Train Accuracy: 0.9504, Validation Accuracy: 0.9130, Loss: 0.0867
Epoch  11 Batch   50/1077 - Train Accuracy: 0.9395, Validation Accuracy: 0.9045, Loss: 0.0832
Epoch  11 Batch   60/1077 - Train Accuracy: 0.9334, Validation Accuracy: 0.9144, Loss: 0.0738
Epoch  11 Batch   70/1077 - Train Accuracy: 0.8984, Validation Accuracy: 0.9194, Loss: 0.0842
Epoch  11 Batch   80/1077 - Train Accuracy: 0.9125, Validation Accuracy: 0.9162, Loss: 0.0876
Epoch  11 Batch   90/1077 - Train Accuracy: 0.9199, Validation Accuracy: 0.9293, Loss: 0.0920
Epoch  11 Batch  100/1077 - Train Accuracy: 0.9242, Validation Accuracy: 0.9361, Loss: 0.0852
Epoch  11 Batch  110/1077 - Train Accuracy: 0.9414, Validation Accuracy: 0.9375, Loss: 0.0720
Epoch  11 Batch  120/1077 - Train Accuracy: 0.9418, Validation Accuracy: 0.9379, Loss: 0.0783
Epoch  11 Batch  130/1077 - Train Accuracy: 0.9263, Validation Accuracy: 0.9233, Loss: 0.0829
Epoch  11 Batch  140/1077 - Train Accuracy: 0.9182, Validation Accuracy: 0.9272, Loss: 0.0740
Epoch  11 Batch  150/1077 - Train Accuracy: 0.9375, Validation Accuracy: 0.9219, Loss: 0.0701
Epoch  11 Batch  160/1077 - Train Accuracy: 0.9273, Validation Accuracy: 0.9290, Loss: 0.0778
Epoch  11 Batch  170/1077 - Train Accuracy: 0.9230, Validation Accuracy: 0.9347, Loss: 0.0850
Epoch  11 Batch  180/1077 - Train Accuracy: 0.9145, Validation Accuracy: 0.9126, Loss: 0.0793
Epoch  11 Batch  190/1077 - Train Accuracy: 0.9246, Validation Accuracy: 0.9268, Loss: 0.0721
Epoch  11 Batch  200/1077 - Train Accuracy: 0.9316, Validation Accuracy: 0.9162, Loss: 0.0831
Epoch  11 Batch  210/1077 - Train Accuracy: 0.9211, Validation Accuracy: 0.9311, Loss: 0.0828
Epoch  11 Batch  220/1077 - Train Accuracy: 0.9330, Validation Accuracy: 0.9318, Loss: 0.0790
Epoch  11 Batch  230/1077 - Train Accuracy: 0.9185, Validation Accuracy: 0.9233, Loss: 0.0730
Epoch  11 Batch  240/1077 - Train Accuracy: 0.9582, Validation Accuracy: 0.9244, Loss: 0.0718
Epoch  11 Batch  250/1077 - Train Accuracy: 0.9279, Validation Accuracy: 0.9105, Loss: 0.0748
Epoch  11 Batch  260/1077 - Train Accuracy: 0.9211, Validation Accuracy: 0.9361, Loss: 0.0656
Epoch  11 Batch  270/1077 - Train Accuracy: 0.9410, Validation Accuracy: 0.9329, Loss: 0.0803
Epoch  11 Batch  280/1077 - Train Accuracy: 0.9020, Validation Accuracy: 0.9141, Loss: 0.0802
Epoch  11 Batch  290/1077 - Train Accuracy: 0.9328, Validation Accuracy: 0.9094, Loss: 0.0885
Epoch  11 Batch  300/1077 - Train Accuracy: 0.9280, Validation Accuracy: 0.9190, Loss: 0.0855
Epoch  11 Batch  310/1077 - Train Accuracy: 0.9187, Validation Accuracy: 0.9261, Loss: 0.0925
Epoch  11 Batch  320/1077 - Train Accuracy: 0.9406, Validation Accuracy: 0.9194, Loss: 0.0703
Epoch  11 Batch  330/1077 - Train Accuracy: 0.9215, Validation Accuracy: 0.9300, Loss: 0.0753
Epoch  11 Batch  340/1077 - Train Accuracy: 0.9412, Validation Accuracy: 0.9339, Loss: 0.0763
Epoch  11 Batch  350/1077 - Train Accuracy: 0.9434, Validation Accuracy: 0.9332, Loss: 0.0656
Epoch  11 Batch  360/1077 - Train Accuracy: 0.9316, Validation Accuracy: 0.9411, Loss: 0.0725
Epoch  11 Batch  370/1077 - Train Accuracy: 0.9368, Validation Accuracy: 0.9304, Loss: 0.0767
Epoch  11 Batch  380/1077 - Train Accuracy: 0.9488, Validation Accuracy: 0.9347, Loss: 0.0669
Epoch  11 Batch  390/1077 - Train Accuracy: 0.8988, Validation Accuracy: 0.9229, Loss: 0.0878
Epoch  11 Batch  400/1077 - Train Accuracy: 0.9320, Validation Accuracy: 0.9322, Loss: 0.0789
Epoch  11 Batch  410/1077 - Train Accuracy: 0.9050, Validation Accuracy: 0.9311, Loss: 0.0895
Epoch  11 Batch  420/1077 - Train Accuracy: 0.9559, Validation Accuracy: 0.9339, Loss: 0.0615
Epoch  11 Batch  430/1077 - Train Accuracy: 0.9246, Validation Accuracy: 0.9400, Loss: 0.0744
Epoch  11 Batch  440/1077 - Train Accuracy: 0.9086, Validation Accuracy: 0.9240, Loss: 0.0822
Epoch  11 Batch  450/1077 - Train Accuracy: 0.9301, Validation Accuracy: 0.9247, Loss: 0.0714
Epoch  11 Batch  460/1077 - Train Accuracy: 0.9211, Validation Accuracy: 0.9336, Loss: 0.0713
Epoch  11 Batch  470/1077 - Train Accuracy: 0.9461, Validation Accuracy: 0.9272, Loss: 0.0737
Epoch  11 Batch  480/1077 - Train Accuracy: 0.9346, Validation Accuracy: 0.9336, Loss: 0.0585
Epoch  11 Batch  490/1077 - Train Accuracy: 0.9273, Validation Accuracy: 0.9343, Loss: 0.0729
Epoch  11 Batch  500/1077 - Train Accuracy: 0.9437, Validation Accuracy: 0.9308, Loss: 0.0698
Epoch  11 Batch  510/1077 - Train Accuracy: 0.9082, Validation Accuracy: 0.9240, Loss: 0.0821
Epoch  11 Batch  520/1077 - Train Accuracy: 0.9427, Validation Accuracy: 0.9201, Loss: 0.0632
Epoch  11 Batch  530/1077 - Train Accuracy: 0.9070, Validation Accuracy: 0.9130, Loss: 0.0851
Epoch  11 Batch  540/1077 - Train Accuracy: 0.9441, Validation Accuracy: 0.9279, Loss: 0.0714
Epoch  11 Batch  550/1077 - Train Accuracy: 0.9211, Validation Accuracy: 0.9308, Loss: 0.0786
Epoch  11 Batch  560/1077 - Train Accuracy: 0.9293, Validation Accuracy: 0.9297, Loss: 0.0618
Epoch  11 Batch  570/1077 - Train Accuracy: 0.9108, Validation Accuracy: 0.9375, Loss: 0.0788
Epoch  11 Batch  580/1077 - Train Accuracy: 0.9639, Validation Accuracy: 0.9382, Loss: 0.0653
Epoch  11 Batch  590/1077 - Train Accuracy: 0.9062, Validation Accuracy: 0.9382, Loss: 0.0866
Epoch  11 Batch  600/1077 - Train Accuracy: 0.9531, Validation Accuracy: 0.9300, Loss: 0.0672
Epoch  11 Batch  610/1077 - Train Accuracy: 0.9211, Validation Accuracy: 0.9418, Loss: 0.0735
Epoch  11 Batch  620/1077 - Train Accuracy: 0.9426, Validation Accuracy: 0.9293, Loss: 0.0670
Epoch  11 Batch  630/1077 - Train Accuracy: 0.9363, Validation Accuracy: 0.9347, Loss: 0.0730
Epoch  11 Batch  640/1077 - Train Accuracy: 0.9304, Validation Accuracy: 0.9290, Loss: 0.0620
Epoch  11 Batch  650/1077 - Train Accuracy: 0.9258, Validation Accuracy: 0.9173, Loss: 0.0744
Epoch  11 Batch  660/1077 - Train Accuracy: 0.9398, Validation Accuracy: 0.9453, Loss: 0.0653
Epoch  11 Batch  670/1077 - Train Accuracy: 0.9517, Validation Accuracy: 0.9428, Loss: 0.0634
Epoch  11 Batch  680/1077 - Train Accuracy: 0.9420, Validation Accuracy: 0.9315, Loss: 0.0686
Epoch  11 Batch  690/1077 - Train Accuracy: 0.9133, Validation Accuracy: 0.8906, Loss: 0.1087
Epoch  11 Batch  700/1077 - Train Accuracy: 0.9277, Validation Accuracy: 0.9190, Loss: 0.0972
Epoch  11 Batch  710/1077 - Train Accuracy: 0.9230, Validation Accuracy: 0.9265, Loss: 0.0970
Epoch  11 Batch  720/1077 - Train Accuracy: 0.9178, Validation Accuracy: 0.9233, Loss: 0.0946
Epoch  11 Batch  730/1077 - Train Accuracy: 0.9062, Validation Accuracy: 0.9222, Loss: 0.0846
Epoch  11 Batch  740/1077 - Train Accuracy: 0.9285, Validation Accuracy: 0.9347, Loss: 0.0850
Epoch  11 Batch  750/1077 - Train Accuracy: 0.9262, Validation Accuracy: 0.9272, Loss: 0.0894
Epoch  11 Batch  760/1077 - Train Accuracy: 0.9437, Validation Accuracy: 0.9393, Loss: 0.0874
Epoch  11 Batch  770/1077 - Train Accuracy: 0.9435, Validation Accuracy: 0.9407, Loss: 0.0776
Epoch  11 Batch  780/1077 - Train Accuracy: 0.8988, Validation Accuracy: 0.9325, Loss: 0.0937
Epoch  11 Batch  790/1077 - Train Accuracy: 0.9289, Validation Accuracy: 0.9336, Loss: 0.0890
Epoch  11 Batch  800/1077 - Train Accuracy: 0.9410, Validation Accuracy: 0.9258, Loss: 0.0663
Epoch  11 Batch  810/1077 - Train Accuracy: 0.9420, Validation Accuracy: 0.9276, Loss: 0.0687
Epoch  11 Batch  820/1077 - Train Accuracy: 0.9387, Validation Accuracy: 0.9442, Loss: 0.0819
Epoch  11 Batch  830/1077 - Train Accuracy: 0.8918, Validation Accuracy: 0.9382, Loss: 0.0904
Epoch  11 Batch  840/1077 - Train Accuracy: 0.9254, Validation Accuracy: 0.9492, Loss: 0.0809
Epoch  11 Batch  850/1077 - Train Accuracy: 0.9215, Validation Accuracy: 0.9400, Loss: 0.0801
Epoch  11 Batch  860/1077 - Train Accuracy: 0.9282, Validation Accuracy: 0.9379, Loss: 0.0736
Epoch  11 Batch  870/1077 - Train Accuracy: 0.9268, Validation Accuracy: 0.9354, Loss: 0.0726
Epoch  11 Batch  880/1077 - Train Accuracy: 0.9477, Validation Accuracy: 0.9311, Loss: 0.0765
Epoch  11 Batch  890/1077 - Train Accuracy: 0.9386, Validation Accuracy: 0.9272, Loss: 0.0661
Epoch  11 Batch  900/1077 - Train Accuracy: 0.9340, Validation Accuracy: 0.9492, Loss: 0.0722
Epoch  11 Batch  910/1077 - Train Accuracy: 0.9304, Validation Accuracy: 0.9389, Loss: 0.0729
Epoch  11 Batch  920/1077 - Train Accuracy: 0.9277, Validation Accuracy: 0.9265, Loss: 0.0690
Epoch  11 Batch  930/1077 - Train Accuracy: 0.9512, Validation Accuracy: 0.9478, Loss: 0.0579
Epoch  11 Batch  940/1077 - Train Accuracy: 0.9406, Validation Accuracy: 0.9304, Loss: 0.0770
Epoch  11 Batch  950/1077 - Train Accuracy: 0.9334, Validation Accuracy: 0.9361, Loss: 0.0498
Epoch  11 Batch  960/1077 - Train Accuracy: 0.9435, Validation Accuracy: 0.9308, Loss: 0.0601
Epoch  11 Batch  970/1077 - Train Accuracy: 0.9289, Validation Accuracy: 0.9272, Loss: 0.0737
Epoch  11 Batch  980/1077 - Train Accuracy: 0.9180, Validation Accuracy: 0.9379, Loss: 0.0766
Epoch  11 Batch  990/1077 - Train Accuracy: 0.9174, Validation Accuracy: 0.9212, Loss: 0.0844
Epoch  11 Batch 1000/1077 - Train Accuracy: 0.9159, Validation Accuracy: 0.9137, Loss: 0.0644
Epoch  11 Batch 1010/1077 - Train Accuracy: 0.9445, Validation Accuracy: 0.9244, Loss: 0.0742
Epoch  11 Batch 1020/1077 - Train Accuracy: 0.9449, Validation Accuracy: 0.9308, Loss: 0.0723
Epoch  11 Batch 1030/1077 - Train Accuracy: 0.9410, Validation Accuracy: 0.9158, Loss: 0.0696
Epoch  11 Batch 1040/1077 - Train Accuracy: 0.9400, Validation Accuracy: 0.9162, Loss: 0.0693
Epoch  11 Batch 1050/1077 - Train Accuracy: 0.9480, Validation Accuracy: 0.9240, Loss: 0.0644
Epoch  11 Batch 1060/1077 - Train Accuracy: 0.9609, Validation Accuracy: 0.9265, Loss: 0.0593
Epoch  11 Batch 1070/1077 - Train Accuracy: 0.9207, Validation Accuracy: 0.9311, Loss: 0.0701
Model Trained and Saved

Save Parameters

Save the batch_size and save_path parameters for inference.


In [28]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
# Save parameters for checkpoint
helper.save_params(save_path)

Checkpoint


In [29]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import tensorflow as tf
import numpy as np
import helper
import problem_unittests as tests

_, (source_vocab_to_int, target_vocab_to_int), (source_int_to_vocab, target_int_to_vocab) = helper.load_preprocess()
load_path = helper.load_params()

Sentence to Sequence

To feed a sentence into the model for translation, you first need to preprocess it. Implement the function sentence_to_seq() to preprocess new sentences.

  • Convert the sentence to lowercase
  • Convert words into ids using vocab_to_int
    • Convert words not in the vocabulary to the <UNK> word id.
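
The steps above can be sketched with a plain dict lookup. This is a toy illustration, not the project's implementation: the `sentence_to_ids` helper name and the mini vocabulary are made up for the example (a few ids are copied from the vocabulary dump below; 'zebra' is deliberately out of vocabulary):

```python
def sentence_to_ids(sentence, vocab_to_int):
    # Lowercase and split on whitespace, then map each word to its id,
    # falling back to the <UNK> id for out-of-vocabulary words.
    unk_id = vocab_to_int['<UNK>']
    return [vocab_to_int.get(word, unk_id) for word in sentence.lower().split()]

# Toy vocabulary for illustration only.
toy_vocab = {'<UNK>': 2, 'he': 116, 'saw': 28, 'a': 158, 'truck': 189, '.': 75}
print(sentence_to_ids('He saw a zebra .', toy_vocab))  # [116, 28, 158, 2, 75]
```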

In [30]:
source_vocab_to_int


Out[30]:
{',': 101,
 '.': 75,
 '<EOS>': 1,
 '<GO>': 3,
 '<PAD>': 0,
 '<UNK>': 2,
 '?': 130,
 'a': 158,
 'am': 57,
 'and': 201,
 'animal': 106,
 'animals': 192,
 'apple': 69,
 'apples': 89,
 'april': 82,
 'are': 175,
 "aren't": 188,
 'august': 46,
 'automobile': 111,
 'autumn': 145,
 'banana': 162,
 'bananas': 20,
 'bear': 118,
 'bears': 14,
 'beautiful': 25,
 'been': 114,
 'between': 58,
 'big': 42,
 'bird': 125,
 'birds': 102,
 'black': 53,
 'blue': 126,
 'busy': 79,
 'but': 173,
 'california': 108,
 'car': 26,
 'cat': 197,
 'cats': 86,
 'chilly': 49,
 'china': 172,
 'chinese': 70,
 'cold': 40,
 'december': 165,
 'did': 166,
 "didn't": 36,
 'difficult': 43,
 'dislike': 181,
 'disliked': 55,
 'dislikes': 77,
 'do': 21,
 'does': 50,
 'dog': 95,
 'dogs': 68,
 'drives': 150,
 'driving': 174,
 'drove': 131,
 'dry': 103,
 'during': 34,
 'easy': 170,
 'eiffel': 178,
 'elephant': 80,
 'elephants': 30,
 'english': 180,
 'fall': 100,
 'favorite': 112,
 'feared': 41,
 'february': 35,
 'field': 143,
 'football': 159,
 'france': 202,
 'freezing': 164,
 'french': 16,
 'fruit': 198,
 'fun': 203,
 'go': 74,
 'going': 99,
 'grape': 78,
 'grapefruit': 52,
 'grapes': 66,
 'green': 148,
 'grocery': 60,
 'has': 121,
 'have': 196,
 'he': 116,
 'her': 160,
 'his': 168,
 'horse': 176,
 'horses': 171,
 'hot': 182,
 'how': 94,
 'i': 205,
 'in': 134,
 'india': 23,
 'is': 156,
 "isn't": 195,
 'it': 117,
 "it's": 139,
 'january': 104,
 'jersey': 132,
 'july': 44,
 'june': 140,
 'lake': 67,
 'last': 33,
 'least': 136,
 'lemon': 12,
 'lemons': 22,
 'like': 107,
 'liked': 81,
 'likes': 27,
 'lime': 45,
 'limes': 5,
 'lion': 191,
 'lions': 149,
 'little': 96,
 'loved': 4,
 'mango': 29,
 'mangoes': 110,
 'march': 190,
 'may': 105,
 'mice': 39,
 'might': 76,
 'mild': 141,
 'monkey': 7,
 'monkeys': 185,
 'most': 119,
 'mouse': 63,
 'my': 152,
 'never': 153,
 'new': 13,
 'next': 187,
 'nice': 169,
 'november': 179,
 'october': 88,
 'old': 200,
 'orange': 167,
 'oranges': 15,
 'our': 186,
 'paris': 85,
 'peach': 115,
 'peaches': 10,
 'pear': 161,
 'pears': 123,
 'plan': 204,
 'plans': 97,
 'pleasant': 59,
 'portuguese': 24,
 'quiet': 6,
 'rabbit': 124,
 'rabbits': 120,
 'rainy': 183,
 'red': 138,
 'relaxing': 194,
 'rusty': 8,
 'saw': 28,
 'school': 73,
 'september': 142,
 'shark': 87,
 'sharks': 163,
 'she': 98,
 'shiny': 72,
 'snake': 9,
 'snakes': 146,
 'snowy': 177,
 'sometimes': 151,
 'spanish': 90,
 'spring': 109,
 'states': 32,
 'store': 92,
 'strawberries': 83,
 'strawberry': 18,
 'summer': 127,
 'that': 11,
 'the': 62,
 'their': 199,
 'they': 17,
 'think': 48,
 'thinks': 144,
 'this': 135,
 'to': 38,
 'tower': 122,
 'translate': 147,
 'translating': 184,
 'truck': 189,
 'united': 128,
 'usually': 113,
 'visit': 37,
 'want': 54,
 'wanted': 91,
 'wants': 93,
 'warm': 19,
 'was': 154,
 'we': 129,
 'weather': 137,
 'went': 133,
 'were': 47,
 'wet': 51,
 'when': 155,
 'where': 61,
 'white': 71,
 'why': 157,
 'winter': 56,
 'wonderful': 84,
 'would': 193,
 'yellow': 64,
 'you': 31,
 'your': 65}

In [31]:
import collections

In [32]:
def sentence_to_seq(sentence, vocab_to_int):
    """
    Convert a sentence to a sequence of ids
    :param sentence: String
    :param vocab_to_int: Dictionary to go from the words to an id
    :return: List of word ids
    """
    # defaultdict calls its factory with no arguments, so a zero-argument
    # function returning the <UNK> id covers out-of-vocabulary words
    def unk():
        return vocab_to_int['<UNK>']
    dd = collections.defaultdict(unk, vocab_to_int)

    return [dd[k] for k in sentence.lower().split()]


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_sentence_to_seq(sentence_to_seq)


Tests Passed

Translate

This will translate translate_sentence from English to French.


In [33]:
translate_sentence = 'he saw a old yellow truck .'


"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
translate_sentence = sentence_to_seq(translate_sentence, source_vocab_to_int)

loaded_graph = tf.Graph()
with tf.Session(graph=loaded_graph) as sess:
    # Load saved model
    loader = tf.train.import_meta_graph(load_path + '.meta')
    loader.restore(sess, load_path)

    input_data = loaded_graph.get_tensor_by_name('input:0')
    logits = loaded_graph.get_tensor_by_name('predictions:0')
    target_sequence_length = loaded_graph.get_tensor_by_name('target_sequence_length:0')
    source_sequence_length = loaded_graph.get_tensor_by_name('source_sequence_length:0')
    keep_prob = loaded_graph.get_tensor_by_name('keep_prob:0')

    translate_logits = sess.run(logits, {input_data: [translate_sentence]*batch_size,
                                         target_sequence_length: [len(translate_sentence)*2]*batch_size,
                                         source_sequence_length: [len(translate_sentence)]*batch_size,
                                         keep_prob: 1.0})[0]

print('Input')
print('  Word Ids:      {}'.format([i for i in translate_sentence]))
print('  English Words: {}'.format([source_int_to_vocab[i] for i in translate_sentence]))

print('\nPrediction')
print('  Word Ids:      {}'.format([i for i in translate_logits]))
print('  French Words: {}'.format(" ".join([target_int_to_vocab[i] for i in translate_logits])))


INFO:tensorflow:Restoring parameters from checkpoints/dev
Input
  Word Ids:      [116, 28, 158, 200, 64, 189, 75]
  English Words: ['he', 'saw', 'a', 'old', 'yellow', 'truck', '.']

Prediction
  Word Ids:      [277, 163, 336, 57, 201, 172, 188, 219, 1]
  French Words: il a vu un vieux camion jaune . <EOS>
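
Note how the inference cell tiles the single sentence batch_size times: the saved graph was built for full batches, so a lone example has to be replicated to fit the input placeholder, and only the first row of the predictions is kept (hence the `[0]` index on the sess.run result). A minimal NumPy illustration (the batch_size of 4 is an assumed value for the sketch, not the notebook's actual setting):

```python
import numpy as np

# The saved graph expects a full batch, so one sentence is repeated
# batch_size times before being fed to the input placeholder.
batch_size = 4  # illustrative value only
sentence_ids = [116, 28, 158, 200, 64, 189, 75]  # 'he saw a old yellow truck .'
batch = np.array([sentence_ids] * batch_size)
print(batch.shape)  # (4, 7)
```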

Imperfect Translation

You might notice that some sentences translate better than others. Since the dataset you're using has a vocabulary of only 227 English words, out of the thousands used in everyday English, you will only see good results on sentences that use those words. For this project, you don't need a perfect translation. However, if you want to build a better translation model, you'll need better data.
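
Before feeding a new sentence to the model, it can help to check how much of it is covered by the small vocabulary, since anything outside it is replaced by <UNK> and translated poorly. A quick sketch; the `unknown_words` helper and the mini vocabulary here are illustrative, not part of the project code (in the notebook you would pass `source_vocab_to_int` instead of the toy dict):

```python
def unknown_words(sentence, vocab_to_int):
    # Return the words that would fall back to <UNK> during preprocessing.
    return [w for w in sentence.lower().split() if w not in vocab_to_int]

# Toy stand-in for the real source_vocab_to_int, for illustration only.
toy_vocab = {'<UNK>': 2, 'the': 62, 'car': 26, 'is': 156, 'yellow': 64, '.': 75}
print(unknown_words('the car is fast .', toy_vocab))  # ['fast']
```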

You can train on the WMT10 French-English corpus. This dataset has a larger vocabulary and covers a richer range of topics. However, it will take days to train on, so make sure you have a GPU and that the neural network performs well on the dataset we provided. Just make sure you play with the WMT10 corpus after you've submitted this project.

Submitting This Project

When submitting this project, make sure to run all the cells before saving the notebook. Save the notebook file as "dlnd_language_translation.ipynb" and save it as an HTML file under "File" -> "Download as". Include the "helper.py" and "problem_unittests.py" files in your submission.


In [34]:
def translate(sentence, quiet=False):

    translate_sentence = sentence_to_seq(sentence, source_vocab_to_int)

    loaded_graph = tf.Graph()
    with tf.Session(graph=loaded_graph) as sess:
        # Load saved model
        loader = tf.train.import_meta_graph(load_path + '.meta')
        loader.restore(sess, load_path)

        input_data = loaded_graph.get_tensor_by_name('input:0')
        logits = loaded_graph.get_tensor_by_name('predictions:0')
        target_sequence_length = loaded_graph.get_tensor_by_name('target_sequence_length:0')
        source_sequence_length = loaded_graph.get_tensor_by_name('source_sequence_length:0')
        keep_prob = loaded_graph.get_tensor_by_name('keep_prob:0')

        translate_logits = sess.run(logits, {input_data: [translate_sentence]*batch_size,
                                             target_sequence_length: [len(translate_sentence)*2]*batch_size,
                                             source_sequence_length: [len(translate_sentence)]*batch_size,
                                             keep_prob: 1.0})[0]
        
    output = " ".join([target_int_to_vocab[i] for i in translate_logits])

    if not quiet:
        print('Input')
        print('  Word Ids:      {}'.format([i for i in translate_sentence]))
        print('  English Words: {}'.format([source_int_to_vocab[i] for i in translate_sentence]))

        print('\nPrediction')
        print('  Word Ids:      {}'.format([i for i in translate_logits]))
        print('  French Words: {}'.format(output))
    return output

In [35]:
translate("the car is yellow .", True)


INFO:tensorflow:Restoring parameters from checkpoints/dev
Out[35]:
"pourquoi est -ce qu'elle va aux états-unis ? <EOS>"

In [36]:
translate('he saw a old yellow truck .', True)


INFO:tensorflow:Restoring parameters from checkpoints/dev
Out[36]:
'il a vu un vieux camion jaune . <EOS>'

In [37]:
translate("california is sometimes nice during winter , and it is snowy in november .")


INFO:tensorflow:Restoring parameters from checkpoints/dev
Input
  Word Ids:      [108, 156, 151, 169, 34, 56, 101, 201, 117, 156, 177, 134, 179, 75]
  English Words: ['california', 'is', 'sometimes', 'nice', 'during', 'winter', ',', 'and', 'it', 'is', 'snowy', 'in', 'november', '.']

Prediction
  Word Ids:      [281, 63, 191, 207, 21, 292, 203, 310, 36, 277, 63, 158, 262, 49, 248, 307, 219, 1]
  French Words: californie est parfois agréable pendant l' hiver , et il est neigeux au mois d' août . <EOS>
Out[37]:
"californie est parfois agréable pendant l' hiver , et il est neigeux au mois d' août . <EOS>"

In [38]:
translate("she saw a old yellow car .")


INFO:tensorflow:Restoring parameters from checkpoints/dev
Input
  Word Ids:      [98, 28, 158, 200, 64, 26, 75]
  English Words: ['she', 'saw', 'a', 'old', 'yellow', 'car', '.']

Prediction
  Word Ids:      [80, 163, 336, 170, 108, 325, 188, 219, 1]
  French Words: elle a vu une vieille voiture jaune . <EOS>
Out[38]:
'elle a vu une vieille voiture jaune . <EOS>'
