Language Translation

In this project, you’re going to take a peek into the realm of neural network machine translation. You’ll train a sequence-to-sequence model on a dataset of English and French sentences so that it can translate new sentences from English to French.

Get the Data

Since a model that translates the whole English language to French would take a long time to train, we have provided you with a small portion of the English corpus.


In [4]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import helper
import problem_unittests as tests

source_path = 'data/small_vocab_en'
target_path = 'data/small_vocab_fr'
source_text = helper.load_data(source_path)
target_text = helper.load_data(target_path)

Explore the Data

Play around with view_sentence_range to view different parts of the data.


In [113]:
view_sentence_range = (0, 10)

"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import numpy as np

print('Dataset Stats')
print('Roughly the number of unique words: {}'.format(len({word: None for word in source_text.split()})))

sentences = source_text.split('\n')
word_counts = [len(sentence.split()) for sentence in sentences]
print('Number of sentences: {}'.format(len(sentences)))
print('Average number of words in a sentence: {}'.format(np.average(word_counts)))

print()
print('English sentences {} to {}:'.format(*view_sentence_range))
print('\n'.join(source_text.split('\n')[view_sentence_range[0]:view_sentence_range[1]]))
print()
print('French sentences {} to {}:'.format(*view_sentence_range))
print('\n'.join(target_text.split('\n')[view_sentence_range[0]:view_sentence_range[1]]))


Dataset Stats
Roughly the number of unique words: 227
Number of sentences: 137861
Average number of words in a sentence: 13.225277634719028

English sentences 0 to 10:
new jersey is sometimes quiet during autumn , and it is snowy in april .
the united states is usually chilly during july , and it is usually freezing in november .
california is usually quiet during march , and it is usually hot in june .
the united states is sometimes mild during june , and it is cold in september .
your least liked fruit is the grape , but my least liked is the apple .
his favorite fruit is the orange , but my favorite is the grape .
paris is relaxing during december , but it is usually chilly in july .
new jersey is busy during spring , and it is never hot in march .
our least liked fruit is the lemon , but my least liked is the grape .
the united states is sometimes busy during january , and it is sometimes warm in november .

French sentences 0 to 10:
new jersey est parfois calme pendant l' automne , et il est neigeux en avril .
les états-unis est généralement froid en juillet , et il gèle habituellement en novembre .
california est généralement calme en mars , et il est généralement chaud en juin .
les états-unis est parfois légère en juin , et il fait froid en septembre .
votre moins aimé fruit est le raisin , mais mon moins aimé est la pomme .
son fruit préféré est l'orange , mais mon préféré est le raisin .
paris est relaxant en décembre , mais il est généralement froid en juillet .
new jersey est occupé au printemps , et il est jamais chaude en mars .
notre fruit est moins aimé le citron , mais mon moins aimé est le raisin .
les états-unis est parfois occupé en janvier , et il est parfois chaud en novembre .

Implement Preprocessing Function

Text to Word Ids

As you did with other RNNs, you must turn the text into numbers so the computer can understand it. In the function text_to_ids(), you'll turn source_text and target_text from words to ids. However, you need to add the <EOS> word id to the end of each sentence in target_text. This will help the neural network predict when the sentence should end.

You can get the <EOS> word id by doing:

target_vocab_to_int['<EOS>']

You can get other word ids using source_vocab_to_int and target_vocab_to_int.
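For concreteness, here's a tiny hand-worked example of the expected return format. The vocabularies and id values below are made up for illustration; the real mappings come from the helper module. Once text_to_ids() is implemented, a call like this should behave roughly as follows:

# Made-up vocabularies, for illustration only
toy_source_vocab = {'new': 4, 'jersey': 5, 'is': 6, 'quiet': 7}
toy_target_vocab = {'<EOS>': 1, 'new': 8, 'jersey': 9, 'est': 10, 'calme': 11}

source_ids, target_ids = text_to_ids('new jersey is quiet', 'new jersey est calme',
                                     toy_source_vocab, toy_target_vocab)
# source_ids == [[4, 5, 6, 7]]          no <EOS> on the source side
# target_ids == [[8, 9, 10, 11, 1]]     <EOS> id appended to each target sentence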


In [114]:
def text_to_ids(source_text, target_text, source_vocab_to_int, target_vocab_to_int):
    """
    Convert source and target text to proper word ids
    :param source_text: String that contains all the source text.
    :param target_text: String that contains all the target text.
    :param source_vocab_to_int: Dictionary to go from the source words to an id
    :param target_vocab_to_int: Dictionary to go from the target words to an id
    :return: A tuple of lists (source_id_text, target_id_text)
    """

    def convert_source_sentence(sentence):
        return [source_vocab_to_int[w] for w in sentence.split(" ") if w!=""]
    
    def convert_target_sentence(sentence):
        return [target_vocab_to_int[w] for w in sentence.split(" ") if w!=""]+[target_vocab_to_int['<EOS>']]
    
    return  [convert_source_sentence(sentence) for sentence in source_text.split("\n")],\
            [convert_target_sentence(sentence) for sentence in target_text.split("\n")]


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_text_to_ids(text_to_ids)


Tests Passed

Preprocess all the data and save it

Running the code cell below will preprocess all the data and save it to file.


In [116]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
helper.preprocess_and_save_data(source_path, target_path, text_to_ids)

Check Point

This is your first checkpoint. If you ever decide to come back to this notebook or have to restart the notebook, you can start from here. The preprocessed data has been saved to disk.


In [1]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import numpy as np
import helper

(source_int_text, target_int_text), (source_vocab_to_int, target_vocab_to_int), _ = helper.load_preprocess()

Check the Version of TensorFlow and Access to GPU

This will check to make sure you have the correct version of TensorFlow and access to a GPU.


In [2]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
from distutils.version import LooseVersion
import warnings
import tensorflow as tf
from tensorflow.python.layers.core import Dense

# Check TensorFlow Version
assert LooseVersion(tf.__version__) >= LooseVersion('1.1'), 'Please use TensorFlow version 1.1 or newer'
print('TensorFlow Version: {}'.format(tf.__version__))

# Check for a GPU
if not tf.test.gpu_device_name():
    warnings.warn('No GPU found. Please use a GPU to train your neural network.')
else:
    print('Default GPU Device: {}'.format(tf.test.gpu_device_name()))


TensorFlow Version: 1.1.0
Default GPU Device: /gpu:0

Build the Neural Network

You'll build the components necessary for a Sequence-to-Sequence model by implementing the following functions:

  • model_inputs
  • process_decoder_input
  • encoding_layer
  • decoding_layer_train
  • decoding_layer_infer
  • decoding_layer
  • seq2seq_model

Input

Implement the model_inputs() function to create TF Placeholders for the Neural Network. It should create the following placeholders:

  • Input text placeholder named "input" using the TF Placeholder name parameter with rank 2.
  • Targets placeholder with rank 2.
  • Learning rate placeholder with rank 0.
  • Keep probability placeholder named "keep_prob" using the TF Placeholder name parameter with rank 0.
  • Target sequence length placeholder named "target_sequence_length" with rank 1
  • Max target sequence length tensor named "max_target_len" getting its value from applying tf.reduce_max on the target_sequence_length placeholder. Rank 0.
  • Source sequence length placeholder named "source_sequence_length" with rank 1

Return the placeholders in the following tuple (input, targets, learning rate, keep probability, target sequence length, max target sequence length, source sequence length).


In [5]:
def model_inputs():
    """
    Create TF Placeholders for input, targets, learning rate, and lengths of source and target sequences.
    :return: Tuple (input, targets, learning rate, keep probability, target sequence length,
    max target sequence length, source sequence length)
    """
    input = tf.placeholder(tf.int32,[None,None],name="input")
    targets = tf.placeholder(tf.int32,[None,None],name="targets")
    learning_rate = tf.placeholder(tf.float32,name="learning_rate")
    keep_probability = tf.placeholder(tf.float32,name="keep_prob")
    target_sequence_length = tf.placeholder(tf.int32,[None],name="target_sequence_length")
    max_target_len = tf.reduce_max(target_sequence_length)
    source_sequence_len = tf.placeholder(tf.int32,[None],name="source_sequence_length")
    
    return input, targets, learning_rate, keep_probability, target_sequence_length, max_target_len, source_sequence_len


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_model_inputs(model_inputs)


Tests Passed

Process Decoder Input

Implement process_decoder_input by removing the last word id from each sequence in target_data and concatenating the GO ID to the beginning of each sequence.
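The intended transformation is easy to see on a toy batch. A minimal sketch, assuming a batch size of 2 and a hypothetical <GO> id of 3 (the same value the unit test below uses); it uses plain tensor slicing rather than the tf.strided_slice call in the graded cell:

import tensorflow as tf

target_data = tf.constant([[10, 20, 30], [40, 18, 23]])   # toy target ids
go_column = tf.fill([2, 1], 3)                            # one <GO> id per sequence
dec_input = tf.concat([go_column, target_data[:, :-1]], 1)

with tf.Session() as sess:
    print(sess.run(dec_input))    # [[ 3 10 20]
                                  #  [ 3 40 18]]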


In [6]:
import inspect
inspect.getsourcelines(tests.test_process_encoding_input)


Out[6]:
(['def test_process_encoding_input(process_encoding_input):\n',
  '    batch_size = 2\n',
  '    seq_length = 3\n',
  "    target_vocab_to_int = {'<GO>': 3}\n",
  '    with tf.Graph().as_default():\n',
  '        target_data = tf.placeholder(tf.int32, [batch_size, seq_length])\n',
  '        dec_input = process_encoding_input(target_data, target_vocab_to_int, batch_size)\n',
  '\n',
  '        assert dec_input.get_shape() == (batch_size, seq_length),\\\n',
  "            'Wrong shape returned.  Found {}'.format(dec_input.get_shape())\n",
  '\n',
  '        test_target_data = [[10, 20, 30], [40, 18, 23]]\n',
  '        with tf.Session() as sess:\n',
  '            test_dec_input = sess.run(dec_input, {target_data: test_target_data})\n',
  '\n',
  "        assert test_dec_input[0][0] == target_vocab_to_int['<GO>'] and\\\n",
  "               test_dec_input[1][0] == target_vocab_to_int['<GO>'],\\\n",
  "            'Missing GO Id.'\n",
  '\n',
  '    _print_success_message()\n'],
 281)

In [7]:
def process_decoder_input(target_data, target_vocab_to_int, batch_size):
    """
    Preprocess target data for decoding
    :param target_data: Target Placehoder
    :param target_vocab_to_int: Dictionary to go from the target words to an id
    :param batch_size: Batch Size
    :return: Preprocessed target data
    """
    
    # Drop the last id of every target sequence and prepend the <GO> id
    go_ids = tf.constant([[target_vocab_to_int['<GO>']]] * batch_size)
    return tf.concat([go_ids, tf.strided_slice(target_data, [0, 0], [batch_size, -1], [1, 1])], 1)


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_process_encoding_input(process_decoder_input)


Tests Passed

Encoding

Implement encoding_layer() to create an encoder RNN layer:


In [8]:
from imp import reload
reload(tests)

def encoding_layer(rnn_inputs, rnn_size, num_layers, keep_prob, 
                   source_sequence_length, source_vocab_size, 
                   encoding_embedding_size):
    """
    Create encoding layer
    :param rnn_inputs: Inputs for the RNN
    :param rnn_size: RNN Size
    :param num_layers: Number of layers
    :param keep_prob: Dropout keep probability
    :param source_sequence_length: a list of the lengths of each sequence in the batch
    :param source_vocab_size: vocabulary size of source data
    :param encoding_embedding_size: embedding size of source data
    :return: tuple (RNN output, RNN state)
    """
    # Embed the source word ids, then run them through a stacked LSTM with dropout
    embed = tf.contrib.layers.embed_sequence(rnn_inputs, source_vocab_size, encoding_embedding_size)
    lstm_stack = tf.contrib.rnn.DropoutWrapper(
        tf.contrib.rnn.MultiRNNCell([tf.contrib.rnn.LSTMCell(rnn_size) for _ in range(num_layers)]),
        output_keep_prob=keep_prob)
    return tf.nn.dynamic_rnn(lstm_stack, embed, source_sequence_length, dtype=tf.float32)
    
"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_encoding_layer(encoding_layer)


Tests Passed
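As an optional sanity check (not part of the graded cells; the sizes below are arbitrary), you can inspect what the encoder returns: the RNN output has shape batch x time x rnn_size, and the state is one LSTMStateTuple per layer.

with tf.Graph().as_default():
    toy_inputs = tf.placeholder(tf.int32, [None, None])
    toy_lengths = tf.placeholder(tf.int32, [None])
    enc_output, enc_state = encoding_layer(toy_inputs, 64, 2, 0.8, toy_lengths, 100, 16)
    print(enc_output.get_shape())    # (?, ?, 64)
    print(len(enc_state))            # 2 -- one LSTMStateTuple per layer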

Decoding - Training

Create a training decoding layer:


In [9]:
def decoding_layer_train(encoder_state, dec_cell, dec_embed_input, 
                         target_sequence_length, max_summary_length, 
                         output_layer, keep_prob):
    """
    Create a decoding layer for training
    :param encoder_state: Encoder State
    :param dec_cell: Decoder RNN Cell
    :param dec_embed_input: Decoder embedded input
    :param target_sequence_length: The lengths of each sequence in the target batch
    :param max_summary_length: The length of the longest sequence in the batch
    :param output_layer: Function to apply the output layer
    :param keep_prob: Dropout keep probability
    :return: BasicDecoderOutput containing training logits and sample_id
    """
    
    # Feed the ground-truth target embeddings to the decoder at each time step
    training_helper = tf.contrib.seq2seq.TrainingHelper(dec_embed_input,
                                                        sequence_length=target_sequence_length)

    # Apply dropout to the decoder cell during training only
    decoder = tf.contrib.seq2seq.BasicDecoder(
        tf.contrib.rnn.DropoutWrapper(dec_cell, output_keep_prob=keep_prob),
        training_helper, encoder_state, output_layer)

    final_outputs, final_state = tf.contrib.seq2seq.dynamic_decode(decoder,
                                                                   maximum_iterations=max_summary_length)
    return final_outputs


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_decoding_layer_train(decoding_layer_train)


Tests Passed

Decoding - Inference

Create an inference decoder:


In [11]:
def decoding_layer_infer(encoder_state, dec_cell, dec_embeddings, start_of_sequence_id,
                         end_of_sequence_id, max_target_sequence_length,
                         vocab_size, output_layer, batch_size, keep_prob):
    """
    Create a decoding layer for inference
    :param encoder_state: Encoder state
    :param dec_cell: Decoder RNN Cell
    :param dec_embeddings: Decoder embeddings
    :param start_of_sequence_id: GO ID
    :param end_of_sequence_id: EOS Id
    :param max_target_sequence_length: Maximum length of target sequences
    :param vocab_size: Size of decoder/target vocabulary
    :param output_layer: Function to apply the output layer
    :param batch_size: Batch size
    :param keep_prob: Dropout keep probability
    :return: BasicDecoderOutput containing inference logits and sample_id
    """
    # At inference time, greedily feed back the embedding of the previous prediction
    start_tokens = tf.tile(tf.constant([start_of_sequence_id], dtype=tf.int32), [batch_size], name='start_tokens')
    embed_helper = tf.contrib.seq2seq.GreedyEmbeddingHelper(dec_embeddings, start_tokens, end_of_sequence_id)

    decoder = tf.contrib.seq2seq.BasicDecoder(dec_cell, embed_helper, encoder_state, output_layer)

    final_outputs, final_state = tf.contrib.seq2seq.dynamic_decode(decoder,
                                                                   maximum_iterations=max_target_sequence_length,
                                                                   impute_finished=True)
    return final_outputs



"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_decoding_layer_infer(decoding_layer_infer)


Tests Passed

Build the Decoding Layer

Implement decoding_layer() to create a Decoder RNN layer.

  • Embed the target sequences
  • Construct the decoder LSTM cell (just like you constructed the encoder cell above)
  • Create an output layer to map the outputs of the decoder to the elements of our vocabulary
  • Use your decoding_layer_train(encoder_state, dec_cell, dec_embed_input, target_sequence_length, max_target_sequence_length, output_layer, keep_prob) function to get the training logits.
  • Use your decoding_layer_infer(encoder_state, dec_cell, dec_embeddings, start_of_sequence_id, end_of_sequence_id, max_target_sequence_length, vocab_size, output_layer, batch_size, keep_prob) function to get the inference logits.

Note: You'll need to use tf.variable_scope to share variables between training and inference.
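The sharing mechanism itself is easy to see in isolation. A minimal, stand-alone sketch of the idea (the variable name "w" is made up; only the scope/reuse pattern matters here):

import tensorflow as tf

with tf.Graph().as_default():
    with tf.variable_scope("decode") as decoding_scope:
        w_train = tf.get_variable("w", shape=[2, 2])    # created by the training branch
        decoding_scope.reuse_variables()                # switch the scope to reuse mode
        w_infer = tf.get_variable("w", shape=[2, 2])    # fetches the same variable
    print(w_train is w_infer)                           # True: one set of decoder weights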


In [12]:
def decoding_layer(dec_input, encoder_state,
                   target_sequence_length, max_target_sequence_length,
                   rnn_size,
                   num_layers, target_vocab_to_int, target_vocab_size,
                   batch_size, keep_prob, decoding_embedding_size):
    """
    Create decoding layer
    :param dec_input: Decoder input
    :param encoder_state: Encoder state
    :param target_sequence_length: The lengths of each sequence in the target batch
    :param max_target_sequence_length: Maximum length of target sequences
    :param rnn_size: RNN Size
    :param num_layers: Number of layers
    :param target_vocab_to_int: Dictionary to go from the target words to an id
    :param target_vocab_size: Size of target vocabulary
    :param batch_size: The size of the batch
    :param keep_prob: Dropout keep probability
    :return: Tuple of (Training BasicDecoderOutput, Inference BasicDecoderOutput)
    """

    # Embed the target word ids
    dec_embeddings = tf.Variable(tf.random_uniform([target_vocab_size, decoding_embedding_size]))
    dec_embed_input = tf.nn.embedding_lookup(dec_embeddings, dec_input)

    # Stacked LSTM decoder cell and the projection onto the target vocabulary
    lstm_stack = tf.contrib.rnn.MultiRNNCell([tf.contrib.rnn.LSTMCell(rnn_size) for _ in range(num_layers)])
    output_layer = Dense(target_vocab_size,
                         kernel_initializer=tf.truncated_normal_initializer(mean=0.0, stddev=0.1))

    # Share decoder variables between the training and inference decoders
    with tf.variable_scope("decode") as scope:
        train_output = decoding_layer_train(encoder_state, lstm_stack, dec_embed_input, target_sequence_length,
                                            max_target_sequence_length, output_layer, keep_prob)
        scope.reuse_variables()
        infer_output = decoding_layer_infer(encoder_state, lstm_stack, dec_embeddings, target_vocab_to_int['<GO>'],
                                            target_vocab_to_int['<EOS>'], max_target_sequence_length,
                                            target_vocab_size, output_layer, batch_size, keep_prob)
    
    return train_output, infer_output



"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_decoding_layer(decoding_layer)


Tests Passed

Build the Neural Network

Apply the functions you implemented above to:

  • Apply embedding to the input data for the encoder.
  • Encode the input using your encoding_layer(rnn_inputs, rnn_size, num_layers, keep_prob, source_sequence_length, source_vocab_size, encoding_embedding_size).
  • Process target data using your process_decoder_input(target_data, target_vocab_to_int, batch_size) function.
  • Apply embedding to the target data for the decoder.
  • Decode the encoded input using your decoding_layer(dec_input, enc_state, target_sequence_length, max_target_sentence_length, rnn_size, num_layers, target_vocab_to_int, target_vocab_size, batch_size, keep_prob, dec_embedding_size) function.

In [13]:
def seq2seq_model(input_data, target_data, keep_prob, batch_size,
                  source_sequence_length, target_sequence_length,
                  max_target_sentence_length,
                  source_vocab_size, target_vocab_size,
                  enc_embedding_size, dec_embedding_size,
                  rnn_size, num_layers, target_vocab_to_int):
    """
    Build the Sequence-to-Sequence part of the neural network
    :param input_data: Input placeholder
    :param target_data: Target placeholder
    :param keep_prob: Dropout keep probability placeholder
    :param batch_size: Batch Size
    :param source_sequence_length: Sequence Lengths of source sequences in the batch
    :param target_sequence_length: Sequence Lengths of target sequences in the batch
    :param source_vocab_size: Source vocabulary size
    :param target_vocab_size: Target vocabulary size
    :param enc_embedding_size: Encoder embedding size
    :param dec_embedding_size: Decoder embedding size
    :param rnn_size: RNN Size
    :param num_layers: Number of layers
    :param target_vocab_to_int: Dictionary to go from the target words to an id
    :return: Tuple of (Training BasicDecoderOutput, Inference BasicDecoderOutput)
    """
    _, encoding_state = encoding_layer(input_data, rnn_size, num_layers, keep_prob, 
                   source_sequence_length, source_vocab_size, 
                   enc_embedding_size)
        
    decoder_input = process_decoder_input(target_data, target_vocab_to_int, batch_size)
    
    return decoding_layer(decoder_input, encoding_state, target_sequence_length, max_target_sentence_length,
                          rnn_size, num_layers, target_vocab_to_int, target_vocab_size, batch_size,
                          keep_prob, dec_embedding_size)
    


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_seq2seq_model(seq2seq_model)


Tests Passed

Neural Network Training

Hyperparameters

Tune the following parameters:

  • Set epochs to the number of epochs.
  • Set batch_size to the batch size.
  • Set rnn_size to the size of the RNNs.
  • Set num_layers to the number of layers.
  • Set encoding_embedding_size to the size of the embedding for the encoder.
  • Set decoding_embedding_size to the size of the embedding for the decoder.
  • Set learning_rate to the learning rate.
  • Set keep_probability to the Dropout keep probability.
  • Set display_step to the number of batches between each debug output statement.

In [14]:
# Number of Epochs
epochs = 50
# Batch Size
batch_size = 512
# RNN Size
rnn_size = 256
# Number of Layers
num_layers = 2
# Embedding Size
encoding_embedding_size = 50
decoding_embedding_size = 50
# Learning Rate
learning_rate = 0.003
# Dropout Keep Probability
keep_probability = 0.5
display_step = 10

Build the Graph

Build the graph using the neural network you implemented.


In [15]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
save_path = 'checkpoints/dev'
(source_int_text, target_int_text), (source_vocab_to_int, target_vocab_to_int), _ = helper.load_preprocess()
max_target_sentence_length = max([len(sentence) for sentence in source_int_text])

train_graph = tf.Graph()
with train_graph.as_default():
    input_data, targets, lr, keep_prob, target_sequence_length, max_target_sequence_length, source_sequence_length = model_inputs()
    
    #sequence_length = tf.placeholder_with_default(max_target_sentence_length, None, name='sequence_length')
    input_shape = tf.shape(input_data)

    train_logits, inference_logits = seq2seq_model(tf.reverse(input_data, [-1]),
                                                   targets,
                                                   keep_prob,
                                                   batch_size,
                                                   source_sequence_length,
                                                   target_sequence_length,
                                                   max_target_sequence_length,
                                                   len(source_vocab_to_int),
                                                   len(target_vocab_to_int),
                                                   encoding_embedding_size,
                                                   decoding_embedding_size,
                                                   rnn_size,
                                                   num_layers,
                                                   target_vocab_to_int)


    training_logits = tf.identity(train_logits.rnn_output, name='logits')
    inference_logits = tf.identity(inference_logits.sample_id, name='predictions')

    masks = tf.sequence_mask(target_sequence_length, max_target_sequence_length, dtype=tf.float32, name='masks')

    with tf.name_scope("optimization"):
        # Loss function
        cost = tf.contrib.seq2seq.sequence_loss(
            training_logits,
            targets,
            masks)

        # Optimizer
        optimizer = tf.train.AdamOptimizer(lr)

        # Gradient Clipping
        gradients = optimizer.compute_gradients(cost)
        capped_gradients = [(tf.clip_by_value(grad, -1., 1.), var) for grad, var in gradients if grad is not None]
        train_op = optimizer.apply_gradients(capped_gradients)

Batch and pad the source and target sequences


In [16]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
def pad_sentence_batch(sentence_batch, pad_int):
    """Pad sentences with <PAD> so that each sentence of a batch has the same length"""
    max_sentence = max([len(sentence) for sentence in sentence_batch])
    return [sentence + [pad_int] * (max_sentence - len(sentence)) for sentence in sentence_batch]


def get_batches(sources, targets, batch_size, source_pad_int, target_pad_int):
    """Batch targets, sources, and the lengths of their sentences together"""
    for batch_i in range(0, len(sources)//batch_size):
        start_i = batch_i * batch_size

        # Slice the right amount for the batch
        sources_batch = sources[start_i:start_i + batch_size]
        targets_batch = targets[start_i:start_i + batch_size]

        # Pad
        pad_sources_batch = np.array(pad_sentence_batch(sources_batch, source_pad_int))
        pad_targets_batch = np.array(pad_sentence_batch(targets_batch, target_pad_int))

        # Need the lengths for the _lengths parameters
        pad_targets_lengths = []
        for target in pad_targets_batch:
            pad_targets_lengths.append(len(target))

        pad_source_lengths = []
        for source in pad_sources_batch:
            pad_source_lengths.append(len(source))

        yield pad_sources_batch, pad_targets_batch, pad_source_lengths, pad_targets_lengths
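
As a quick illustration of what pad_sentence_batch does (a toy batch, run in a cell of your own), every sentence is padded to the length of the longest one in the batch:

print(pad_sentence_batch([[1, 2, 3], [4, 5], [6]], 0))
# [[1, 2, 3], [4, 5, 0], [6, 0, 0]]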

Train

Train the neural network on the preprocessed data. If you have a hard time getting a good loss, check the forums to see if anyone is having the same problem.


In [17]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
def get_accuracy(target, logits):
    """
    Calculate accuracy
    """
    max_seq = max(target.shape[1], logits.shape[1])
    if max_seq - target.shape[1]:
        target = np.pad(
            target,
            [(0,0),(0,max_seq - target.shape[1])],
            'constant')
    if max_seq - logits.shape[1]:
        logits = np.pad(
            logits,
            [(0,0),(0,max_seq - logits.shape[1])],
            'constant')

    return np.mean(np.equal(target, logits))

# Split data to training and validation sets
train_source = source_int_text[batch_size:]
train_target = target_int_text[batch_size:]
valid_source = source_int_text[:batch_size]
valid_target = target_int_text[:batch_size]
(valid_sources_batch, valid_targets_batch, valid_sources_lengths, valid_targets_lengths ) = next(get_batches(valid_source,
                                                                                                             valid_target,
                                                                                                             batch_size,
                                                                                                             source_vocab_to_int['<PAD>'],
                                                                                                             target_vocab_to_int['<PAD>']))                                                                                                  
with tf.Session(graph=train_graph) as sess:
    sess.run(tf.global_variables_initializer())

    for epoch_i in range(epochs):
        for batch_i, (source_batch, target_batch, sources_lengths, targets_lengths) in enumerate(
                get_batches(train_source, train_target, batch_size,
                            source_vocab_to_int['<PAD>'],
                            target_vocab_to_int['<PAD>'])):

            _, loss = sess.run(
                [train_op, cost],
                {input_data: source_batch,
                 targets: target_batch,
                 lr: learning_rate,
                 target_sequence_length: targets_lengths,
                 source_sequence_length: sources_lengths,
                 keep_prob: keep_probability})


            if batch_i % display_step == 0 and batch_i > 0:


                batch_train_logits = sess.run(
                    inference_logits,
                    {input_data: source_batch,
                     source_sequence_length: sources_lengths,
                     target_sequence_length: targets_lengths,
                     keep_prob: 1.0})


                batch_valid_logits = sess.run(
                    inference_logits,
                    {input_data: valid_sources_batch,
                     source_sequence_length: valid_sources_lengths,
                     target_sequence_length: valid_targets_lengths,
                     keep_prob: 1.0})

                train_acc = get_accuracy(target_batch, batch_train_logits)

                valid_acc = get_accuracy(valid_targets_batch, batch_valid_logits)

                print('Epoch {:>3} Batch {:>4}/{} - Train Accuracy: {:>6.4f}, Validation Accuracy: {:>6.4f}, Loss: {:>6.4f}'
                      .format(epoch_i, batch_i, len(source_int_text) // batch_size, train_acc, valid_acc, loss))

    # Save Model
    saver = tf.train.Saver()
    saver.save(sess, save_path)
    print('Model Trained and Saved')


Epoch   0 Batch   10/269 - Train Accuracy: 0.3075, Validation Accuracy: 0.3738, Loss: 3.5211
Epoch   0 Batch   20/269 - Train Accuracy: 0.3609, Validation Accuracy: 0.4182, Loss: 3.0011
Epoch   0 Batch   30/269 - Train Accuracy: 0.4246, Validation Accuracy: 0.4520, Loss: 2.5886
Epoch   0 Batch   40/269 - Train Accuracy: 0.4345, Validation Accuracy: 0.4846, Loss: 2.4435
Epoch   0 Batch   50/269 - Train Accuracy: 0.4101, Validation Accuracy: 0.4617, Loss: 2.3317
Epoch   0 Batch   60/269 - Train Accuracy: 0.4447, Validation Accuracy: 0.4637, Loss: 1.9825
Epoch   0 Batch   70/269 - Train Accuracy: 0.4944, Validation Accuracy: 0.5081, Loss: 1.9130
Epoch   0 Batch   80/269 - Train Accuracy: 0.4985, Validation Accuracy: 0.5081, Loss: 1.6801
Epoch   0 Batch   90/269 - Train Accuracy: 0.4392, Validation Accuracy: 0.5062, Loss: 1.6568
Epoch   0 Batch  100/269 - Train Accuracy: 0.4825, Validation Accuracy: 0.4848, Loss: 1.3701
Epoch   0 Batch  110/269 - Train Accuracy: 0.4853, Validation Accuracy: 0.5193, Loss: 1.2728
Epoch   0 Batch  120/269 - Train Accuracy: 0.4784, Validation Accuracy: 0.5265, Loss: 1.1997
Epoch   0 Batch  130/269 - Train Accuracy: 0.4629, Validation Accuracy: 0.5158, Loss: 1.1229
Epoch   0 Batch  140/269 - Train Accuracy: 0.5071, Validation Accuracy: 0.5007, Loss: 0.9989
Epoch   0 Batch  150/269 - Train Accuracy: 0.5157, Validation Accuracy: 0.5345, Loss: 0.9393
Epoch   0 Batch  160/269 - Train Accuracy: 0.5313, Validation Accuracy: 0.5389, Loss: 0.8898
Epoch   0 Batch  170/269 - Train Accuracy: 0.5411, Validation Accuracy: 0.5464, Loss: 0.8391
Epoch   0 Batch  180/269 - Train Accuracy: 0.5537, Validation Accuracy: 0.5553, Loss: 0.8223
Epoch   0 Batch  190/269 - Train Accuracy: 0.5555, Validation Accuracy: 0.5672, Loss: 0.7778
Epoch   0 Batch  200/269 - Train Accuracy: 0.5604, Validation Accuracy: 0.5826, Loss: 0.7740
Epoch   0 Batch  210/269 - Train Accuracy: 0.5889, Validation Accuracy: 0.5812, Loss: 0.7140
Epoch   0 Batch  220/269 - Train Accuracy: 0.5958, Validation Accuracy: 0.5857, Loss: 0.6731
Epoch   0 Batch  230/269 - Train Accuracy: 0.5566, Validation Accuracy: 0.5717, Loss: 0.6897
Epoch   0 Batch  240/269 - Train Accuracy: 0.6043, Validation Accuracy: 0.5807, Loss: 0.6098
Epoch   0 Batch  250/269 - Train Accuracy: 0.5736, Validation Accuracy: 0.5872, Loss: 0.6625
Epoch   0 Batch  260/269 - Train Accuracy: 0.5786, Validation Accuracy: 0.5839, Loss: 0.6695
Epoch   1 Batch   10/269 - Train Accuracy: 0.5848, Validation Accuracy: 0.5980, Loss: 0.6428
Epoch   1 Batch   20/269 - Train Accuracy: 0.6048, Validation Accuracy: 0.6105, Loss: 0.6273
Epoch   1 Batch   30/269 - Train Accuracy: 0.6137, Validation Accuracy: 0.6158, Loss: 0.5978
Epoch   1 Batch   40/269 - Train Accuracy: 0.5998, Validation Accuracy: 0.6175, Loss: 0.6190
Epoch   1 Batch   50/269 - Train Accuracy: 0.6143, Validation Accuracy: 0.6262, Loss: 0.5994
Epoch   1 Batch   60/269 - Train Accuracy: 0.6340, Validation Accuracy: 0.6378, Loss: 0.5640
Epoch   1 Batch   70/269 - Train Accuracy: 0.6423, Validation Accuracy: 0.6338, Loss: 0.5556
Epoch   1 Batch   80/269 - Train Accuracy: 0.6505, Validation Accuracy: 0.6412, Loss: 0.5434
Epoch   1 Batch   90/269 - Train Accuracy: 0.6048, Validation Accuracy: 0.6409, Loss: 0.5663
Epoch   1 Batch  100/269 - Train Accuracy: 0.6589, Validation Accuracy: 0.6444, Loss: 0.5202
Epoch   1 Batch  110/269 - Train Accuracy: 0.6416, Validation Accuracy: 0.6514, Loss: 0.5157
Epoch   1 Batch  120/269 - Train Accuracy: 0.6549, Validation Accuracy: 0.6640, Loss: 0.5333
Epoch   1 Batch  130/269 - Train Accuracy: 0.6424, Validation Accuracy: 0.6566, Loss: 0.5281
Epoch   1 Batch  140/269 - Train Accuracy: 0.6661, Validation Accuracy: 0.6553, Loss: 0.5718
Epoch   1 Batch  150/269 - Train Accuracy: 0.6669, Validation Accuracy: 0.6650, Loss: 0.5021
Epoch   1 Batch  160/269 - Train Accuracy: 0.6614, Validation Accuracy: 0.6652, Loss: 0.4809
Epoch   1 Batch  170/269 - Train Accuracy: 0.6726, Validation Accuracy: 0.6793, Loss: 0.4675
Epoch   1 Batch  180/269 - Train Accuracy: 0.6718, Validation Accuracy: 0.6673, Loss: 0.4632
Epoch   1 Batch  190/269 - Train Accuracy: 0.6882, Validation Accuracy: 0.6887, Loss: 0.4446
Epoch   1 Batch  200/269 - Train Accuracy: 0.6779, Validation Accuracy: 0.6832, Loss: 0.4598
Epoch   1 Batch  210/269 - Train Accuracy: 0.6897, Validation Accuracy: 0.6940, Loss: 0.4319
Epoch   1 Batch  220/269 - Train Accuracy: 0.6950, Validation Accuracy: 0.6992, Loss: 0.4003
Epoch   1 Batch  230/269 - Train Accuracy: 0.6996, Validation Accuracy: 0.7103, Loss: 0.4105
Epoch   1 Batch  240/269 - Train Accuracy: 0.7219, Validation Accuracy: 0.7066, Loss: 0.3611
Epoch   1 Batch  250/269 - Train Accuracy: 0.7181, Validation Accuracy: 0.7214, Loss: 0.3975
Epoch   1 Batch  260/269 - Train Accuracy: 0.7015, Validation Accuracy: 0.7257, Loss: 0.4100
Epoch   2 Batch   10/269 - Train Accuracy: 0.7076, Validation Accuracy: 0.7245, Loss: 0.3889
Epoch   2 Batch   20/269 - Train Accuracy: 0.7115, Validation Accuracy: 0.7273, Loss: 0.3804
Epoch   2 Batch   30/269 - Train Accuracy: 0.7201, Validation Accuracy: 0.7345, Loss: 0.3509
Epoch   2 Batch   40/269 - Train Accuracy: 0.7203, Validation Accuracy: 0.7383, Loss: 0.3662
Epoch   2 Batch   50/269 - Train Accuracy: 0.7269, Validation Accuracy: 0.7372, Loss: 0.3497
Epoch   2 Batch   60/269 - Train Accuracy: 0.7260, Validation Accuracy: 0.7424, Loss: 0.3222
Epoch   2 Batch   70/269 - Train Accuracy: 0.7613, Validation Accuracy: 0.7456, Loss: 0.3149
Epoch   2 Batch   80/269 - Train Accuracy: 0.7554, Validation Accuracy: 0.7452, Loss: 0.3050
Epoch   2 Batch   90/269 - Train Accuracy: 0.7093, Validation Accuracy: 0.7352, Loss: 0.3198
Epoch   2 Batch  100/269 - Train Accuracy: 0.7808, Validation Accuracy: 0.7618, Loss: 0.2867
Epoch   2 Batch  110/269 - Train Accuracy: 0.7579, Validation Accuracy: 0.7741, Loss: 0.2710
Epoch   2 Batch  120/269 - Train Accuracy: 0.7699, Validation Accuracy: 0.7638, Loss: 0.2844
Epoch   2 Batch  130/269 - Train Accuracy: 0.7766, Validation Accuracy: 0.7813, Loss: 0.2614
Epoch   2 Batch  140/269 - Train Accuracy: 0.7795, Validation Accuracy: 0.7834, Loss: 0.2538
Epoch   2 Batch  150/269 - Train Accuracy: 0.7848, Validation Accuracy: 0.7854, Loss: 0.2433
Epoch   2 Batch  160/269 - Train Accuracy: 0.8093, Validation Accuracy: 0.7926, Loss: 0.2276
Epoch   2 Batch  170/269 - Train Accuracy: 0.8096, Validation Accuracy: 0.7963, Loss: 0.2310
Epoch   2 Batch  180/269 - Train Accuracy: 0.8149, Validation Accuracy: 0.8072, Loss: 0.2093
Epoch   2 Batch  190/269 - Train Accuracy: 0.7854, Validation Accuracy: 0.7838, Loss: 0.2171
Epoch   2 Batch  200/269 - Train Accuracy: 0.8139, Validation Accuracy: 0.8101, Loss: 0.2071
Epoch   2 Batch  210/269 - Train Accuracy: 0.8280, Validation Accuracy: 0.8104, Loss: 0.1914
Epoch   2 Batch  220/269 - Train Accuracy: 0.8145, Validation Accuracy: 0.8182, Loss: 0.1868
Epoch   2 Batch  230/269 - Train Accuracy: 0.8232, Validation Accuracy: 0.8251, Loss: 0.1840
Epoch   2 Batch  240/269 - Train Accuracy: 0.8521, Validation Accuracy: 0.8234, Loss: 0.1576
Epoch   2 Batch  250/269 - Train Accuracy: 0.8579, Validation Accuracy: 0.8479, Loss: 0.1677
Epoch   2 Batch  260/269 - Train Accuracy: 0.8610, Validation Accuracy: 0.8640, Loss: 0.1766
Epoch   3 Batch   10/269 - Train Accuracy: 0.8602, Validation Accuracy: 0.8668, Loss: 0.1620
Epoch   3 Batch   20/269 - Train Accuracy: 0.8781, Validation Accuracy: 0.8770, Loss: 0.1460
Epoch   3 Batch   30/269 - Train Accuracy: 0.8943, Validation Accuracy: 0.8808, Loss: 0.1370
Epoch   3 Batch   40/269 - Train Accuracy: 0.8717, Validation Accuracy: 0.8873, Loss: 0.1444
Epoch   3 Batch   50/269 - Train Accuracy: 0.8557, Validation Accuracy: 0.8719, Loss: 0.1439
Epoch   3 Batch   60/269 - Train Accuracy: 0.8678, Validation Accuracy: 0.8596, Loss: 0.1303
Epoch   3 Batch   70/269 - Train Accuracy: 0.9022, Validation Accuracy: 0.8828, Loss: 0.1271
Epoch   3 Batch   80/269 - Train Accuracy: 0.8969, Validation Accuracy: 0.8857, Loss: 0.1228
Epoch   3 Batch   90/269 - Train Accuracy: 0.8917, Validation Accuracy: 0.8973, Loss: 0.1338
Epoch   3 Batch  100/269 - Train Accuracy: 0.9041, Validation Accuracy: 0.8897, Loss: 0.1200
Epoch   3 Batch  110/269 - Train Accuracy: 0.8902, Validation Accuracy: 0.8972, Loss: 0.1031
Epoch   3 Batch  120/269 - Train Accuracy: 0.9168, Validation Accuracy: 0.9114, Loss: 0.1076
Epoch   3 Batch  130/269 - Train Accuracy: 0.9109, Validation Accuracy: 0.9051, Loss: 0.0965
Epoch   3 Batch  140/269 - Train Accuracy: 0.9188, Validation Accuracy: 0.9138, Loss: 0.0962
Epoch   3 Batch  150/269 - Train Accuracy: 0.9203, Validation Accuracy: 0.9157, Loss: 0.0857
Epoch   3 Batch  160/269 - Train Accuracy: 0.9101, Validation Accuracy: 0.9143, Loss: 0.0775
Epoch   3 Batch  170/269 - Train Accuracy: 0.9138, Validation Accuracy: 0.9256, Loss: 0.0762
Epoch   3 Batch  180/269 - Train Accuracy: 0.9337, Validation Accuracy: 0.9191, Loss: 0.0759
Epoch   3 Batch  190/269 - Train Accuracy: 0.9122, Validation Accuracy: 0.9227, Loss: 0.0839
Epoch   3 Batch  200/269 - Train Accuracy: 0.9146, Validation Accuracy: 0.9231, Loss: 0.0861
Epoch   3 Batch  210/269 - Train Accuracy: 0.9230, Validation Accuracy: 0.9196, Loss: 0.0787
Epoch   3 Batch  220/269 - Train Accuracy: 0.9215, Validation Accuracy: 0.9166, Loss: 0.0740
Epoch   3 Batch  230/269 - Train Accuracy: 0.9293, Validation Accuracy: 0.9296, Loss: 0.0694
Epoch   3 Batch  240/269 - Train Accuracy: 0.9262, Validation Accuracy: 0.9206, Loss: 0.0656
Epoch   3 Batch  250/269 - Train Accuracy: 0.9360, Validation Accuracy: 0.9240, Loss: 0.0643
Epoch   3 Batch  260/269 - Train Accuracy: 0.9265, Validation Accuracy: 0.9347, Loss: 0.0755
Epoch   4 Batch   10/269 - Train Accuracy: 0.9369, Validation Accuracy: 0.9363, Loss: 0.0620
Epoch   4 Batch   20/269 - Train Accuracy: 0.9270, Validation Accuracy: 0.9285, Loss: 0.0802
Epoch   4 Batch   30/269 - Train Accuracy: 0.9244, Validation Accuracy: 0.9265, Loss: 0.0963
Epoch   4 Batch   40/269 - Train Accuracy: 0.9196, Validation Accuracy: 0.9252, Loss: 0.0849
Epoch   4 Batch   50/269 - Train Accuracy: 0.9159, Validation Accuracy: 0.9271, Loss: 0.0775
Epoch   4 Batch   60/269 - Train Accuracy: 0.9277, Validation Accuracy: 0.9371, Loss: 0.0804
Epoch   4 Batch   70/269 - Train Accuracy: 0.9330, Validation Accuracy: 0.9298, Loss: 0.0905
Epoch   4 Batch   80/269 - Train Accuracy: 0.9287, Validation Accuracy: 0.9428, Loss: 0.0664
Epoch   4 Batch   90/269 - Train Accuracy: 0.9316, Validation Accuracy: 0.9395, Loss: 0.0626
Epoch   4 Batch  100/269 - Train Accuracy: 0.9341, Validation Accuracy: 0.9295, Loss: 0.0586
Epoch   4 Batch  110/269 - Train Accuracy: 0.9343, Validation Accuracy: 0.9394, Loss: 0.0516
Epoch   4 Batch  120/269 - Train Accuracy: 0.9424, Validation Accuracy: 0.9371, Loss: 0.0618
Epoch   4 Batch  130/269 - Train Accuracy: 0.9400, Validation Accuracy: 0.9423, Loss: 0.0559
Epoch   4 Batch  140/269 - Train Accuracy: 0.9396, Validation Accuracy: 0.9482, Loss: 0.0589
Epoch   4 Batch  150/269 - Train Accuracy: 0.9467, Validation Accuracy: 0.9476, Loss: 0.0540
Epoch   4 Batch  160/269 - Train Accuracy: 0.9392, Validation Accuracy: 0.9498, Loss: 0.0478
Epoch   4 Batch  170/269 - Train Accuracy: 0.9425, Validation Accuracy: 0.9496, Loss: 0.0444
Epoch   4 Batch  180/269 - Train Accuracy: 0.9563, Validation Accuracy: 0.9477, Loss: 0.0456
Epoch   4 Batch  190/269 - Train Accuracy: 0.9322, Validation Accuracy: 0.9405, Loss: 0.0520
Epoch   4 Batch  200/269 - Train Accuracy: 0.9393, Validation Accuracy: 0.9448, Loss: 0.0461
Epoch   4 Batch  210/269 - Train Accuracy: 0.9395, Validation Accuracy: 0.9405, Loss: 0.0473
Epoch   4 Batch  220/269 - Train Accuracy: 0.9434, Validation Accuracy: 0.9563, Loss: 0.0455
Epoch   4 Batch  230/269 - Train Accuracy: 0.9527, Validation Accuracy: 0.9598, Loss: 0.0457
Epoch   4 Batch  240/269 - Train Accuracy: 0.9456, Validation Accuracy: 0.9513, Loss: 0.0419
Epoch   4 Batch  250/269 - Train Accuracy: 0.9519, Validation Accuracy: 0.9440, Loss: 0.0429
Epoch   4 Batch  260/269 - Train Accuracy: 0.9459, Validation Accuracy: 0.9446, Loss: 0.0567
Epoch   5 Batch   10/269 - Train Accuracy: 0.9565, Validation Accuracy: 0.9563, Loss: 0.0396
Epoch   5 Batch   20/269 - Train Accuracy: 0.9460, Validation Accuracy: 0.9523, Loss: 0.0444
Epoch   5 Batch   30/269 - Train Accuracy: 0.9456, Validation Accuracy: 0.9458, Loss: 0.0449
Epoch   5 Batch   40/269 - Train Accuracy: 0.9525, Validation Accuracy: 0.9423, Loss: 0.0476
Epoch   5 Batch   50/269 - Train Accuracy: 0.9333, Validation Accuracy: 0.9569, Loss: 0.0484
Epoch   5 Batch   60/269 - Train Accuracy: 0.9482, Validation Accuracy: 0.9515, Loss: 0.0399
Epoch   5 Batch   70/269 - Train Accuracy: 0.9555, Validation Accuracy: 0.9436, Loss: 0.0451
Epoch   5 Batch   80/269 - Train Accuracy: 0.9551, Validation Accuracy: 0.9450, Loss: 0.0423
Epoch   5 Batch   90/269 - Train Accuracy: 0.9553, Validation Accuracy: 0.9545, Loss: 0.0419
Epoch   5 Batch  100/269 - Train Accuracy: 0.9572, Validation Accuracy: 0.9525, Loss: 0.0426
Epoch   5 Batch  110/269 - Train Accuracy: 0.9423, Validation Accuracy: 0.9483, Loss: 0.0398
Epoch   5 Batch  120/269 - Train Accuracy: 0.9473, Validation Accuracy: 0.9454, Loss: 0.0402
Epoch   5 Batch  130/269 - Train Accuracy: 0.9539, Validation Accuracy: 0.9517, Loss: 0.0413
Epoch   5 Batch  140/269 - Train Accuracy: 0.9467, Validation Accuracy: 0.9579, Loss: 0.0468
Epoch   5 Batch  150/269 - Train Accuracy: 0.9586, Validation Accuracy: 0.9569, Loss: 0.0427
Epoch   5 Batch  160/269 - Train Accuracy: 0.9534, Validation Accuracy: 0.9557, Loss: 0.0370
Epoch   5 Batch  170/269 - Train Accuracy: 0.9566, Validation Accuracy: 0.9495, Loss: 0.0367
Epoch   5 Batch  180/269 - Train Accuracy: 0.9598, Validation Accuracy: 0.9532, Loss: 0.0344
Epoch   5 Batch  190/269 - Train Accuracy: 0.9455, Validation Accuracy: 0.9537, Loss: 0.0422
Epoch   5 Batch  200/269 - Train Accuracy: 0.9536, Validation Accuracy: 0.9498, Loss: 0.0347
Epoch   5 Batch  210/269 - Train Accuracy: 0.9406, Validation Accuracy: 0.9459, Loss: 0.0414
Epoch   5 Batch  220/269 - Train Accuracy: 0.9435, Validation Accuracy: 0.9534, Loss: 0.0428
Epoch   5 Batch  230/269 - Train Accuracy: 0.9567, Validation Accuracy: 0.9541, Loss: 0.0382
Epoch   5 Batch  240/269 - Train Accuracy: 0.9523, Validation Accuracy: 0.9467, Loss: 0.0344
Epoch   5 Batch  250/269 - Train Accuracy: 0.9559, Validation Accuracy: 0.9556, Loss: 0.0398
Epoch   5 Batch  260/269 - Train Accuracy: 0.9516, Validation Accuracy: 0.9573, Loss: 0.0441
Epoch   6 Batch   10/269 - Train Accuracy: 0.9659, Validation Accuracy: 0.9609, Loss: 0.0341
Epoch   6 Batch   20/269 - Train Accuracy: 0.9589, Validation Accuracy: 0.9613, Loss: 0.0365
Epoch   6 Batch   30/269 - Train Accuracy: 0.9539, Validation Accuracy: 0.9555, Loss: 0.0376
Epoch   6 Batch   40/269 - Train Accuracy: 0.9646, Validation Accuracy: 0.9584, Loss: 0.0358
Epoch   6 Batch   50/269 - Train Accuracy: 0.9496, Validation Accuracy: 0.9542, Loss: 0.0420
Epoch   6 Batch   60/269 - Train Accuracy: 0.9617, Validation Accuracy: 0.9570, Loss: 0.0325
Epoch   6 Batch   70/269 - Train Accuracy: 0.9592, Validation Accuracy: 0.9580, Loss: 0.0378
Epoch   6 Batch   80/269 - Train Accuracy: 0.9565, Validation Accuracy: 0.9584, Loss: 0.0348
Epoch   6 Batch   90/269 - Train Accuracy: 0.9582, Validation Accuracy: 0.9515, Loss: 0.0382
Epoch   6 Batch  100/269 - Train Accuracy: 0.9536, Validation Accuracy: 0.9497, Loss: 0.0353
Epoch   6 Batch  110/269 - Train Accuracy: 0.9602, Validation Accuracy: 0.9586, Loss: 0.0299
Epoch   6 Batch  120/269 - Train Accuracy: 0.9704, Validation Accuracy: 0.9573, Loss: 0.0363
Epoch   6 Batch  130/269 - Train Accuracy: 0.9630, Validation Accuracy: 0.9527, Loss: 0.0340
Epoch   6 Batch  140/269 - Train Accuracy: 0.9584, Validation Accuracy: 0.9600, Loss: 0.0379
Epoch   6 Batch  150/269 - Train Accuracy: 0.9578, Validation Accuracy: 0.9611, Loss: 0.0336
Epoch   6 Batch  160/269 - Train Accuracy: 0.9623, Validation Accuracy: 0.9630, Loss: 0.0326
Epoch   6 Batch  170/269 - Train Accuracy: 0.9563, Validation Accuracy: 0.9579, Loss: 0.0316
Epoch   6 Batch  180/269 - Train Accuracy: 0.9763, Validation Accuracy: 0.9632, Loss: 0.0277
Epoch   6 Batch  190/269 - Train Accuracy: 0.9622, Validation Accuracy: 0.9531, Loss: 0.0346
Epoch   6 Batch  200/269 - Train Accuracy: 0.9629, Validation Accuracy: 0.9581, Loss: 0.0307
Epoch   6 Batch  210/269 - Train Accuracy: 0.9569, Validation Accuracy: 0.9637, Loss: 0.0271
Epoch   6 Batch  220/269 - Train Accuracy: 0.9652, Validation Accuracy: 0.9592, Loss: 0.0283
Epoch   6 Batch  230/269 - Train Accuracy: 0.9634, Validation Accuracy: 0.9731, Loss: 0.0295
Epoch   6 Batch  240/269 - Train Accuracy: 0.9626, Validation Accuracy: 0.9648, Loss: 0.0274
Epoch   6 Batch  250/269 - Train Accuracy: 0.9633, Validation Accuracy: 0.9569, Loss: 0.0272
Epoch   6 Batch  260/269 - Train Accuracy: 0.9610, Validation Accuracy: 0.9649, Loss: 0.0330
Epoch   7 Batch   10/269 - Train Accuracy: 0.9731, Validation Accuracy: 0.9598, Loss: 0.0230
Epoch   7 Batch   20/269 - Train Accuracy: 0.9712, Validation Accuracy: 0.9609, Loss: 0.0282
Epoch   7 Batch   30/269 - Train Accuracy: 0.9696, Validation Accuracy: 0.9618, Loss: 0.0285
Epoch   7 Batch   40/269 - Train Accuracy: 0.9681, Validation Accuracy: 0.9642, Loss: 0.0306
Epoch   7 Batch   50/269 - Train Accuracy: 0.9535, Validation Accuracy: 0.9640, Loss: 0.0338
Epoch   7 Batch   60/269 - Train Accuracy: 0.9735, Validation Accuracy: 0.9633, Loss: 0.0265
Epoch   7 Batch   70/269 - Train Accuracy: 0.9672, Validation Accuracy: 0.9625, Loss: 0.0287
Epoch   7 Batch   80/269 - Train Accuracy: 0.9660, Validation Accuracy: 0.9640, Loss: 0.0250
Epoch   7 Batch   90/269 - Train Accuracy: 0.9758, Validation Accuracy: 0.9600, Loss: 0.0269
Epoch   7 Batch  100/269 - Train Accuracy: 0.9690, Validation Accuracy: 0.9590, Loss: 0.0293
Epoch   7 Batch  110/269 - Train Accuracy: 0.9625, Validation Accuracy: 0.9545, Loss: 0.0255
Epoch   7 Batch  120/269 - Train Accuracy: 0.9693, Validation Accuracy: 0.9669, Loss: 0.0310
Epoch   7 Batch  130/269 - Train Accuracy: 0.9705, Validation Accuracy: 0.9664, Loss: 0.0294
Epoch   7 Batch  140/269 - Train Accuracy: 0.9661, Validation Accuracy: 0.9629, Loss: 0.0309
Epoch   7 Batch  150/269 - Train Accuracy: 0.9632, Validation Accuracy: 0.9688, Loss: 0.0277
Epoch   7 Batch  160/269 - Train Accuracy: 0.9664, Validation Accuracy: 0.9614, Loss: 0.0263
Epoch   7 Batch  170/269 - Train Accuracy: 0.9621, Validation Accuracy: 0.9672, Loss: 0.0249
Epoch   7 Batch  180/269 - Train Accuracy: 0.9717, Validation Accuracy: 0.9626, Loss: 0.0236
Epoch   7 Batch  190/269 - Train Accuracy: 0.9658, Validation Accuracy: 0.9624, Loss: 0.0273
Epoch   7 Batch  200/269 - Train Accuracy: 0.9798, Validation Accuracy: 0.9596, Loss: 0.0223
Epoch   7 Batch  210/269 - Train Accuracy: 0.9664, Validation Accuracy: 0.9664, Loss: 0.0243
Epoch   7 Batch  220/269 - Train Accuracy: 0.9661, Validation Accuracy: 0.9587, Loss: 0.0260
Epoch   7 Batch  230/269 - Train Accuracy: 0.9699, Validation Accuracy: 0.9673, Loss: 0.0275
Epoch   7 Batch  240/269 - Train Accuracy: 0.9721, Validation Accuracy: 0.9656, Loss: 0.0221
Epoch   7 Batch  250/269 - Train Accuracy: 0.9704, Validation Accuracy: 0.9679, Loss: 0.0245
Epoch   7 Batch  260/269 - Train Accuracy: 0.9729, Validation Accuracy: 0.9696, Loss: 0.0248
Epoch   8 Batch   10/269 - Train Accuracy: 0.9753, Validation Accuracy: 0.9640, Loss: 0.0224
Epoch   8 Batch   20/269 - Train Accuracy: 0.9734, Validation Accuracy: 0.9637, Loss: 0.0216
Epoch   8 Batch   30/269 - Train Accuracy: 0.9698, Validation Accuracy: 0.9643, Loss: 0.0252
Epoch   8 Batch   40/269 - Train Accuracy: 0.9737, Validation Accuracy: 0.9668, Loss: 0.0266
Epoch   8 Batch   50/269 - Train Accuracy: 0.9652, Validation Accuracy: 0.9664, Loss: 0.0290
Epoch   8 Batch   60/269 - Train Accuracy: 0.9793, Validation Accuracy: 0.9693, Loss: 0.0224
Epoch   8 Batch   70/269 - Train Accuracy: 0.9785, Validation Accuracy: 0.9596, Loss: 0.0256
Epoch   8 Batch   80/269 - Train Accuracy: 0.9704, Validation Accuracy: 0.9633, Loss: 0.0243
Epoch   8 Batch   90/269 - Train Accuracy: 0.9685, Validation Accuracy: 0.9667, Loss: 0.0220
Epoch   8 Batch  100/269 - Train Accuracy: 0.9702, Validation Accuracy: 0.9659, Loss: 0.0217
Epoch   8 Batch  110/269 - Train Accuracy: 0.9690, Validation Accuracy: 0.9661, Loss: 0.0202
Epoch   8 Batch  120/269 - Train Accuracy: 0.9703, Validation Accuracy: 0.9687, Loss: 0.0246
Epoch   8 Batch  130/269 - Train Accuracy: 0.9745, Validation Accuracy: 0.9687, Loss: 0.0270
Epoch   8 Batch  140/269 - Train Accuracy: 0.9745, Validation Accuracy: 0.9712, Loss: 0.0234
Epoch   8 Batch  150/269 - Train Accuracy: 0.9706, Validation Accuracy: 0.9731, Loss: 0.0251
Epoch   8 Batch  160/269 - Train Accuracy: 0.9753, Validation Accuracy: 0.9674, Loss: 0.0209
Epoch   8 Batch  170/269 - Train Accuracy: 0.9741, Validation Accuracy: 0.9706, Loss: 0.0205
Epoch   8 Batch  180/269 - Train Accuracy: 0.9788, Validation Accuracy: 0.9733, Loss: 0.0250
Epoch   8 Batch  190/269 - Train Accuracy: 0.9759, Validation Accuracy: 0.9706, Loss: 0.0221
Epoch   8 Batch  200/269 - Train Accuracy: 0.9765, Validation Accuracy: 0.9688, Loss: 0.0191
Epoch   8 Batch  210/269 - Train Accuracy: 0.9718, Validation Accuracy: 0.9653, Loss: 0.0205
Epoch   8 Batch  220/269 - Train Accuracy: 0.9724, Validation Accuracy: 0.9606, Loss: 0.0221
Epoch   8 Batch  230/269 - Train Accuracy: 0.9774, Validation Accuracy: 0.9703, Loss: 0.0200
Epoch   8 Batch  240/269 - Train Accuracy: 0.9733, Validation Accuracy: 0.9728, Loss: 0.0209
Epoch   8 Batch  250/269 - Train Accuracy: 0.9738, Validation Accuracy: 0.9640, Loss: 0.0214
Epoch   8 Batch  260/269 - Train Accuracy: 0.9771, Validation Accuracy: 0.9737, Loss: 0.0240
Epoch   9 Batch   10/269 - Train Accuracy: 0.9863, Validation Accuracy: 0.9680, Loss: 0.0182
Epoch   9 Batch   20/269 - Train Accuracy: 0.9798, Validation Accuracy: 0.9694, Loss: 0.0193
Epoch   9 Batch   30/269 - Train Accuracy: 0.9678, Validation Accuracy: 0.9670, Loss: 0.0258
Epoch   9 Batch   40/269 - Train Accuracy: 0.9751, Validation Accuracy: 0.9654, Loss: 0.0213
Epoch   9 Batch   50/269 - Train Accuracy: 0.9662, Validation Accuracy: 0.9681, Loss: 0.0256
Epoch   9 Batch   60/269 - Train Accuracy: 0.9779, Validation Accuracy: 0.9673, Loss: 0.0219
Epoch   9 Batch   70/269 - Train Accuracy: 0.9747, Validation Accuracy: 0.9577, Loss: 0.0202
Epoch   9 Batch   80/269 - Train Accuracy: 0.9768, Validation Accuracy: 0.9666, Loss: 0.0202
Epoch   9 Batch   90/269 - Train Accuracy: 0.9680, Validation Accuracy: 0.9690, Loss: 0.0244
Epoch   9 Batch  100/269 - Train Accuracy: 0.9715, Validation Accuracy: 0.9718, Loss: 0.0228
Epoch   9 Batch  110/269 - Train Accuracy: 0.9750, Validation Accuracy: 0.9584, Loss: 0.0182
Epoch   9 Batch  120/269 - Train Accuracy: 0.9661, Validation Accuracy: 0.9659, Loss: 0.0245
Epoch   9 Batch  130/269 - Train Accuracy: 0.9750, Validation Accuracy: 0.9727, Loss: 0.0224
Epoch   9 Batch  140/269 - Train Accuracy: 0.9767, Validation Accuracy: 0.9682, Loss: 0.0257
Epoch   9 Batch  150/269 - Train Accuracy: 0.9800, Validation Accuracy: 0.9666, Loss: 0.0213
Epoch   9 Batch  160/269 - Train Accuracy: 0.9740, Validation Accuracy: 0.9654, Loss: 0.0302
Epoch   9 Batch  170/269 - Train Accuracy: 0.9640, Validation Accuracy: 0.9595, Loss: 0.0370
Epoch   9 Batch  180/269 - Train Accuracy: 0.9796, Validation Accuracy: 0.9622, Loss: 0.0260
Epoch   9 Batch  190/269 - Train Accuracy: 0.9750, Validation Accuracy: 0.9756, Loss: 0.0294
Epoch   9 Batch  200/269 - Train Accuracy: 0.9800, Validation Accuracy: 0.9607, Loss: 0.0197
Epoch   9 Batch  210/269 - Train Accuracy: 0.9680, Validation Accuracy: 0.9703, Loss: 0.0230
Epoch   9 Batch  220/269 - Train Accuracy: 0.9729, Validation Accuracy: 0.9690, Loss: 0.0224
Epoch   9 Batch  230/269 - Train Accuracy: 0.9762, Validation Accuracy: 0.9713, Loss: 0.0216
Epoch   9 Batch  240/269 - Train Accuracy: 0.9752, Validation Accuracy: 0.9656, Loss: 0.0184
Epoch   9 Batch  250/269 - Train Accuracy: 0.9746, Validation Accuracy: 0.9688, Loss: 0.0198
Epoch   9 Batch  260/269 - Train Accuracy: 0.9831, Validation Accuracy: 0.9662, Loss: 0.0199
Epoch  10 Batch   10/269 - Train Accuracy: 0.9696, Validation Accuracy: 0.9705, Loss: 0.0165
Epoch  10 Batch   20/269 - Train Accuracy: 0.9817, Validation Accuracy: 0.9686, Loss: 0.0188
Epoch  10 Batch   30/269 - Train Accuracy: 0.9759, Validation Accuracy: 0.9734, Loss: 0.0211
Epoch  10 Batch   40/269 - Train Accuracy: 0.9751, Validation Accuracy: 0.9706, Loss: 0.0224
Epoch  10 Batch   50/269 - Train Accuracy: 0.9695, Validation Accuracy: 0.9726, Loss: 0.0225
Epoch  10 Batch   60/269 - Train Accuracy: 0.9787, Validation Accuracy: 0.9765, Loss: 0.0185
Epoch  10 Batch   70/269 - Train Accuracy: 0.9778, Validation Accuracy: 0.9711, Loss: 0.0182
Epoch  10 Batch   80/269 - Train Accuracy: 0.9837, Validation Accuracy: 0.9788, Loss: 0.0171
Epoch  10 Batch   90/269 - Train Accuracy: 0.9754, Validation Accuracy: 0.9751, Loss: 0.0187
Epoch  10 Batch  100/269 - Train Accuracy: 0.9765, Validation Accuracy: 0.9684, Loss: 0.0185
Epoch  10 Batch  110/269 - Train Accuracy: 0.9757, Validation Accuracy: 0.9619, Loss: 0.0156
Epoch  10 Batch  120/269 - Train Accuracy: 0.9754, Validation Accuracy: 0.9742, Loss: 0.0176
Epoch  10 Batch  130/269 - Train Accuracy: 0.9783, Validation Accuracy: 0.9735, Loss: 0.0228
Epoch  10 Batch  140/269 - Train Accuracy: 0.9729, Validation Accuracy: 0.9755, Loss: 0.0175
Epoch  10 Batch  150/269 - Train Accuracy: 0.9811, Validation Accuracy: 0.9735, Loss: 0.0198
Epoch  10 Batch  160/269 - Train Accuracy: 0.9811, Validation Accuracy: 0.9693, Loss: 0.0178
Epoch  10 Batch  170/269 - Train Accuracy: 0.9729, Validation Accuracy: 0.9684, Loss: 0.0182
Epoch  10 Batch  180/269 - Train Accuracy: 0.9873, Validation Accuracy: 0.9719, Loss: 0.0148
Epoch  10 Batch  190/269 - Train Accuracy: 0.9810, Validation Accuracy: 0.9764, Loss: 0.0186
Epoch  10 Batch  200/269 - Train Accuracy: 0.9879, Validation Accuracy: 0.9693, Loss: 0.0162
Epoch  10 Batch  210/269 - Train Accuracy: 0.9774, Validation Accuracy: 0.9679, Loss: 0.0154
Epoch  10 Batch  220/269 - Train Accuracy: 0.9757, Validation Accuracy: 0.9733, Loss: 0.0163
Epoch  10 Batch  230/269 - Train Accuracy: 0.9849, Validation Accuracy: 0.9813, Loss: 0.0177
Epoch  10 Batch  240/269 - Train Accuracy: 0.9817, Validation Accuracy: 0.9712, Loss: 0.0157
Epoch  10 Batch  250/269 - Train Accuracy: 0.9802, Validation Accuracy: 0.9742, Loss: 0.0163
Epoch  10 Batch  260/269 - Train Accuracy: 0.9811, Validation Accuracy: 0.9756, Loss: 0.0173
Epoch  11 Batch   10/269 - Train Accuracy: 0.9824, Validation Accuracy: 0.9720, Loss: 0.0139
Epoch  11 Batch   20/269 - Train Accuracy: 0.9840, Validation Accuracy: 0.9704, Loss: 0.0160
Epoch  11 Batch   30/269 - Train Accuracy: 0.9766, Validation Accuracy: 0.9764, Loss: 0.0203
Epoch  11 Batch   40/269 - Train Accuracy: 0.9824, Validation Accuracy: 0.9701, Loss: 0.0176
Epoch  11 Batch   50/269 - Train Accuracy: 0.9723, Validation Accuracy: 0.9762, Loss: 0.0218
Epoch  11 Batch   60/269 - Train Accuracy: 0.9766, Validation Accuracy: 0.9751, Loss: 0.0179
Epoch  11 Batch   70/269 - Train Accuracy: 0.9820, Validation Accuracy: 0.9733, Loss: 0.0160
Epoch  11 Batch   80/269 - Train Accuracy: 0.9843, Validation Accuracy: 0.9680, Loss: 0.0125
Epoch  11 Batch   90/269 - Train Accuracy: 0.9785, Validation Accuracy: 0.9702, Loss: 0.0165
Epoch  11 Batch  100/269 - Train Accuracy: 0.9799, Validation Accuracy: 0.9734, Loss: 0.0181
Epoch  11 Batch  110/269 - Train Accuracy: 0.9774, Validation Accuracy: 0.9719, Loss: 0.0144
Epoch  11 Batch  120/269 - Train Accuracy: 0.9788, Validation Accuracy: 0.9692, Loss: 0.0167
Epoch  11 Batch  130/269 - Train Accuracy: 0.9792, Validation Accuracy: 0.9762, Loss: 0.0184
Epoch  11 Batch  140/269 - Train Accuracy: 0.9773, Validation Accuracy: 0.9732, Loss: 0.0162
Epoch  11 Batch  150/269 - Train Accuracy: 0.9815, Validation Accuracy: 0.9749, Loss: 0.0186
Epoch  11 Batch  160/269 - Train Accuracy: 0.9825, Validation Accuracy: 0.9792, Loss: 0.0164
Epoch  11 Batch  170/269 - Train Accuracy: 0.9784, Validation Accuracy: 0.9698, Loss: 0.0145
Epoch  11 Batch  180/269 - Train Accuracy: 0.9881, Validation Accuracy: 0.9776, Loss: 0.0122
Epoch  11 Batch  190/269 - Train Accuracy: 0.9813, Validation Accuracy: 0.9788, Loss: 0.0160
Epoch  11 Batch  200/269 - Train Accuracy: 0.9901, Validation Accuracy: 0.9661, Loss: 0.0138
Epoch  11 Batch  210/269 - Train Accuracy: 0.9884, Validation Accuracy: 0.9703, Loss: 0.0149
Epoch  11 Batch  220/269 - Train Accuracy: 0.9844, Validation Accuracy: 0.9702, Loss: 0.0133
Epoch  11 Batch  230/269 - Train Accuracy: 0.9819, Validation Accuracy: 0.9744, Loss: 0.0148
Epoch  11 Batch  240/269 - Train Accuracy: 0.9791, Validation Accuracy: 0.9667, Loss: 0.0176
Epoch  11 Batch  250/269 - Train Accuracy: 0.9832, Validation Accuracy: 0.9705, Loss: 0.0120
Epoch  11 Batch  260/269 - Train Accuracy: 0.9836, Validation Accuracy: 0.9745, Loss: 0.0134
Epoch  12 Batch   10/269 - Train Accuracy: 0.9872, Validation Accuracy: 0.9743, Loss: 0.0127
Epoch  12 Batch   20/269 - Train Accuracy: 0.9850, Validation Accuracy: 0.9770, Loss: 0.0136
Epoch  12 Batch   30/269 - Train Accuracy: 0.9762, Validation Accuracy: 0.9750, Loss: 0.0204
Epoch  12 Batch   40/269 - Train Accuracy: 0.9841, Validation Accuracy: 0.9653, Loss: 0.0140
Epoch  12 Batch   50/269 - Train Accuracy: 0.9744, Validation Accuracy: 0.9725, Loss: 0.0180
Epoch  12 Batch   60/269 - Train Accuracy: 0.9842, Validation Accuracy: 0.9787, Loss: 0.0141
Epoch  12 Batch   70/269 - Train Accuracy: 0.9794, Validation Accuracy: 0.9712, Loss: 0.0141
Epoch  12 Batch   80/269 - Train Accuracy: 0.9883, Validation Accuracy: 0.9661, Loss: 0.0140
Epoch  12 Batch   90/269 - Train Accuracy: 0.9776, Validation Accuracy: 0.9751, Loss: 0.0178
Epoch  12 Batch  100/269 - Train Accuracy: 0.9848, Validation Accuracy: 0.9692, Loss: 0.0165
Epoch  12 Batch  110/269 - Train Accuracy: 0.9847, Validation Accuracy: 0.9658, Loss: 0.0111
Epoch  12 Batch  120/269 - Train Accuracy: 0.9781, Validation Accuracy: 0.9745, Loss: 0.0194
Epoch  12 Batch  130/269 - Train Accuracy: 0.9752, Validation Accuracy: 0.9766, Loss: 0.0205
Epoch  12 Batch  140/269 - Train Accuracy: 0.9817, Validation Accuracy: 0.9749, Loss: 0.0189
Epoch  12 Batch  150/269 - Train Accuracy: 0.9823, Validation Accuracy: 0.9749, Loss: 0.0168
Epoch  12 Batch  160/269 - Train Accuracy: 0.9807, Validation Accuracy: 0.9751, Loss: 0.0150
Epoch  12 Batch  170/269 - Train Accuracy: 0.9807, Validation Accuracy: 0.9751, Loss: 0.0151
Epoch  12 Batch  180/269 - Train Accuracy: 0.9861, Validation Accuracy: 0.9741, Loss: 0.0129
Epoch  12 Batch  190/269 - Train Accuracy: 0.9825, Validation Accuracy: 0.9731, Loss: 0.0150
Epoch  12 Batch  200/269 - Train Accuracy: 0.9919, Validation Accuracy: 0.9704, Loss: 0.0109
Epoch  12 Batch  210/269 - Train Accuracy: 0.9837, Validation Accuracy: 0.9731, Loss: 0.0108
Epoch  12 Batch  220/269 - Train Accuracy: 0.9849, Validation Accuracy: 0.9751, Loss: 0.0129
Epoch  12 Batch  230/269 - Train Accuracy: 0.9863, Validation Accuracy: 0.9805, Loss: 0.0112
Epoch  12 Batch  240/269 - Train Accuracy: 0.9822, Validation Accuracy: 0.9703, Loss: 0.0134
Epoch  12 Batch  250/269 - Train Accuracy: 0.9844, Validation Accuracy: 0.9678, Loss: 0.0148
Epoch  12 Batch  260/269 - Train Accuracy: 0.9892, Validation Accuracy: 0.9756, Loss: 0.0167
Epoch  13 Batch   10/269 - Train Accuracy: 0.9848, Validation Accuracy: 0.9838, Loss: 0.0126
Epoch  13 Batch   20/269 - Train Accuracy: 0.9857, Validation Accuracy: 0.9753, Loss: 0.0123
Epoch  13 Batch   30/269 - Train Accuracy: 0.9799, Validation Accuracy: 0.9815, Loss: 0.0173
Epoch  13 Batch   40/269 - Train Accuracy: 0.9843, Validation Accuracy: 0.9767, Loss: 0.0144
Epoch  13 Batch   50/269 - Train Accuracy: 0.9758, Validation Accuracy: 0.9762, Loss: 0.0201
Epoch  13 Batch   60/269 - Train Accuracy: 0.9847, Validation Accuracy: 0.9790, Loss: 0.0147
Epoch  13 Batch   70/269 - Train Accuracy: 0.9888, Validation Accuracy: 0.9733, Loss: 0.0149
Epoch  13 Batch   80/269 - Train Accuracy: 0.9837, Validation Accuracy: 0.9751, Loss: 0.0113
Epoch  13 Batch   90/269 - Train Accuracy: 0.9777, Validation Accuracy: 0.9760, Loss: 0.0144
Epoch  13 Batch  100/269 - Train Accuracy: 0.9824, Validation Accuracy: 0.9703, Loss: 0.0148
Epoch  13 Batch  110/269 - Train Accuracy: 0.9910, Validation Accuracy: 0.9759, Loss: 0.0107
Epoch  13 Batch  120/269 - Train Accuracy: 0.9802, Validation Accuracy: 0.9771, Loss: 0.0161
Epoch  13 Batch  130/269 - Train Accuracy: 0.9805, Validation Accuracy: 0.9727, Loss: 0.0179
Epoch  13 Batch  140/269 - Train Accuracy: 0.9820, Validation Accuracy: 0.9827, Loss: 0.0147
Epoch  13 Batch  150/269 - Train Accuracy: 0.9880, Validation Accuracy: 0.9709, Loss: 0.0184
Epoch  13 Batch  160/269 - Train Accuracy: 0.9883, Validation Accuracy: 0.9720, Loss: 0.0124
Epoch  13 Batch  170/269 - Train Accuracy: 0.9856, Validation Accuracy: 0.9719, Loss: 0.0126
Epoch  13 Batch  180/269 - Train Accuracy: 0.9848, Validation Accuracy: 0.9772, Loss: 0.0139
Epoch  13 Batch  190/269 - Train Accuracy: 0.9849, Validation Accuracy: 0.9772, Loss: 0.0149
Epoch  13 Batch  200/269 - Train Accuracy: 0.9895, Validation Accuracy: 0.9723, Loss: 0.0091
Epoch  13 Batch  210/269 - Train Accuracy: 0.9872, Validation Accuracy: 0.9719, Loss: 0.0137
Epoch  13 Batch  220/269 - Train Accuracy: 0.9860, Validation Accuracy: 0.9757, Loss: 0.0122
Epoch  13 Batch  230/269 - Train Accuracy: 0.9867, Validation Accuracy: 0.9781, Loss: 0.0112
Epoch  13 Batch  240/269 - Train Accuracy: 0.9843, Validation Accuracy: 0.9789, Loss: 0.0130
Epoch  13 Batch  250/269 - Train Accuracy: 0.9854, Validation Accuracy: 0.9783, Loss: 0.0131
Epoch  13 Batch  260/269 - Train Accuracy: 0.9864, Validation Accuracy: 0.9745, Loss: 0.0123
Epoch  14 Batch   10/269 - Train Accuracy: 0.9821, Validation Accuracy: 0.9794, Loss: 0.0108
Epoch  14 Batch   20/269 - Train Accuracy: 0.9807, Validation Accuracy: 0.9743, Loss: 0.0107
Epoch  14 Batch   30/269 - Train Accuracy: 0.9792, Validation Accuracy: 0.9753, Loss: 0.0170
Epoch  14 Batch   40/269 - Train Accuracy: 0.9802, Validation Accuracy: 0.9773, Loss: 0.0143
Epoch  14 Batch   50/269 - Train Accuracy: 0.9882, Validation Accuracy: 0.9759, Loss: 0.0158
Epoch  14 Batch   60/269 - Train Accuracy: 0.9806, Validation Accuracy: 0.9766, Loss: 0.0163
Epoch  14 Batch   70/269 - Train Accuracy: 0.9844, Validation Accuracy: 0.9715, Loss: 0.0139
Epoch  14 Batch   80/269 - Train Accuracy: 0.9900, Validation Accuracy: 0.9701, Loss: 0.0110
Epoch  14 Batch   90/269 - Train Accuracy: 0.9836, Validation Accuracy: 0.9693, Loss: 0.0139
Epoch  14 Batch  100/269 - Train Accuracy: 0.9845, Validation Accuracy: 0.9617, Loss: 0.0136
Epoch  14 Batch  110/269 - Train Accuracy: 0.9880, Validation Accuracy: 0.9770, Loss: 0.0087
Epoch  14 Batch  120/269 - Train Accuracy: 0.9834, Validation Accuracy: 0.9732, Loss: 0.0169
Epoch  14 Batch  130/269 - Train Accuracy: 0.9862, Validation Accuracy: 0.9721, Loss: 0.0135
Epoch  14 Batch  140/269 - Train Accuracy: 0.9829, Validation Accuracy: 0.9770, Loss: 0.0123
Epoch  14 Batch  150/269 - Train Accuracy: 0.9900, Validation Accuracy: 0.9746, Loss: 0.0132
Epoch  14 Batch  160/269 - Train Accuracy: 0.9853, Validation Accuracy: 0.9735, Loss: 0.0124
Epoch  14 Batch  170/269 - Train Accuracy: 0.9822, Validation Accuracy: 0.9775, Loss: 0.0146
Epoch  14 Batch  180/269 - Train Accuracy: 0.9879, Validation Accuracy: 0.9775, Loss: 0.0120
Epoch  14 Batch  190/269 - Train Accuracy: 0.9871, Validation Accuracy: 0.9766, Loss: 0.0103
Epoch  14 Batch  200/269 - Train Accuracy: 0.9877, Validation Accuracy: 0.9719, Loss: 0.0107
Epoch  14 Batch  210/269 - Train Accuracy: 0.9882, Validation Accuracy: 0.9705, Loss: 0.0118
Epoch  14 Batch  220/269 - Train Accuracy: 0.9875, Validation Accuracy: 0.9800, Loss: 0.0113
Epoch  14 Batch  230/269 - Train Accuracy: 0.9860, Validation Accuracy: 0.9804, Loss: 0.0119
Epoch  14 Batch  240/269 - Train Accuracy: 0.9864, Validation Accuracy: 0.9779, Loss: 0.0128
Epoch  14 Batch  250/269 - Train Accuracy: 0.9875, Validation Accuracy: 0.9774, Loss: 0.0122
Epoch  14 Batch  260/269 - Train Accuracy: 0.9863, Validation Accuracy: 0.9700, Loss: 0.0146
Epoch  15 Batch   10/269 - Train Accuracy: 0.9838, Validation Accuracy: 0.9830, Loss: 0.0114
Epoch  15 Batch   20/269 - Train Accuracy: 0.9850, Validation Accuracy: 0.9720, Loss: 0.0100
Epoch  15 Batch   30/269 - Train Accuracy: 0.9830, Validation Accuracy: 0.9816, Loss: 0.0161
Epoch  15 Batch   40/269 - Train Accuracy: 0.9835, Validation Accuracy: 0.9687, Loss: 0.0152
Epoch  15 Batch   50/269 - Train Accuracy: 0.9788, Validation Accuracy: 0.9800, Loss: 0.0146
Epoch  15 Batch   60/269 - Train Accuracy: 0.9834, Validation Accuracy: 0.9792, Loss: 0.0151
Epoch  15 Batch   70/269 - Train Accuracy: 0.9879, Validation Accuracy: 0.9817, Loss: 0.0120
Epoch  15 Batch   80/269 - Train Accuracy: 0.9857, Validation Accuracy: 0.9836, Loss: 0.0125
Epoch  15 Batch   90/269 - Train Accuracy: 0.9834, Validation Accuracy: 0.9716, Loss: 0.0122
Epoch  15 Batch  100/269 - Train Accuracy: 0.9874, Validation Accuracy: 0.9689, Loss: 0.0140
Epoch  15 Batch  110/269 - Train Accuracy: 0.9884, Validation Accuracy: 0.9769, Loss: 0.0092
Epoch  15 Batch  120/269 - Train Accuracy: 0.9852, Validation Accuracy: 0.9785, Loss: 0.0128
Epoch  15 Batch  130/269 - Train Accuracy: 0.9841, Validation Accuracy: 0.9728, Loss: 0.0148
Epoch  15 Batch  140/269 - Train Accuracy: 0.9839, Validation Accuracy: 0.9797, Loss: 0.0134
Epoch  15 Batch  150/269 - Train Accuracy: 0.9853, Validation Accuracy: 0.9790, Loss: 0.0153
Epoch  15 Batch  160/269 - Train Accuracy: 0.9850, Validation Accuracy: 0.9761, Loss: 0.0116
Epoch  15 Batch  170/269 - Train Accuracy: 0.9813, Validation Accuracy: 0.9802, Loss: 0.0144
Epoch  15 Batch  180/269 - Train Accuracy: 0.9895, Validation Accuracy: 0.9804, Loss: 0.0113
Epoch  15 Batch  190/269 - Train Accuracy: 0.9842, Validation Accuracy: 0.9798, Loss: 0.0111
Epoch  15 Batch  200/269 - Train Accuracy: 0.9917, Validation Accuracy: 0.9746, Loss: 0.0094
Epoch  15 Batch  210/269 - Train Accuracy: 0.9914, Validation Accuracy: 0.9710, Loss: 0.0088
Epoch  15 Batch  220/269 - Train Accuracy: 0.9834, Validation Accuracy: 0.9765, Loss: 0.0098
Epoch  15 Batch  230/269 - Train Accuracy: 0.9871, Validation Accuracy: 0.9811, Loss: 0.0098
Epoch  15 Batch  240/269 - Train Accuracy: 0.9830, Validation Accuracy: 0.9740, Loss: 0.0113
Epoch  15 Batch  250/269 - Train Accuracy: 0.9884, Validation Accuracy: 0.9733, Loss: 0.0133
Epoch  15 Batch  260/269 - Train Accuracy: 0.9886, Validation Accuracy: 0.9819, Loss: 0.0104
Epoch  16 Batch   10/269 - Train Accuracy: 0.9894, Validation Accuracy: 0.9796, Loss: 0.0092
Epoch  16 Batch   20/269 - Train Accuracy: 0.9811, Validation Accuracy: 0.9720, Loss: 0.0123
Epoch  16 Batch   30/269 - Train Accuracy: 0.9828, Validation Accuracy: 0.9758, Loss: 0.0128
Epoch  16 Batch   40/269 - Train Accuracy: 0.9833, Validation Accuracy: 0.9728, Loss: 0.0162
Epoch  16 Batch   50/269 - Train Accuracy: 0.9841, Validation Accuracy: 0.9773, Loss: 0.0125
Epoch  16 Batch   60/269 - Train Accuracy: 0.9799, Validation Accuracy: 0.9826, Loss: 0.0116
Epoch  16 Batch   70/269 - Train Accuracy: 0.9928, Validation Accuracy: 0.9783, Loss: 0.0138
Epoch  16 Batch   80/269 - Train Accuracy: 0.9895, Validation Accuracy: 0.9794, Loss: 0.0096
Epoch  16 Batch   90/269 - Train Accuracy: 0.9812, Validation Accuracy: 0.9758, Loss: 0.0130
Epoch  16 Batch  100/269 - Train Accuracy: 0.9834, Validation Accuracy: 0.9732, Loss: 0.0114
Epoch  16 Batch  110/269 - Train Accuracy: 0.9875, Validation Accuracy: 0.9745, Loss: 0.0102
Epoch  16 Batch  120/269 - Train Accuracy: 0.9883, Validation Accuracy: 0.9779, Loss: 0.0138
Epoch  16 Batch  130/269 - Train Accuracy: 0.9907, Validation Accuracy: 0.9716, Loss: 0.0491
Epoch  16 Batch  140/269 - Train Accuracy: 0.9860, Validation Accuracy: 0.9770, Loss: 0.0187
Epoch  16 Batch  150/269 - Train Accuracy: 0.9887, Validation Accuracy: 0.9810, Loss: 0.0201
Epoch  16 Batch  160/269 - Train Accuracy: 0.9844, Validation Accuracy: 0.9746, Loss: 0.0168
Epoch  16 Batch  170/269 - Train Accuracy: 0.9851, Validation Accuracy: 0.9735, Loss: 0.0151
Epoch  16 Batch  180/269 - Train Accuracy: 0.9881, Validation Accuracy: 0.9782, Loss: 0.0125
Epoch  16 Batch  190/269 - Train Accuracy: 0.9833, Validation Accuracy: 0.9798, Loss: 0.0122
Epoch  16 Batch  200/269 - Train Accuracy: 0.9920, Validation Accuracy: 0.9785, Loss: 0.0098
Epoch  16 Batch  210/269 - Train Accuracy: 0.9870, Validation Accuracy: 0.9798, Loss: 0.0104
Epoch  16 Batch  220/269 - Train Accuracy: 0.9848, Validation Accuracy: 0.9779, Loss: 0.0111
Epoch  16 Batch  230/269 - Train Accuracy: 0.9886, Validation Accuracy: 0.9817, Loss: 0.0104
Epoch  16 Batch  240/269 - Train Accuracy: 0.9884, Validation Accuracy: 0.9801, Loss: 0.0118
Epoch  16 Batch  250/269 - Train Accuracy: 0.9876, Validation Accuracy: 0.9821, Loss: 0.0093
Epoch  16 Batch  260/269 - Train Accuracy: 0.9912, Validation Accuracy: 0.9772, Loss: 0.0124
Epoch  17 Batch   10/269 - Train Accuracy: 0.9877, Validation Accuracy: 0.9781, Loss: 0.0106
Epoch  17 Batch   20/269 - Train Accuracy: 0.9875, Validation Accuracy: 0.9742, Loss: 0.0108
Epoch  17 Batch   30/269 - Train Accuracy: 0.9807, Validation Accuracy: 0.9748, Loss: 0.0130
Epoch  17 Batch   40/269 - Train Accuracy: 0.9874, Validation Accuracy: 0.9774, Loss: 0.0096
Epoch  17 Batch   50/269 - Train Accuracy: 0.9828, Validation Accuracy: 0.9770, Loss: 0.0164
Epoch  17 Batch   60/269 - Train Accuracy: 0.9869, Validation Accuracy: 0.9785, Loss: 0.0116
Epoch  17 Batch   70/269 - Train Accuracy: 0.9887, Validation Accuracy: 0.9807, Loss: 0.0109
Epoch  17 Batch   80/269 - Train Accuracy: 0.9860, Validation Accuracy: 0.9806, Loss: 0.0097
Epoch  17 Batch   90/269 - Train Accuracy: 0.9841, Validation Accuracy: 0.9768, Loss: 0.0102
Epoch  17 Batch  100/269 - Train Accuracy: 0.9900, Validation Accuracy: 0.9707, Loss: 0.0117
Epoch  17 Batch  110/269 - Train Accuracy: 0.9907, Validation Accuracy: 0.9695, Loss: 0.0073
Epoch  17 Batch  120/269 - Train Accuracy: 0.9833, Validation Accuracy: 0.9743, Loss: 0.0117
Epoch  17 Batch  130/269 - Train Accuracy: 0.9882, Validation Accuracy: 0.9783, Loss: 0.0136
Epoch  17 Batch  140/269 - Train Accuracy: 0.9890, Validation Accuracy: 0.9797, Loss: 0.0107
Epoch  17 Batch  150/269 - Train Accuracy: 0.9905, Validation Accuracy: 0.9733, Loss: 0.0135
Epoch  17 Batch  160/269 - Train Accuracy: 0.9951, Validation Accuracy: 0.9754, Loss: 0.0100
Epoch  17 Batch  170/269 - Train Accuracy: 0.9846, Validation Accuracy: 0.9806, Loss: 0.0102
Epoch  17 Batch  180/269 - Train Accuracy: 0.9861, Validation Accuracy: 0.9792, Loss: 0.0109
Epoch  17 Batch  190/269 - Train Accuracy: 0.9860, Validation Accuracy: 0.9796, Loss: 0.0084
Epoch  17 Batch  200/269 - Train Accuracy: 0.9920, Validation Accuracy: 0.9771, Loss: 0.0079
Epoch  17 Batch  210/269 - Train Accuracy: 0.9921, Validation Accuracy: 0.9738, Loss: 0.0108
Epoch  17 Batch  220/269 - Train Accuracy: 0.9883, Validation Accuracy: 0.9788, Loss: 0.0107
Epoch  17 Batch  230/269 - Train Accuracy: 0.9905, Validation Accuracy: 0.9738, Loss: 0.0087
Epoch  17 Batch  240/269 - Train Accuracy: 0.9866, Validation Accuracy: 0.9759, Loss: 0.0125
Epoch  17 Batch  250/269 - Train Accuracy: 0.9860, Validation Accuracy: 0.9724, Loss: 0.0112
Epoch  17 Batch  260/269 - Train Accuracy: 0.9846, Validation Accuracy: 0.9789, Loss: 0.0130
Epoch  18 Batch   10/269 - Train Accuracy: 0.9885, Validation Accuracy: 0.9817, Loss: 0.0089
Epoch  18 Batch   20/269 - Train Accuracy: 0.9935, Validation Accuracy: 0.9861, Loss: 0.0086
Epoch  18 Batch   30/269 - Train Accuracy: 0.9894, Validation Accuracy: 0.9817, Loss: 0.0113
Epoch  18 Batch   40/269 - Train Accuracy: 0.9875, Validation Accuracy: 0.9830, Loss: 0.0111
Epoch  18 Batch   50/269 - Train Accuracy: 0.9814, Validation Accuracy: 0.9789, Loss: 0.0133
Epoch  18 Batch   60/269 - Train Accuracy: 0.9832, Validation Accuracy: 0.9825, Loss: 0.0110
Epoch  18 Batch   70/269 - Train Accuracy: 0.9915, Validation Accuracy: 0.9765, Loss: 0.0104
Epoch  18 Batch   80/269 - Train Accuracy: 0.9901, Validation Accuracy: 0.9795, Loss: 0.0081
Epoch  18 Batch   90/269 - Train Accuracy: 0.9816, Validation Accuracy: 0.9767, Loss: 0.0096
Epoch  18 Batch  100/269 - Train Accuracy: 0.9873, Validation Accuracy: 0.9770, Loss: 0.0086
Epoch  18 Batch  110/269 - Train Accuracy: 0.9882, Validation Accuracy: 0.9690, Loss: 0.0083
Epoch  18 Batch  120/269 - Train Accuracy: 0.9869, Validation Accuracy: 0.9790, Loss: 0.0111
Epoch  18 Batch  130/269 - Train Accuracy: 0.9925, Validation Accuracy: 0.9777, Loss: 0.0115
Epoch  18 Batch  140/269 - Train Accuracy: 0.9857, Validation Accuracy: 0.9805, Loss: 0.0140
Epoch  18 Batch  150/269 - Train Accuracy: 0.9889, Validation Accuracy: 0.9781, Loss: 0.0126
Epoch  18 Batch  160/269 - Train Accuracy: 0.9852, Validation Accuracy: 0.9769, Loss: 0.0088
Epoch  18 Batch  170/269 - Train Accuracy: 0.9881, Validation Accuracy: 0.9826, Loss: 0.0126
Epoch  18 Batch  180/269 - Train Accuracy: 0.9893, Validation Accuracy: 0.9791, Loss: 0.0108
Epoch  18 Batch  190/269 - Train Accuracy: 0.9911, Validation Accuracy: 0.9810, Loss: 0.0084
Epoch  18 Batch  200/269 - Train Accuracy: 0.9923, Validation Accuracy: 0.9829, Loss: 0.0100
Epoch  18 Batch  210/269 - Train Accuracy: 0.9885, Validation Accuracy: 0.9745, Loss: 0.0078
Epoch  18 Batch  220/269 - Train Accuracy: 0.9875, Validation Accuracy: 0.9763, Loss: 0.0097
Epoch  18 Batch  230/269 - Train Accuracy: 0.9890, Validation Accuracy: 0.9791, Loss: 0.0099
Epoch  18 Batch  240/269 - Train Accuracy: 0.9856, Validation Accuracy: 0.9683, Loss: 0.0101
Epoch  18 Batch  250/269 - Train Accuracy: 0.9875, Validation Accuracy: 0.9762, Loss: 0.0091
Epoch  18 Batch  260/269 - Train Accuracy: 0.9905, Validation Accuracy: 0.9760, Loss: 0.0089
Epoch  19 Batch   10/269 - Train Accuracy: 0.9920, Validation Accuracy: 0.9809, Loss: 0.0057
Epoch  19 Batch   20/269 - Train Accuracy: 0.9906, Validation Accuracy: 0.9825, Loss: 0.0078
Epoch  19 Batch   30/269 - Train Accuracy: 0.9846, Validation Accuracy: 0.9801, Loss: 0.0090
Epoch  19 Batch   40/269 - Train Accuracy: 0.9916, Validation Accuracy: 0.9793, Loss: 0.0101
Epoch  19 Batch   50/269 - Train Accuracy: 0.9868, Validation Accuracy: 0.9842, Loss: 0.0121
Epoch  19 Batch   60/269 - Train Accuracy: 0.9856, Validation Accuracy: 0.9804, Loss: 0.0114
Epoch  19 Batch   70/269 - Train Accuracy: 0.9923, Validation Accuracy: 0.9737, Loss: 0.0105
Epoch  19 Batch   80/269 - Train Accuracy: 0.9927, Validation Accuracy: 0.9789, Loss: 0.0083
Epoch  19 Batch   90/269 - Train Accuracy: 0.9876, Validation Accuracy: 0.9785, Loss: 0.0083
Epoch  19 Batch  100/269 - Train Accuracy: 0.9929, Validation Accuracy: 0.9760, Loss: 0.0108
Epoch  19 Batch  110/269 - Train Accuracy: 0.9936, Validation Accuracy: 0.9737, Loss: 0.0101
Epoch  19 Batch  120/269 - Train Accuracy: 0.9875, Validation Accuracy: 0.9773, Loss: 0.0117
Epoch  19 Batch  130/269 - Train Accuracy: 0.9880, Validation Accuracy: 0.9816, Loss: 0.0131
Epoch  19 Batch  140/269 - Train Accuracy: 0.9876, Validation Accuracy: 0.9806, Loss: 0.0119
Epoch  19 Batch  150/269 - Train Accuracy: 0.9850, Validation Accuracy: 0.9738, Loss: 0.0111
Epoch  19 Batch  160/269 - Train Accuracy: 0.9872, Validation Accuracy: 0.9767, Loss: 0.0093
Epoch  19 Batch  170/269 - Train Accuracy: 0.9905, Validation Accuracy: 0.9799, Loss: 0.0100
Epoch  19 Batch  180/269 - Train Accuracy: 0.9920, Validation Accuracy: 0.9768, Loss: 0.0098
Epoch  19 Batch  190/269 - Train Accuracy: 0.9913, Validation Accuracy: 0.9717, Loss: 0.0114
Epoch  19 Batch  200/269 - Train Accuracy: 0.9921, Validation Accuracy: 0.9765, Loss: 0.0088
Epoch  19 Batch  210/269 - Train Accuracy: 0.9951, Validation Accuracy: 0.9741, Loss: 0.0081
Epoch  19 Batch  220/269 - Train Accuracy: 0.9909, Validation Accuracy: 0.9768, Loss: 0.0091
Epoch  19 Batch  230/269 - Train Accuracy: 0.9910, Validation Accuracy: 0.9788, Loss: 0.0082
Epoch  19 Batch  240/269 - Train Accuracy: 0.9867, Validation Accuracy: 0.9789, Loss: 0.0096
Epoch  19 Batch  250/269 - Train Accuracy: 0.9885, Validation Accuracy: 0.9771, Loss: 0.0109
Epoch  19 Batch  260/269 - Train Accuracy: 0.9902, Validation Accuracy: 0.9779, Loss: 0.0102
Epoch  20 Batch   10/269 - Train Accuracy: 0.9929, Validation Accuracy: 0.9787, Loss: 0.0086
Epoch  20 Batch   20/269 - Train Accuracy: 0.9912, Validation Accuracy: 0.9763, Loss: 0.0075
Epoch  20 Batch   30/269 - Train Accuracy: 0.9887, Validation Accuracy: 0.9787, Loss: 0.0117
Epoch  20 Batch   40/269 - Train Accuracy: 0.9918, Validation Accuracy: 0.9803, Loss: 0.0081
Epoch  20 Batch   50/269 - Train Accuracy: 0.9885, Validation Accuracy: 0.9814, Loss: 0.0117
Epoch  20 Batch   60/269 - Train Accuracy: 0.9881, Validation Accuracy: 0.9811, Loss: 0.0076
Epoch  20 Batch   70/269 - Train Accuracy: 0.9945, Validation Accuracy: 0.9705, Loss: 0.0090
Epoch  20 Batch   80/269 - Train Accuracy: 0.9880, Validation Accuracy: 0.9749, Loss: 0.0092
Epoch  20 Batch   90/269 - Train Accuracy: 0.9858, Validation Accuracy: 0.9816, Loss: 0.0092
Epoch  20 Batch  100/269 - Train Accuracy: 0.9858, Validation Accuracy: 0.9757, Loss: 0.0100
Epoch  20 Batch  110/269 - Train Accuracy: 0.9959, Validation Accuracy: 0.9776, Loss: 0.0073
Epoch  20 Batch  120/269 - Train Accuracy: 0.9883, Validation Accuracy: 0.9784, Loss: 0.0120
Epoch  20 Batch  130/269 - Train Accuracy: 0.9938, Validation Accuracy: 0.9828, Loss: 0.0117
Epoch  20 Batch  140/269 - Train Accuracy: 0.9912, Validation Accuracy: 0.9834, Loss: 0.0124
Epoch  20 Batch  150/269 - Train Accuracy: 0.9873, Validation Accuracy: 0.9798, Loss: 0.0120
Epoch  20 Batch  160/269 - Train Accuracy: 0.9900, Validation Accuracy: 0.9763, Loss: 0.0081
Epoch  20 Batch  170/269 - Train Accuracy: 0.9849, Validation Accuracy: 0.9819, Loss: 0.0115
Epoch  20 Batch  180/269 - Train Accuracy: 0.9892, Validation Accuracy: 0.9782, Loss: 0.0108
Epoch  20 Batch  190/269 - Train Accuracy: 0.9876, Validation Accuracy: 0.9747, Loss: 0.0114
Epoch  20 Batch  200/269 - Train Accuracy: 0.9960, Validation Accuracy: 0.9741, Loss: 0.0092
Epoch  20 Batch  210/269 - Train Accuracy: 0.9945, Validation Accuracy: 0.9738, Loss: 0.0077
Epoch  20 Batch  220/269 - Train Accuracy: 0.9889, Validation Accuracy: 0.9749, Loss: 0.0107
Epoch  20 Batch  230/269 - Train Accuracy: 0.9929, Validation Accuracy: 0.9841, Loss: 0.0088
Epoch  20 Batch  240/269 - Train Accuracy: 0.9900, Validation Accuracy: 0.9815, Loss: 0.0094
Epoch  20 Batch  250/269 - Train Accuracy: 0.9923, Validation Accuracy: 0.9810, Loss: 0.0090
Epoch  20 Batch  260/269 - Train Accuracy: 0.9898, Validation Accuracy: 0.9822, Loss: 0.0102
Epoch  21 Batch   10/269 - Train Accuracy: 0.9969, Validation Accuracy: 0.9777, Loss: 0.0082
Epoch  21 Batch   20/269 - Train Accuracy: 0.9907, Validation Accuracy: 0.9811, Loss: 0.0086
Epoch  21 Batch   30/269 - Train Accuracy: 0.9897, Validation Accuracy: 0.9801, Loss: 0.0103
Epoch  21 Batch   40/269 - Train Accuracy: 0.9927, Validation Accuracy: 0.9775, Loss: 0.0107
Epoch  21 Batch   50/269 - Train Accuracy: 0.9931, Validation Accuracy: 0.9809, Loss: 0.0093
Epoch  21 Batch   60/269 - Train Accuracy: 0.9876, Validation Accuracy: 0.9830, Loss: 0.0102
Epoch  21 Batch   70/269 - Train Accuracy: 0.9917, Validation Accuracy: 0.9759, Loss: 0.0097
Epoch  21 Batch   80/269 - Train Accuracy: 0.9901, Validation Accuracy: 0.9812, Loss: 0.0091
Epoch  21 Batch   90/269 - Train Accuracy: 0.9864, Validation Accuracy: 0.9728, Loss: 0.0084
Epoch  21 Batch  100/269 - Train Accuracy: 0.9861, Validation Accuracy: 0.9739, Loss: 0.0175
Epoch  21 Batch  110/269 - Train Accuracy: 0.9907, Validation Accuracy: 0.9724, Loss: 0.0085
Epoch  21 Batch  120/269 - Train Accuracy: 0.9887, Validation Accuracy: 0.9712, Loss: 0.0206
Epoch  21 Batch  130/269 - Train Accuracy: 0.9948, Validation Accuracy: 0.9814, Loss: 0.0124
Epoch  21 Batch  140/269 - Train Accuracy: 0.9899, Validation Accuracy: 0.9833, Loss: 0.0114
Epoch  21 Batch  150/269 - Train Accuracy: 0.9874, Validation Accuracy: 0.9823, Loss: 0.0122
Epoch  21 Batch  160/269 - Train Accuracy: 0.9916, Validation Accuracy: 0.9792, Loss: 0.0070
Epoch  21 Batch  170/269 - Train Accuracy: 0.9899, Validation Accuracy: 0.9829, Loss: 0.0117
Epoch  21 Batch  180/269 - Train Accuracy: 0.9909, Validation Accuracy: 0.9771, Loss: 0.0080
Epoch  21 Batch  190/269 - Train Accuracy: 0.9886, Validation Accuracy: 0.9745, Loss: 0.0070
Epoch  21 Batch  200/269 - Train Accuracy: 0.9943, Validation Accuracy: 0.9750, Loss: 0.0100
Epoch  21 Batch  210/269 - Train Accuracy: 0.9950, Validation Accuracy: 0.9740, Loss: 0.0097
Epoch  21 Batch  220/269 - Train Accuracy: 0.9933, Validation Accuracy: 0.9772, Loss: 0.0088
Epoch  21 Batch  230/269 - Train Accuracy: 0.9954, Validation Accuracy: 0.9822, Loss: 0.0082
Epoch  21 Batch  240/269 - Train Accuracy: 0.9932, Validation Accuracy: 0.9800, Loss: 0.0078
Epoch  21 Batch  250/269 - Train Accuracy: 0.9915, Validation Accuracy: 0.9798, Loss: 0.0101
Epoch  21 Batch  260/269 - Train Accuracy: 0.9923, Validation Accuracy: 0.9883, Loss: 0.0112
Epoch  22 Batch   10/269 - Train Accuracy: 0.9917, Validation Accuracy: 0.9792, Loss: 0.0071
Epoch  22 Batch   20/269 - Train Accuracy: 0.9936, Validation Accuracy: 0.9809, Loss: 0.0074
Epoch  22 Batch   30/269 - Train Accuracy: 0.9903, Validation Accuracy: 0.9785, Loss: 0.0083
Epoch  22 Batch   40/269 - Train Accuracy: 0.9905, Validation Accuracy: 0.9821, Loss: 0.0083
Epoch  22 Batch   50/269 - Train Accuracy: 0.9910, Validation Accuracy: 0.9820, Loss: 0.0081
Epoch  22 Batch   60/269 - Train Accuracy: 0.9858, Validation Accuracy: 0.9857, Loss: 0.0092
Epoch  22 Batch   70/269 - Train Accuracy: 0.9912, Validation Accuracy: 0.9810, Loss: 0.0093
Epoch  22 Batch   80/269 - Train Accuracy: 0.9896, Validation Accuracy: 0.9785, Loss: 0.0101
Epoch  22 Batch   90/269 - Train Accuracy: 0.9889, Validation Accuracy: 0.9776, Loss: 0.0120
Epoch  22 Batch  100/269 - Train Accuracy: 0.9904, Validation Accuracy: 0.9799, Loss: 0.0101
Epoch  22 Batch  110/269 - Train Accuracy: 0.9972, Validation Accuracy: 0.9767, Loss: 0.0081
Epoch  22 Batch  120/269 - Train Accuracy: 0.9854, Validation Accuracy: 0.9796, Loss: 0.0123
Epoch  22 Batch  130/269 - Train Accuracy: 0.9912, Validation Accuracy: 0.9759, Loss: 0.0114
Epoch  22 Batch  140/269 - Train Accuracy: 0.9900, Validation Accuracy: 0.9802, Loss: 0.0097
Epoch  22 Batch  150/269 - Train Accuracy: 0.9852, Validation Accuracy: 0.9795, Loss: 0.0100
Epoch  22 Batch  160/269 - Train Accuracy: 0.9913, Validation Accuracy: 0.9813, Loss: 0.0073
Epoch  22 Batch  170/269 - Train Accuracy: 0.9895, Validation Accuracy: 0.9732, Loss: 0.0083
Epoch  22 Batch  180/269 - Train Accuracy: 0.9825, Validation Accuracy: 0.9796, Loss: 0.0091
Epoch  22 Batch  190/269 - Train Accuracy: 0.9914, Validation Accuracy: 0.9745, Loss: 0.0095
Epoch  22 Batch  200/269 - Train Accuracy: 0.9944, Validation Accuracy: 0.9767, Loss: 0.0079
Epoch  22 Batch  210/269 - Train Accuracy: 0.9942, Validation Accuracy: 0.9701, Loss: 0.0073
Epoch  22 Batch  220/269 - Train Accuracy: 0.9885, Validation Accuracy: 0.9754, Loss: 0.0086
Epoch  22 Batch  230/269 - Train Accuracy: 0.9857, Validation Accuracy: 0.9797, Loss: 0.0088
Epoch  22 Batch  240/269 - Train Accuracy: 0.9879, Validation Accuracy: 0.9720, Loss: 0.0080
Epoch  22 Batch  250/269 - Train Accuracy: 0.9927, Validation Accuracy: 0.9805, Loss: 0.0087
Epoch  22 Batch  260/269 - Train Accuracy: 0.9927, Validation Accuracy: 0.9809, Loss: 0.0075
Epoch  23 Batch   10/269 - Train Accuracy: 0.9931, Validation Accuracy: 0.9768, Loss: 0.0070
Epoch  23 Batch   20/269 - Train Accuracy: 0.9892, Validation Accuracy: 0.9801, Loss: 0.0083
Epoch  23 Batch   30/269 - Train Accuracy: 0.9904, Validation Accuracy: 0.9778, Loss: 0.0092
Epoch  23 Batch   40/269 - Train Accuracy: 0.9939, Validation Accuracy: 0.9727, Loss: 0.0085
Epoch  23 Batch   50/269 - Train Accuracy: 0.9915, Validation Accuracy: 0.9817, Loss: 0.0096
Epoch  23 Batch   60/269 - Train Accuracy: 0.9922, Validation Accuracy: 0.9828, Loss: 0.0093
Epoch  23 Batch   70/269 - Train Accuracy: 0.9904, Validation Accuracy: 0.9785, Loss: 0.0076
Epoch  23 Batch   80/269 - Train Accuracy: 0.9867, Validation Accuracy: 0.9808, Loss: 0.0085
Epoch  23 Batch   90/269 - Train Accuracy: 0.9917, Validation Accuracy: 0.9780, Loss: 0.0071
Epoch  23 Batch  100/269 - Train Accuracy: 0.9944, Validation Accuracy: 0.9729, Loss: 0.0059
Epoch  23 Batch  110/269 - Train Accuracy: 0.9974, Validation Accuracy: 0.9733, Loss: 0.0055
Epoch  23 Batch  120/269 - Train Accuracy: 0.9862, Validation Accuracy: 0.9780, Loss: 0.0117
Epoch  23 Batch  130/269 - Train Accuracy: 0.9913, Validation Accuracy: 0.9801, Loss: 0.0108
Epoch  23 Batch  140/269 - Train Accuracy: 0.9869, Validation Accuracy: 0.9825, Loss: 0.0083
Epoch  23 Batch  150/269 - Train Accuracy: 0.9906, Validation Accuracy: 0.9833, Loss: 0.0099
Epoch  23 Batch  160/269 - Train Accuracy: 0.9950, Validation Accuracy: 0.9830, Loss: 0.0059
Epoch  23 Batch  170/269 - Train Accuracy: 0.9930, Validation Accuracy: 0.9823, Loss: 0.0093
Epoch  23 Batch  180/269 - Train Accuracy: 0.9913, Validation Accuracy: 0.9842, Loss: 0.0078
Epoch  23 Batch  190/269 - Train Accuracy: 0.9893, Validation Accuracy: 0.9733, Loss: 0.0092
Epoch  23 Batch  200/269 - Train Accuracy: 0.9932, Validation Accuracy: 0.9786, Loss: 0.0084
Epoch  23 Batch  210/269 - Train Accuracy: 0.9941, Validation Accuracy: 0.9744, Loss: 0.0063
Epoch  23 Batch  220/269 - Train Accuracy: 0.9917, Validation Accuracy: 0.9752, Loss: 0.0078
Epoch  23 Batch  230/269 - Train Accuracy: 0.9949, Validation Accuracy: 0.9773, Loss: 0.0062
Epoch  23 Batch  240/269 - Train Accuracy: 0.9887, Validation Accuracy: 0.9813, Loss: 0.0097
Epoch  23 Batch  250/269 - Train Accuracy: 0.9950, Validation Accuracy: 0.9789, Loss: 0.0079
Epoch  23 Batch  260/269 - Train Accuracy: 0.9954, Validation Accuracy: 0.9746, Loss: 0.0091
Epoch  24 Batch   10/269 - Train Accuracy: 0.9871, Validation Accuracy: 0.9748, Loss: 0.0068
Epoch  24 Batch   20/269 - Train Accuracy: 0.9922, Validation Accuracy: 0.9811, Loss: 0.0071
Epoch  24 Batch   30/269 - Train Accuracy: 0.9897, Validation Accuracy: 0.9793, Loss: 0.0093
Epoch  24 Batch   40/269 - Train Accuracy: 0.9921, Validation Accuracy: 0.9760, Loss: 0.0095
Epoch  24 Batch   50/269 - Train Accuracy: 0.9907, Validation Accuracy: 0.9854, Loss: 0.0108
Epoch  24 Batch   60/269 - Train Accuracy: 0.9907, Validation Accuracy: 0.9826, Loss: 0.0088
Epoch  24 Batch   70/269 - Train Accuracy: 0.9884, Validation Accuracy: 0.9814, Loss: 0.0084
Epoch  24 Batch   80/269 - Train Accuracy: 0.9938, Validation Accuracy: 0.9795, Loss: 0.0078
Epoch  24 Batch   90/269 - Train Accuracy: 0.9899, Validation Accuracy: 0.9814, Loss: 0.0082
Epoch  24 Batch  100/269 - Train Accuracy: 0.9916, Validation Accuracy: 0.9753, Loss: 0.0087
Epoch  24 Batch  110/269 - Train Accuracy: 0.9961, Validation Accuracy: 0.9772, Loss: 0.0039
Epoch  24 Batch  120/269 - Train Accuracy: 0.9919, Validation Accuracy: 0.9759, Loss: 0.0110
Epoch  24 Batch  130/269 - Train Accuracy: 0.9941, Validation Accuracy: 0.9808, Loss: 0.0105
Epoch  24 Batch  140/269 - Train Accuracy: 0.9912, Validation Accuracy: 0.9842, Loss: 0.0085
Epoch  24 Batch  150/269 - Train Accuracy: 0.9899, Validation Accuracy: 0.9798, Loss: 0.0101
Epoch  24 Batch  160/269 - Train Accuracy: 0.9944, Validation Accuracy: 0.9808, Loss: 0.0073
Epoch  24 Batch  170/269 - Train Accuracy: 0.9908, Validation Accuracy: 0.9797, Loss: 0.0080
Epoch  24 Batch  180/269 - Train Accuracy: 0.9917, Validation Accuracy: 0.9780, Loss: 0.0068
Epoch  24 Batch  190/269 - Train Accuracy: 0.9910, Validation Accuracy: 0.9706, Loss: 0.0089
Epoch  24 Batch  200/269 - Train Accuracy: 0.9953, Validation Accuracy: 0.9750, Loss: 0.0066
Epoch  24 Batch  210/269 - Train Accuracy: 0.9943, Validation Accuracy: 0.9722, Loss: 0.0068
Epoch  24 Batch  220/269 - Train Accuracy: 0.9929, Validation Accuracy: 0.9776, Loss: 0.0069
Epoch  24 Batch  230/269 - Train Accuracy: 0.9903, Validation Accuracy: 0.9763, Loss: 0.0070
Epoch  24 Batch  240/269 - Train Accuracy: 0.9911, Validation Accuracy: 0.9798, Loss: 0.0078
Epoch  24 Batch  250/269 - Train Accuracy: 0.9905, Validation Accuracy: 0.9819, Loss: 0.0089
Epoch  24 Batch  260/269 - Train Accuracy: 0.9914, Validation Accuracy: 0.9774, Loss: 0.0068
Epoch  25 Batch   10/269 - Train Accuracy: 0.9916, Validation Accuracy: 0.9739, Loss: 0.0067
Epoch  25 Batch   20/269 - Train Accuracy: 0.9945, Validation Accuracy: 0.9784, Loss: 0.0067
Epoch  25 Batch   30/269 - Train Accuracy: 0.9874, Validation Accuracy: 0.9782, Loss: 0.0087
Epoch  25 Batch   40/269 - Train Accuracy: 0.9943, Validation Accuracy: 0.9791, Loss: 0.0098
Epoch  25 Batch   50/269 - Train Accuracy: 0.9926, Validation Accuracy: 0.9815, Loss: 0.0092
Epoch  25 Batch   60/269 - Train Accuracy: 0.9921, Validation Accuracy: 0.9811, Loss: 0.0085
Epoch  25 Batch   70/269 - Train Accuracy: 0.9919, Validation Accuracy: 0.9793, Loss: 0.0094
Epoch  25 Batch   80/269 - Train Accuracy: 0.9944, Validation Accuracy: 0.9802, Loss: 0.0088
Epoch  25 Batch   90/269 - Train Accuracy: 0.9883, Validation Accuracy: 0.9775, Loss: 0.0092
Epoch  25 Batch  100/269 - Train Accuracy: 0.9935, Validation Accuracy: 0.9789, Loss: 0.0066
Epoch  25 Batch  110/269 - Train Accuracy: 0.9965, Validation Accuracy: 0.9775, Loss: 0.0059
Epoch  25 Batch  120/269 - Train Accuracy: 0.9835, Validation Accuracy: 0.9760, Loss: 0.0096
Epoch  25 Batch  130/269 - Train Accuracy: 0.9923, Validation Accuracy: 0.9790, Loss: 0.0108
Epoch  25 Batch  140/269 - Train Accuracy: 0.9838, Validation Accuracy: 0.9840, Loss: 0.0116
Epoch  25 Batch  150/269 - Train Accuracy: 0.9940, Validation Accuracy: 0.9813, Loss: 0.0094
Epoch  25 Batch  160/269 - Train Accuracy: 0.9940, Validation Accuracy: 0.9758, Loss: 0.0077
Epoch  25 Batch  170/269 - Train Accuracy: 0.9906, Validation Accuracy: 0.9823, Loss: 0.0082
Epoch  25 Batch  180/269 - Train Accuracy: 0.9874, Validation Accuracy: 0.9791, Loss: 0.0090
Epoch  25 Batch  190/269 - Train Accuracy: 0.9918, Validation Accuracy: 0.9747, Loss: 0.0067
Epoch  25 Batch  200/269 - Train Accuracy: 0.9954, Validation Accuracy: 0.9789, Loss: 0.0078
Epoch  25 Batch  210/269 - Train Accuracy: 0.9931, Validation Accuracy: 0.9822, Loss: 0.0082
Epoch  25 Batch  220/269 - Train Accuracy: 0.9899, Validation Accuracy: 0.9776, Loss: 0.0062
Epoch  25 Batch  230/269 - Train Accuracy: 0.9966, Validation Accuracy: 0.9839, Loss: 0.0069
Epoch  25 Batch  240/269 - Train Accuracy: 0.9945, Validation Accuracy: 0.9850, Loss: 0.0088
Epoch  25 Batch  250/269 - Train Accuracy: 0.9918, Validation Accuracy: 0.9828, Loss: 0.0064
Epoch  25 Batch  260/269 - Train Accuracy: 0.9888, Validation Accuracy: 0.9835, Loss: 0.0077
Epoch  26 Batch   10/269 - Train Accuracy: 0.9936, Validation Accuracy: 0.9820, Loss: 0.0089
Epoch  26 Batch   20/269 - Train Accuracy: 0.9900, Validation Accuracy: 0.9771, Loss: 0.0070
Epoch  26 Batch   30/269 - Train Accuracy: 0.9929, Validation Accuracy: 0.9862, Loss: 0.0082
Epoch  26 Batch   40/269 - Train Accuracy: 0.9923, Validation Accuracy: 0.9827, Loss: 0.0078
Epoch  26 Batch   50/269 - Train Accuracy: 0.9899, Validation Accuracy: 0.9857, Loss: 0.0090
Epoch  26 Batch   60/269 - Train Accuracy: 0.9893, Validation Accuracy: 0.9841, Loss: 0.0107
Epoch  26 Batch   70/269 - Train Accuracy: 0.9927, Validation Accuracy: 0.9775, Loss: 0.0091
Epoch  26 Batch   80/269 - Train Accuracy: 0.9901, Validation Accuracy: 0.9812, Loss: 0.0074
Epoch  26 Batch   90/269 - Train Accuracy: 0.9853, Validation Accuracy: 0.9837, Loss: 0.0084
Epoch  26 Batch  100/269 - Train Accuracy: 0.9934, Validation Accuracy: 0.9863, Loss: 0.0087
Epoch  26 Batch  110/269 - Train Accuracy: 0.9908, Validation Accuracy: 0.9831, Loss: 0.0061
Epoch  26 Batch  120/269 - Train Accuracy: 0.9874, Validation Accuracy: 0.9743, Loss: 0.0091
Epoch  26 Batch  130/269 - Train Accuracy: 0.9948, Validation Accuracy: 0.9837, Loss: 0.0124
Epoch  26 Batch  140/269 - Train Accuracy: 0.9892, Validation Accuracy: 0.9877, Loss: 0.0093
Epoch  26 Batch  150/269 - Train Accuracy: 0.9925, Validation Accuracy: 0.9778, Loss: 0.0099
Epoch  26 Batch  160/269 - Train Accuracy: 0.9932, Validation Accuracy: 0.9850, Loss: 0.0066
Epoch  26 Batch  170/269 - Train Accuracy: 0.9903, Validation Accuracy: 0.9855, Loss: 0.0068
Epoch  26 Batch  180/269 - Train Accuracy: 0.9943, Validation Accuracy: 0.9762, Loss: 0.0076
Epoch  26 Batch  190/269 - Train Accuracy: 0.9931, Validation Accuracy: 0.9786, Loss: 0.0063
Epoch  26 Batch  200/269 - Train Accuracy: 0.9966, Validation Accuracy: 0.9802, Loss: 0.0056
Epoch  26 Batch  210/269 - Train Accuracy: 0.9960, Validation Accuracy: 0.9681, Loss: 0.0069
Epoch  26 Batch  220/269 - Train Accuracy: 0.9918, Validation Accuracy: 0.9761, Loss: 0.0085
Epoch  26 Batch  230/269 - Train Accuracy: 0.9939, Validation Accuracy: 0.9812, Loss: 0.0062
Epoch  26 Batch  240/269 - Train Accuracy: 0.9914, Validation Accuracy: 0.9813, Loss: 0.0077
Epoch  26 Batch  250/269 - Train Accuracy: 0.9925, Validation Accuracy: 0.9785, Loss: 0.0088
Epoch  26 Batch  260/269 - Train Accuracy: 0.9948, Validation Accuracy: 0.9747, Loss: 0.0107
Epoch  27 Batch   10/269 - Train Accuracy: 0.9918, Validation Accuracy: 0.9790, Loss: 0.0081
Epoch  27 Batch   20/269 - Train Accuracy: 0.9921, Validation Accuracy: 0.9845, Loss: 0.0079
Epoch  27 Batch   30/269 - Train Accuracy: 0.9879, Validation Accuracy: 0.9835, Loss: 0.0080
Epoch  27 Batch   40/269 - Train Accuracy: 0.9936, Validation Accuracy: 0.9823, Loss: 0.0083
Epoch  27 Batch   50/269 - Train Accuracy: 0.9924, Validation Accuracy: 0.9836, Loss: 0.0077
Epoch  27 Batch   60/269 - Train Accuracy: 0.9922, Validation Accuracy: 0.9834, Loss: 0.0094
Epoch  27 Batch   70/269 - Train Accuracy: 0.9920, Validation Accuracy: 0.9834, Loss: 0.0071
Epoch  27 Batch   80/269 - Train Accuracy: 0.9937, Validation Accuracy: 0.9818, Loss: 0.0078
Epoch  27 Batch   90/269 - Train Accuracy: 0.9949, Validation Accuracy: 0.9808, Loss: 0.0073
Epoch  27 Batch  100/269 - Train Accuracy: 0.9919, Validation Accuracy: 0.9835, Loss: 0.0067
Epoch  27 Batch  110/269 - Train Accuracy: 0.9971, Validation Accuracy: 0.9798, Loss: 0.0053
Epoch  27 Batch  120/269 - Train Accuracy: 0.9883, Validation Accuracy: 0.9753, Loss: 0.0085
Epoch  27 Batch  130/269 - Train Accuracy: 0.9944, Validation Accuracy: 0.9831, Loss: 0.0106
Epoch  27 Batch  140/269 - Train Accuracy: 0.9878, Validation Accuracy: 0.9835, Loss: 0.0098
Epoch  27 Batch  150/269 - Train Accuracy: 0.9906, Validation Accuracy: 0.9793, Loss: 0.0088
Epoch  27 Batch  160/269 - Train Accuracy: 0.9949, Validation Accuracy: 0.9800, Loss: 0.0047
Epoch  27 Batch  170/269 - Train Accuracy: 0.9909, Validation Accuracy: 0.9800, Loss: 0.0076
Epoch  27 Batch  180/269 - Train Accuracy: 0.9938, Validation Accuracy: 0.9766, Loss: 0.0055
Epoch  27 Batch  190/269 - Train Accuracy: 0.9967, Validation Accuracy: 0.9797, Loss: 0.0069
Epoch  27 Batch  200/269 - Train Accuracy: 0.9964, Validation Accuracy: 0.9750, Loss: 0.0064
Epoch  27 Batch  210/269 - Train Accuracy: 0.9965, Validation Accuracy: 0.9729, Loss: 0.0048
Epoch  27 Batch  220/269 - Train Accuracy: 0.9923, Validation Accuracy: 0.9781, Loss: 0.0068
Epoch  27 Batch  230/269 - Train Accuracy: 0.9931, Validation Accuracy: 0.9787, Loss: 0.0071
Epoch  27 Batch  240/269 - Train Accuracy: 0.9930, Validation Accuracy: 0.9788, Loss: 0.0076
Epoch  27 Batch  250/269 - Train Accuracy: 0.9926, Validation Accuracy: 0.9798, Loss: 0.0079
Epoch  27 Batch  260/269 - Train Accuracy: 0.9914, Validation Accuracy: 0.9756, Loss: 0.0077
Epoch  28 Batch   10/269 - Train Accuracy: 0.9931, Validation Accuracy: 0.9777, Loss: 0.0047
Epoch  28 Batch   20/269 - Train Accuracy: 0.9950, Validation Accuracy: 0.9796, Loss: 0.0065
Epoch  28 Batch   30/269 - Train Accuracy: 0.9915, Validation Accuracy: 0.9814, Loss: 0.0090
Epoch  28 Batch   40/269 - Train Accuracy: 0.9941, Validation Accuracy: 0.9817, Loss: 0.0065
Epoch  28 Batch   50/269 - Train Accuracy: 0.9990, Validation Accuracy: 0.9848, Loss: 0.0077
Epoch  28 Batch   60/269 - Train Accuracy: 0.9909, Validation Accuracy: 0.9840, Loss: 0.0087
Epoch  28 Batch   70/269 - Train Accuracy: 0.9947, Validation Accuracy: 0.9810, Loss: 0.0057
Epoch  28 Batch   80/269 - Train Accuracy: 0.9913, Validation Accuracy: 0.9855, Loss: 0.0079
Epoch  28 Batch   90/269 - Train Accuracy: 0.9955, Validation Accuracy: 0.9789, Loss: 0.0069
Epoch  28 Batch  100/269 - Train Accuracy: 0.9941, Validation Accuracy: 0.9782, Loss: 0.0085
Epoch  28 Batch  110/269 - Train Accuracy: 0.9972, Validation Accuracy: 0.9752, Loss: 0.0049
Epoch  28 Batch  120/269 - Train Accuracy: 0.9902, Validation Accuracy: 0.9815, Loss: 0.0090
Epoch  28 Batch  130/269 - Train Accuracy: 0.9886, Validation Accuracy: 0.9788, Loss: 0.0130
Epoch  28 Batch  140/269 - Train Accuracy: 0.9908, Validation Accuracy: 0.9819, Loss: 0.0084
Epoch  28 Batch  150/269 - Train Accuracy: 0.9923, Validation Accuracy: 0.9778, Loss: 0.0100
Epoch  28 Batch  160/269 - Train Accuracy: 0.9917, Validation Accuracy: 0.9804, Loss: 0.0058
Epoch  28 Batch  170/269 - Train Accuracy: 0.9945, Validation Accuracy: 0.9782, Loss: 0.0075
Epoch  28 Batch  180/269 - Train Accuracy: 0.9948, Validation Accuracy: 0.9770, Loss: 0.0067
Epoch  28 Batch  190/269 - Train Accuracy: 0.9969, Validation Accuracy: 0.9756, Loss: 0.0051
Epoch  28 Batch  200/269 - Train Accuracy: 0.9968, Validation Accuracy: 0.9790, Loss: 0.0044
Epoch  28 Batch  210/269 - Train Accuracy: 0.9970, Validation Accuracy: 0.9772, Loss: 0.0043
Epoch  28 Batch  220/269 - Train Accuracy: 0.9952, Validation Accuracy: 0.9803, Loss: 0.0059
Epoch  28 Batch  230/269 - Train Accuracy: 0.9923, Validation Accuracy: 0.9803, Loss: 0.0040
Epoch  28 Batch  240/269 - Train Accuracy: 0.9935, Validation Accuracy: 0.9774, Loss: 0.0077
Epoch  28 Batch  250/269 - Train Accuracy: 0.9934, Validation Accuracy: 0.9814, Loss: 0.0070
Epoch  28 Batch  260/269 - Train Accuracy: 0.9931, Validation Accuracy: 0.9812, Loss: 0.0065
Epoch  29 Batch   10/269 - Train Accuracy: 0.9954, Validation Accuracy: 0.9823, Loss: 0.0070
Epoch  29 Batch   20/269 - Train Accuracy: 0.9966, Validation Accuracy: 0.9867, Loss: 0.0069
Epoch  29 Batch   30/269 - Train Accuracy: 0.9907, Validation Accuracy: 0.9812, Loss: 0.0079
Epoch  29 Batch   40/269 - Train Accuracy: 0.9966, Validation Accuracy: 0.9802, Loss: 0.0065
Epoch  29 Batch   50/269 - Train Accuracy: 0.9964, Validation Accuracy: 0.9812, Loss: 0.0083
Epoch  29 Batch   60/269 - Train Accuracy: 0.9869, Validation Accuracy: 0.9860, Loss: 0.0082
Epoch  29 Batch   70/269 - Train Accuracy: 0.9922, Validation Accuracy: 0.9790, Loss: 0.0055
Epoch  29 Batch   80/269 - Train Accuracy: 0.9928, Validation Accuracy: 0.9769, Loss: 0.0048
Epoch  29 Batch   90/269 - Train Accuracy: 0.9954, Validation Accuracy: 0.9827, Loss: 0.0057
Epoch  29 Batch  100/269 - Train Accuracy: 0.9981, Validation Accuracy: 0.9776, Loss: 0.0055
Epoch  29 Batch  110/269 - Train Accuracy: 0.9955, Validation Accuracy: 0.9813, Loss: 0.0044
Epoch  29 Batch  120/269 - Train Accuracy: 0.9932, Validation Accuracy: 0.9764, Loss: 0.0068
Epoch  29 Batch  130/269 - Train Accuracy: 0.9940, Validation Accuracy: 0.9764, Loss: 0.0090
Epoch  29 Batch  140/269 - Train Accuracy: 0.9948, Validation Accuracy: 0.9775, Loss: 0.0064
Epoch  29 Batch  150/269 - Train Accuracy: 0.9894, Validation Accuracy: 0.9757, Loss: 0.0090
Epoch  29 Batch  160/269 - Train Accuracy: 0.9969, Validation Accuracy: 0.9759, Loss: 0.0065
Epoch  29 Batch  170/269 - Train Accuracy: 0.9929, Validation Accuracy: 0.9792, Loss: 0.0054
Epoch  29 Batch  180/269 - Train Accuracy: 0.9925, Validation Accuracy: 0.9828, Loss: 0.0065
Epoch  29 Batch  190/269 - Train Accuracy: 0.9951, Validation Accuracy: 0.9800, Loss: 0.0061
Epoch  29 Batch  200/269 - Train Accuracy: 0.9962, Validation Accuracy: 0.9793, Loss: 0.0056
Epoch  29 Batch  210/269 - Train Accuracy: 0.9945, Validation Accuracy: 0.9801, Loss: 0.0058
Epoch  29 Batch  220/269 - Train Accuracy: 0.9988, Validation Accuracy: 0.9806, Loss: 0.0055
Epoch  29 Batch  230/269 - Train Accuracy: 0.9938, Validation Accuracy: 0.9796, Loss: 0.0039
Epoch  29 Batch  240/269 - Train Accuracy: 0.9986, Validation Accuracy: 0.9845, Loss: 0.0058
Epoch  29 Batch  250/269 - Train Accuracy: 0.9944, Validation Accuracy: 0.9845, Loss: 0.0063
Epoch  29 Batch  260/269 - Train Accuracy: 0.9953, Validation Accuracy: 0.9818, Loss: 0.0067
Epoch  30 Batch   10/269 - Train Accuracy: 0.9934, Validation Accuracy: 0.9824, Loss: 0.0065
Epoch  30 Batch   20/269 - Train Accuracy: 0.9961, Validation Accuracy: 0.9798, Loss: 0.0070
Epoch  30 Batch   30/269 - Train Accuracy: 0.9917, Validation Accuracy: 0.9811, Loss: 0.0083
Epoch  30 Batch   40/269 - Train Accuracy: 0.9921, Validation Accuracy: 0.9808, Loss: 0.0092
Epoch  30 Batch   50/269 - Train Accuracy: 0.9949, Validation Accuracy: 0.9832, Loss: 0.0091
Epoch  30 Batch   60/269 - Train Accuracy: 0.9942, Validation Accuracy: 0.9840, Loss: 0.0086
Epoch  30 Batch   70/269 - Train Accuracy: 0.9917, Validation Accuracy: 0.9864, Loss: 0.0062
Epoch  30 Batch   80/269 - Train Accuracy: 0.9967, Validation Accuracy: 0.9828, Loss: 0.0064
Epoch  30 Batch   90/269 - Train Accuracy: 0.9872, Validation Accuracy: 0.9833, Loss: 0.0067
Epoch  30 Batch  100/269 - Train Accuracy: 0.9947, Validation Accuracy: 0.9798, Loss: 0.0079
Epoch  30 Batch  110/269 - Train Accuracy: 0.9989, Validation Accuracy: 0.9807, Loss: 0.0061
Epoch  30 Batch  120/269 - Train Accuracy: 0.9858, Validation Accuracy: 0.9838, Loss: 0.0120
Epoch  30 Batch  130/269 - Train Accuracy: 0.9952, Validation Accuracy: 0.9835, Loss: 0.0093
Epoch  30 Batch  140/269 - Train Accuracy: 0.9922, Validation Accuracy: 0.9875, Loss: 0.0073
Epoch  30 Batch  150/269 - Train Accuracy: 0.9932, Validation Accuracy: 0.9832, Loss: 0.0088
Epoch  30 Batch  160/269 - Train Accuracy: 0.9939, Validation Accuracy: 0.9780, Loss: 0.0063
Epoch  30 Batch  170/269 - Train Accuracy: 0.9908, Validation Accuracy: 0.9749, Loss: 0.0078
Epoch  30 Batch  180/269 - Train Accuracy: 0.9935, Validation Accuracy: 0.9809, Loss: 0.0062
Epoch  30 Batch  190/269 - Train Accuracy: 0.9947, Validation Accuracy: 0.9768, Loss: 0.0069
Epoch  30 Batch  200/269 - Train Accuracy: 0.9979, Validation Accuracy: 0.9798, Loss: 0.0038
Epoch  30 Batch  210/269 - Train Accuracy: 0.9973, Validation Accuracy: 0.9788, Loss: 0.0056
Epoch  30 Batch  220/269 - Train Accuracy: 0.9945, Validation Accuracy: 0.9811, Loss: 0.0073
Epoch  30 Batch  230/269 - Train Accuracy: 0.9938, Validation Accuracy: 0.9806, Loss: 0.0070
Epoch  30 Batch  240/269 - Train Accuracy: 0.9927, Validation Accuracy: 0.9864, Loss: 0.0074
Epoch  30 Batch  250/269 - Train Accuracy: 0.9962, Validation Accuracy: 0.9852, Loss: 0.0077
Epoch  30 Batch  260/269 - Train Accuracy: 0.9959, Validation Accuracy: 0.9819, Loss: 0.0059
Epoch  31 Batch   10/269 - Train Accuracy: 0.9964, Validation Accuracy: 0.9831, Loss: 0.0050
Epoch  31 Batch   20/269 - Train Accuracy: 0.9951, Validation Accuracy: 0.9802, Loss: 0.0058
Epoch  31 Batch   30/269 - Train Accuracy: 0.9896, Validation Accuracy: 0.9814, Loss: 0.0079
Epoch  31 Batch   40/269 - Train Accuracy: 0.9958, Validation Accuracy: 0.9797, Loss: 0.0061
Epoch  31 Batch   50/269 - Train Accuracy: 0.9956, Validation Accuracy: 0.9837, Loss: 0.0072
Epoch  31 Batch   60/269 - Train Accuracy: 0.9925, Validation Accuracy: 0.9822, Loss: 0.0066
Epoch  31 Batch   70/269 - Train Accuracy: 0.9964, Validation Accuracy: 0.9874, Loss: 0.0066
Epoch  31 Batch   80/269 - Train Accuracy: 0.9953, Validation Accuracy: 0.9884, Loss: 0.0057
Epoch  31 Batch   90/269 - Train Accuracy: 0.9950, Validation Accuracy: 0.9808, Loss: 0.0066
Epoch  31 Batch  100/269 - Train Accuracy: 0.9946, Validation Accuracy: 0.9773, Loss: 0.0085
Epoch  31 Batch  110/269 - Train Accuracy: 0.9962, Validation Accuracy: 0.9813, Loss: 0.0059
Epoch  31 Batch  120/269 - Train Accuracy: 0.9885, Validation Accuracy: 0.9820, Loss: 0.0087
Epoch  31 Batch  130/269 - Train Accuracy: 0.9938, Validation Accuracy: 0.9824, Loss: 0.0117
Epoch  31 Batch  140/269 - Train Accuracy: 0.9864, Validation Accuracy: 0.9814, Loss: 0.0106
Epoch  31 Batch  150/269 - Train Accuracy: 0.9900, Validation Accuracy: 0.9758, Loss: 0.0080
Epoch  31 Batch  160/269 - Train Accuracy: 0.9949, Validation Accuracy: 0.9828, Loss: 0.0067
Epoch  31 Batch  170/269 - Train Accuracy: 0.9896, Validation Accuracy: 0.9829, Loss: 0.0091
Epoch  31 Batch  180/269 - Train Accuracy: 0.9919, Validation Accuracy: 0.9823, Loss: 0.0050
Epoch  31 Batch  190/269 - Train Accuracy: 0.9948, Validation Accuracy: 0.9793, Loss: 0.0087
Epoch  31 Batch  200/269 - Train Accuracy: 0.9956, Validation Accuracy: 0.9792, Loss: 0.0049
Epoch  31 Batch  210/269 - Train Accuracy: 0.9952, Validation Accuracy: 0.9804, Loss: 0.0048
Epoch  31 Batch  220/269 - Train Accuracy: 0.9966, Validation Accuracy: 0.9795, Loss: 0.0052
Epoch  31 Batch  230/269 - Train Accuracy: 0.9912, Validation Accuracy: 0.9814, Loss: 0.0076
Epoch  31 Batch  240/269 - Train Accuracy: 0.9901, Validation Accuracy: 0.9794, Loss: 0.0052
Epoch  31 Batch  250/269 - Train Accuracy: 0.9926, Validation Accuracy: 0.9857, Loss: 0.0062
Epoch  31 Batch  260/269 - Train Accuracy: 0.9961, Validation Accuracy: 0.9813, Loss: 0.0063
Epoch  32 Batch   10/269 - Train Accuracy: 0.9953, Validation Accuracy: 0.9797, Loss: 0.0075
Epoch  32 Batch   20/269 - Train Accuracy: 0.9923, Validation Accuracy: 0.9832, Loss: 0.0082
Epoch  32 Batch   30/269 - Train Accuracy: 0.9917, Validation Accuracy: 0.9829, Loss: 0.0057
Epoch  32 Batch   40/269 - Train Accuracy: 0.9942, Validation Accuracy: 0.9777, Loss: 0.0075
Epoch  32 Batch   50/269 - Train Accuracy: 0.9912, Validation Accuracy: 0.9831, Loss: 0.0086
Epoch  32 Batch   60/269 - Train Accuracy: 0.9942, Validation Accuracy: 0.9807, Loss: 0.0075
Epoch  32 Batch   70/269 - Train Accuracy: 0.9949, Validation Accuracy: 0.9794, Loss: 0.0083
Epoch  32 Batch   80/269 - Train Accuracy: 0.9927, Validation Accuracy: 0.9816, Loss: 0.0070
Epoch  32 Batch   90/269 - Train Accuracy: 0.9896, Validation Accuracy: 0.9796, Loss: 0.0055
Epoch  32 Batch  100/269 - Train Accuracy: 0.9942, Validation Accuracy: 0.9768, Loss: 0.0068
Epoch  32 Batch  110/269 - Train Accuracy: 0.9985, Validation Accuracy: 0.9811, Loss: 0.0047
Epoch  32 Batch  120/269 - Train Accuracy: 0.9913, Validation Accuracy: 0.9825, Loss: 0.0054
Epoch  32 Batch  130/269 - Train Accuracy: 0.9971, Validation Accuracy: 0.9809, Loss: 0.0082
Epoch  32 Batch  140/269 - Train Accuracy: 0.9942, Validation Accuracy: 0.9869, Loss: 0.0065
Epoch  32 Batch  150/269 - Train Accuracy: 0.9940, Validation Accuracy: 0.9820, Loss: 0.0074
Epoch  32 Batch  160/269 - Train Accuracy: 0.9945, Validation Accuracy: 0.9798, Loss: 0.0066
Epoch  32 Batch  170/269 - Train Accuracy: 0.9943, Validation Accuracy: 0.9781, Loss: 0.0065
Epoch  32 Batch  180/269 - Train Accuracy: 0.9950, Validation Accuracy: 0.9777, Loss: 0.0070
Epoch  32 Batch  190/269 - Train Accuracy: 0.9967, Validation Accuracy: 0.9819, Loss: 0.0046
Epoch  32 Batch  200/269 - Train Accuracy: 0.9978, Validation Accuracy: 0.9818, Loss: 0.0053
Epoch  32 Batch  210/269 - Train Accuracy: 0.9971, Validation Accuracy: 0.9810, Loss: 0.0052
Epoch  32 Batch  220/269 - Train Accuracy: 0.9958, Validation Accuracy: 0.9808, Loss: 0.0053
Epoch  32 Batch  230/269 - Train Accuracy: 0.9991, Validation Accuracy: 0.9765, Loss: 0.0046
Epoch  32 Batch  240/269 - Train Accuracy: 0.9925, Validation Accuracy: 0.9841, Loss: 0.0084
Epoch  32 Batch  250/269 - Train Accuracy: 0.9961, Validation Accuracy: 0.9798, Loss: 0.0058
Epoch  32 Batch  260/269 - Train Accuracy: 0.9945, Validation Accuracy: 0.9769, Loss: 0.0097
Epoch  33 Batch   10/269 - Train Accuracy: 0.9960, Validation Accuracy: 0.9784, Loss: 0.0056
Epoch  33 Batch   20/269 - Train Accuracy: 0.9960, Validation Accuracy: 0.9795, Loss: 0.0051
Epoch  33 Batch   30/269 - Train Accuracy: 0.9951, Validation Accuracy: 0.9769, Loss: 0.0051
Epoch  33 Batch   40/269 - Train Accuracy: 0.9937, Validation Accuracy: 0.9821, Loss: 0.0062
Epoch  33 Batch   50/269 - Train Accuracy: 0.9942, Validation Accuracy: 0.9827, Loss: 0.0069
Epoch  33 Batch   60/269 - Train Accuracy: 0.9936, Validation Accuracy: 0.9832, Loss: 0.0061
Epoch  33 Batch   70/269 - Train Accuracy: 0.9940, Validation Accuracy: 0.9814, Loss: 0.0059
Epoch  33 Batch   80/269 - Train Accuracy: 0.9920, Validation Accuracy: 0.9853, Loss: 0.0054
Epoch  33 Batch   90/269 - Train Accuracy: 0.9908, Validation Accuracy: 0.9836, Loss: 0.0057
Epoch  33 Batch  100/269 - Train Accuracy: 0.9977, Validation Accuracy: 0.9829, Loss: 0.0052
Epoch  33 Batch  110/269 - Train Accuracy: 0.9941, Validation Accuracy: 0.9841, Loss: 0.0058
Epoch  33 Batch  120/269 - Train Accuracy: 0.9921, Validation Accuracy: 0.9824, Loss: 0.0074
Epoch  33 Batch  130/269 - Train Accuracy: 0.9924, Validation Accuracy: 0.9828, Loss: 0.0095
Epoch  33 Batch  140/269 - Train Accuracy: 0.9921, Validation Accuracy: 0.9808, Loss: 0.0070
Epoch  33 Batch  150/269 - Train Accuracy: 0.9883, Validation Accuracy: 0.9822, Loss: 0.0083
Epoch  33 Batch  160/269 - Train Accuracy: 0.9967, Validation Accuracy: 0.9819, Loss: 0.0063
Epoch  33 Batch  170/269 - Train Accuracy: 0.9950, Validation Accuracy: 0.9767, Loss: 0.0082
Epoch  33 Batch  180/269 - Train Accuracy: 0.9950, Validation Accuracy: 0.9816, Loss: 0.0055
Epoch  33 Batch  190/269 - Train Accuracy: 0.9961, Validation Accuracy: 0.9830, Loss: 0.0054
Epoch  33 Batch  200/269 - Train Accuracy: 0.9984, Validation Accuracy: 0.9817, Loss: 0.0048
Epoch  33 Batch  210/269 - Train Accuracy: 0.9952, Validation Accuracy: 0.9790, Loss: 0.0072
Epoch  33 Batch  220/269 - Train Accuracy: 0.9951, Validation Accuracy: 0.9814, Loss: 0.0054
Epoch  33 Batch  230/269 - Train Accuracy: 0.9942, Validation Accuracy: 0.9782, Loss: 0.0063
Epoch  33 Batch  240/269 - Train Accuracy: 0.9956, Validation Accuracy: 0.9823, Loss: 0.0061
Epoch  33 Batch  250/269 - Train Accuracy: 0.9970, Validation Accuracy: 0.9857, Loss: 0.0053
Epoch  33 Batch  260/269 - Train Accuracy: 0.9934, Validation Accuracy: 0.9814, Loss: 0.0084
Epoch  34 Batch   10/269 - Train Accuracy: 0.9938, Validation Accuracy: 0.9764, Loss: 0.0063
Epoch  34 Batch   20/269 - Train Accuracy: 0.9979, Validation Accuracy: 0.9805, Loss: 0.0055
Epoch  34 Batch   30/269 - Train Accuracy: 0.9946, Validation Accuracy: 0.9849, Loss: 0.0061
Epoch  34 Batch   40/269 - Train Accuracy: 0.9955, Validation Accuracy: 0.9826, Loss: 0.0047
Epoch  34 Batch   50/269 - Train Accuracy: 0.9943, Validation Accuracy: 0.9798, Loss: 0.0057
Epoch  34 Batch   60/269 - Train Accuracy: 0.9962, Validation Accuracy: 0.9852, Loss: 0.0068
Epoch  34 Batch   70/269 - Train Accuracy: 0.9971, Validation Accuracy: 0.9834, Loss: 0.0075
Epoch  34 Batch   80/269 - Train Accuracy: 0.9937, Validation Accuracy: 0.9810, Loss: 0.0051
Epoch  34 Batch   90/269 - Train Accuracy: 0.9948, Validation Accuracy: 0.9777, Loss: 0.0067
Epoch  34 Batch  100/269 - Train Accuracy: 0.9950, Validation Accuracy: 0.9809, Loss: 0.0047
Epoch  34 Batch  110/269 - Train Accuracy: 0.9987, Validation Accuracy: 0.9790, Loss: 0.0032
Epoch  34 Batch  120/269 - Train Accuracy: 0.9949, Validation Accuracy: 0.9768, Loss: 0.0081
Epoch  34 Batch  130/269 - Train Accuracy: 0.9941, Validation Accuracy: 0.9841, Loss: 0.0088
Epoch  34 Batch  140/269 - Train Accuracy: 0.9940, Validation Accuracy: 0.9804, Loss: 0.0073
Epoch  34 Batch  150/269 - Train Accuracy: 0.9969, Validation Accuracy: 0.9784, Loss: 0.0068
Epoch  34 Batch  160/269 - Train Accuracy: 0.9931, Validation Accuracy: 0.9824, Loss: 0.0060
Epoch  34 Batch  170/269 - Train Accuracy: 0.9970, Validation Accuracy: 0.9847, Loss: 0.0077
Epoch  34 Batch  180/269 - Train Accuracy: 0.9954, Validation Accuracy: 0.9794, Loss: 0.0082
Epoch  34 Batch  190/269 - Train Accuracy: 0.9940, Validation Accuracy: 0.9821, Loss: 0.0052
Epoch  34 Batch  200/269 - Train Accuracy: 0.9977, Validation Accuracy: 0.9807, Loss: 0.0031
Epoch  34 Batch  210/269 - Train Accuracy: 0.9968, Validation Accuracy: 0.9773, Loss: 0.0046
Epoch  34 Batch  220/269 - Train Accuracy: 0.9932, Validation Accuracy: 0.9800, Loss: 0.0058
Epoch  34 Batch  230/269 - Train Accuracy: 0.9934, Validation Accuracy: 0.9837, Loss: 0.0054
Epoch  34 Batch  240/269 - Train Accuracy: 0.9931, Validation Accuracy: 0.9873, Loss: 0.0062
Epoch  34 Batch  250/269 - Train Accuracy: 0.9913, Validation Accuracy: 0.9825, Loss: 0.0058
Epoch  34 Batch  260/269 - Train Accuracy: 0.9967, Validation Accuracy: 0.9849, Loss: 0.0081
Epoch  35 Batch   10/269 - Train Accuracy: 0.9956, Validation Accuracy: 0.9809, Loss: 0.0075
Epoch  35 Batch   20/269 - Train Accuracy: 0.9975, Validation Accuracy: 0.9819, Loss: 0.0049
Epoch  35 Batch   30/269 - Train Accuracy: 0.9947, Validation Accuracy: 0.9820, Loss: 0.0070
Epoch  35 Batch   40/269 - Train Accuracy: 0.9960, Validation Accuracy: 0.9830, Loss: 0.0098
Epoch  35 Batch   50/269 - Train Accuracy: 0.9928, Validation Accuracy: 0.9850, Loss: 0.0060
Epoch  35 Batch   60/269 - Train Accuracy: 0.9933, Validation Accuracy: 0.9862, Loss: 0.0070
Epoch  35 Batch   70/269 - Train Accuracy: 0.9953, Validation Accuracy: 0.9860, Loss: 0.0065
Epoch  35 Batch   80/269 - Train Accuracy: 0.9968, Validation Accuracy: 0.9836, Loss: 0.0039
Epoch  35 Batch   90/269 - Train Accuracy: 0.9939, Validation Accuracy: 0.9868, Loss: 0.0048
Epoch  35 Batch  100/269 - Train Accuracy: 0.9934, Validation Accuracy: 0.9835, Loss: 0.0058
Epoch  35 Batch  110/269 - Train Accuracy: 0.9962, Validation Accuracy: 0.9868, Loss: 0.0047
Epoch  35 Batch  120/269 - Train Accuracy: 0.9931, Validation Accuracy: 0.9833, Loss: 0.0067
Epoch  35 Batch  130/269 - Train Accuracy: 0.9938, Validation Accuracy: 0.9898, Loss: 0.0077
Epoch  35 Batch  140/269 - Train Accuracy: 0.9952, Validation Accuracy: 0.9830, Loss: 0.0071
Epoch  35 Batch  150/269 - Train Accuracy: 0.9941, Validation Accuracy: 0.9817, Loss: 0.0062
Epoch  35 Batch  160/269 - Train Accuracy: 0.9975, Validation Accuracy: 0.9807, Loss: 0.0046
Epoch  35 Batch  170/269 - Train Accuracy: 0.9952, Validation Accuracy: 0.9807, Loss: 0.0062
Epoch  35 Batch  180/269 - Train Accuracy: 0.9982, Validation Accuracy: 0.9838, Loss: 0.0055
Epoch  35 Batch  190/269 - Train Accuracy: 0.9976, Validation Accuracy: 0.9813, Loss: 0.0056
Epoch  35 Batch  200/269 - Train Accuracy: 0.9969, Validation Accuracy: 0.9811, Loss: 0.0056
Epoch  35 Batch  210/269 - Train Accuracy: 0.9955, Validation Accuracy: 0.9751, Loss: 0.0048
Epoch  35 Batch  220/269 - Train Accuracy: 0.9925, Validation Accuracy: 0.9833, Loss: 0.0077
Epoch  35 Batch  230/269 - Train Accuracy: 0.9971, Validation Accuracy: 0.9786, Loss: 0.0047
Epoch  35 Batch  240/269 - Train Accuracy: 0.9948, Validation Accuracy: 0.9826, Loss: 0.0075
Epoch  35 Batch  250/269 - Train Accuracy: 0.9938, Validation Accuracy: 0.9784, Loss: 0.0062
Epoch  35 Batch  260/269 - Train Accuracy: 0.9966, Validation Accuracy: 0.9768, Loss: 0.0045
Epoch  36 Batch   10/269 - Train Accuracy: 0.9940, Validation Accuracy: 0.9767, Loss: 0.0061
Epoch  36 Batch   20/269 - Train Accuracy: 0.9979, Validation Accuracy: 0.9782, Loss: 0.0107
Epoch  36 Batch   30/269 - Train Accuracy: 0.9917, Validation Accuracy: 0.9815, Loss: 0.0084
Epoch  36 Batch   40/269 - Train Accuracy: 0.9915, Validation Accuracy: 0.9811, Loss: 0.0107
Epoch  36 Batch   50/269 - Train Accuracy: 0.9941, Validation Accuracy: 0.9808, Loss: 0.0083
Epoch  36 Batch   60/269 - Train Accuracy: 0.9920, Validation Accuracy: 0.9821, Loss: 0.0089
Epoch  36 Batch   70/269 - Train Accuracy: 0.9927, Validation Accuracy: 0.9809, Loss: 0.0074
Epoch  36 Batch   80/269 - Train Accuracy: 0.9949, Validation Accuracy: 0.9827, Loss: 0.0051
Epoch  36 Batch   90/269 - Train Accuracy: 0.9917, Validation Accuracy: 0.9854, Loss: 0.0080
Epoch  36 Batch  100/269 - Train Accuracy: 0.9947, Validation Accuracy: 0.9816, Loss: 0.0069
Epoch  36 Batch  110/269 - Train Accuracy: 0.9954, Validation Accuracy: 0.9818, Loss: 0.0037
Epoch  36 Batch  120/269 - Train Accuracy: 0.9934, Validation Accuracy: 0.9745, Loss: 0.0074
Epoch  36 Batch  130/269 - Train Accuracy: 0.9929, Validation Accuracy: 0.9820, Loss: 0.0097
Epoch  36 Batch  140/269 - Train Accuracy: 0.9918, Validation Accuracy: 0.9819, Loss: 0.0065
Epoch  36 Batch  150/269 - Train Accuracy: 0.9906, Validation Accuracy: 0.9800, Loss: 0.0084
Epoch  36 Batch  160/269 - Train Accuracy: 0.9949, Validation Accuracy: 0.9806, Loss: 0.0067
Epoch  36 Batch  170/269 - Train Accuracy: 0.9953, Validation Accuracy: 0.9815, Loss: 0.0050
Epoch  36 Batch  180/269 - Train Accuracy: 0.9950, Validation Accuracy: 0.9832, Loss: 0.0044
Epoch  36 Batch  190/269 - Train Accuracy: 0.9953, Validation Accuracy: 0.9863, Loss: 0.0064
Epoch  36 Batch  200/269 - Train Accuracy: 0.9953, Validation Accuracy: 0.9794, Loss: 0.0043
Epoch  36 Batch  210/269 - Train Accuracy: 0.9963, Validation Accuracy: 0.9763, Loss: 0.0052
Epoch  36 Batch  220/269 - Train Accuracy: 0.9951, Validation Accuracy: 0.9807, Loss: 0.0043
Epoch  36 Batch  230/269 - Train Accuracy: 0.9927, Validation Accuracy: 0.9816, Loss: 0.0042
Epoch  36 Batch  240/269 - Train Accuracy: 0.9943, Validation Accuracy: 0.9862, Loss: 0.0044
Epoch  36 Batch  250/269 - Train Accuracy: 0.9960, Validation Accuracy: 0.9854, Loss: 0.0045
Epoch  36 Batch  260/269 - Train Accuracy: 0.9944, Validation Accuracy: 0.9842, Loss: 0.0058
Epoch  37 Batch   10/269 - Train Accuracy: 0.9951, Validation Accuracy: 0.9865, Loss: 0.0048
Epoch  37 Batch   20/269 - Train Accuracy: 0.9973, Validation Accuracy: 0.9838, Loss: 0.0043
Epoch  37 Batch   30/269 - Train Accuracy: 0.9880, Validation Accuracy: 0.9812, Loss: 0.0079
Epoch  37 Batch   40/269 - Train Accuracy: 0.9977, Validation Accuracy: 0.9849, Loss: 0.0053
Epoch  37 Batch   50/269 - Train Accuracy: 0.9945, Validation Accuracy: 0.9848, Loss: 0.0064
Epoch  37 Batch   60/269 - Train Accuracy: 0.9941, Validation Accuracy: 0.9828, Loss: 0.0062
Epoch  37 Batch   70/269 - Train Accuracy: 0.9968, Validation Accuracy: 0.9812, Loss: 0.0054
Epoch  37 Batch   80/269 - Train Accuracy: 0.9950, Validation Accuracy: 0.9848, Loss: 0.0052
Epoch  37 Batch   90/269 - Train Accuracy: 0.9953, Validation Accuracy: 0.9866, Loss: 0.0053
Epoch  37 Batch  100/269 - Train Accuracy: 0.9940, Validation Accuracy: 0.9861, Loss: 0.0045
Epoch  37 Batch  110/269 - Train Accuracy: 0.9957, Validation Accuracy: 0.9833, Loss: 0.0039
Epoch  37 Batch  120/269 - Train Accuracy: 0.9983, Validation Accuracy: 0.9813, Loss: 0.0054
Epoch  37 Batch  130/269 - Train Accuracy: 0.9951, Validation Accuracy: 0.9854, Loss: 0.0056
Epoch  37 Batch  140/269 - Train Accuracy: 0.9933, Validation Accuracy: 0.9841, Loss: 0.0056
Epoch  37 Batch  150/269 - Train Accuracy: 0.9953, Validation Accuracy: 0.9822, Loss: 0.0052
Epoch  37 Batch  160/269 - Train Accuracy: 0.9948, Validation Accuracy: 0.9847, Loss: 0.0060
Epoch  37 Batch  170/269 - Train Accuracy: 0.9952, Validation Accuracy: 0.9852, Loss: 0.0053
Epoch  37 Batch  180/269 - Train Accuracy: 0.9981, Validation Accuracy: 0.9820, Loss: 0.0046
Epoch  37 Batch  190/269 - Train Accuracy: 0.9957, Validation Accuracy: 0.9862, Loss: 0.0067
Epoch  37 Batch  200/269 - Train Accuracy: 0.9969, Validation Accuracy: 0.9768, Loss: 0.0029
Epoch  37 Batch  210/269 - Train Accuracy: 0.9968, Validation Accuracy: 0.9751, Loss: 0.0053
Epoch  37 Batch  220/269 - Train Accuracy: 0.9980, Validation Accuracy: 0.9811, Loss: 0.0053
Epoch  37 Batch  230/269 - Train Accuracy: 0.9943, Validation Accuracy: 0.9805, Loss: 0.0048
Epoch  37 Batch  240/269 - Train Accuracy: 0.9953, Validation Accuracy: 0.9834, Loss: 0.0062
Epoch  37 Batch  250/269 - Train Accuracy: 0.9975, Validation Accuracy: 0.9829, Loss: 0.0061
Epoch  37 Batch  260/269 - Train Accuracy: 0.9974, Validation Accuracy: 0.9854, Loss: 0.0049
Epoch  38 Batch   10/269 - Train Accuracy: 0.9989, Validation Accuracy: 0.9862, Loss: 0.0028
Epoch  38 Batch   20/269 - Train Accuracy: 0.9987, Validation Accuracy: 0.9862, Loss: 0.0038
Epoch  38 Batch   30/269 - Train Accuracy: 0.9943, Validation Accuracy: 0.9859, Loss: 0.0056
Epoch  38 Batch   40/269 - Train Accuracy: 0.9967, Validation Accuracy: 0.9822, Loss: 0.0053
Epoch  38 Batch   50/269 - Train Accuracy: 0.9966, Validation Accuracy: 0.9862, Loss: 0.0047
Epoch  38 Batch   60/269 - Train Accuracy: 0.9952, Validation Accuracy: 0.9857, Loss: 0.0038
Epoch  38 Batch   70/269 - Train Accuracy: 0.9971, Validation Accuracy: 0.9861, Loss: 0.0053
Epoch  38 Batch   80/269 - Train Accuracy: 0.9967, Validation Accuracy: 0.9847, Loss: 0.0060
Epoch  38 Batch   90/269 - Train Accuracy: 0.9978, Validation Accuracy: 0.9843, Loss: 0.0049
Epoch  38 Batch  100/269 - Train Accuracy: 0.9958, Validation Accuracy: 0.9811, Loss: 0.0062
Epoch  38 Batch  110/269 - Train Accuracy: 0.9974, Validation Accuracy: 0.9846, Loss: 0.0025
Epoch  38 Batch  120/269 - Train Accuracy: 0.9924, Validation Accuracy: 0.9797, Loss: 0.0055
Epoch  38 Batch  130/269 - Train Accuracy: 0.9969, Validation Accuracy: 0.9808, Loss: 0.0079
Epoch  38 Batch  140/269 - Train Accuracy: 0.9964, Validation Accuracy: 0.9838, Loss: 0.0049
Epoch  38 Batch  150/269 - Train Accuracy: 0.9941, Validation Accuracy: 0.9843, Loss: 0.0072
Epoch  38 Batch  160/269 - Train Accuracy: 0.9948, Validation Accuracy: 0.9814, Loss: 0.0061
Epoch  38 Batch  170/269 - Train Accuracy: 0.9946, Validation Accuracy: 0.9812, Loss: 0.0061
Epoch  38 Batch  180/269 - Train Accuracy: 0.9957, Validation Accuracy: 0.9832, Loss: 0.0052
Epoch  38 Batch  190/269 - Train Accuracy: 0.9970, Validation Accuracy: 0.9836, Loss: 0.0053
Epoch  38 Batch  200/269 - Train Accuracy: 0.9981, Validation Accuracy: 0.9791, Loss: 0.0042
Epoch  38 Batch  210/269 - Train Accuracy: 0.9964, Validation Accuracy: 0.9782, Loss: 0.0042
Epoch  38 Batch  220/269 - Train Accuracy: 0.9941, Validation Accuracy: 0.9814, Loss: 0.0054
Epoch  38 Batch  230/269 - Train Accuracy: 0.9973, Validation Accuracy: 0.9788, Loss: 0.0042
Epoch  38 Batch  240/269 - Train Accuracy: 0.9959, Validation Accuracy: 0.9835, Loss: 0.0039
Epoch  38 Batch  250/269 - Train Accuracy: 0.9971, Validation Accuracy: 0.9821, Loss: 0.0037
Epoch  38 Batch  260/269 - Train Accuracy: 0.9992, Validation Accuracy: 0.9819, Loss: 0.0051
Epoch  39 Batch   10/269 - Train Accuracy: 0.9977, Validation Accuracy: 0.9788, Loss: 0.0043
Epoch  39 Batch   20/269 - Train Accuracy: 0.9976, Validation Accuracy: 0.9817, Loss: 0.0049
Epoch  39 Batch   30/269 - Train Accuracy: 0.9927, Validation Accuracy: 0.9825, Loss: 0.0059
Epoch  39 Batch   40/269 - Train Accuracy: 0.9967, Validation Accuracy: 0.9821, Loss: 0.0041
Epoch  39 Batch   50/269 - Train Accuracy: 0.9949, Validation Accuracy: 0.9852, Loss: 0.0069
Epoch  39 Batch   60/269 - Train Accuracy: 0.9939, Validation Accuracy: 0.9854, Loss: 0.0079
Epoch  39 Batch   70/269 - Train Accuracy: 0.9953, Validation Accuracy: 0.9838, Loss: 0.0059
Epoch  39 Batch   80/269 - Train Accuracy: 0.9959, Validation Accuracy: 0.9804, Loss: 0.0054
Epoch  39 Batch   90/269 - Train Accuracy: 0.9953, Validation Accuracy: 0.9788, Loss: 0.0067
Epoch  39 Batch  100/269 - Train Accuracy: 0.9927, Validation Accuracy: 0.9771, Loss: 0.0077
Epoch  39 Batch  110/269 - Train Accuracy: 1.0000, Validation Accuracy: 0.9822, Loss: 0.0042
Epoch  39 Batch  120/269 - Train Accuracy: 0.9951, Validation Accuracy: 0.9799, Loss: 0.0059
Epoch  39 Batch  130/269 - Train Accuracy: 0.9942, Validation Accuracy: 0.9836, Loss: 0.0086
Epoch  39 Batch  140/269 - Train Accuracy: 0.9959, Validation Accuracy: 0.9751, Loss: 0.0072
Epoch  39 Batch  150/269 - Train Accuracy: 0.9952, Validation Accuracy: 0.9772, Loss: 0.0086
Epoch  39 Batch  160/269 - Train Accuracy: 0.9961, Validation Accuracy: 0.9791, Loss: 0.0051
Epoch  39 Batch  170/269 - Train Accuracy: 0.9971, Validation Accuracy: 0.9787, Loss: 0.0075
Epoch  39 Batch  180/269 - Train Accuracy: 0.9983, Validation Accuracy: 0.9789, Loss: 0.0027
Epoch  39 Batch  190/269 - Train Accuracy: 0.9948, Validation Accuracy: 0.9759, Loss: 0.0048
Epoch  39 Batch  200/269 - Train Accuracy: 0.9978, Validation Accuracy: 0.9767, Loss: 0.0064
Epoch  39 Batch  210/269 - Train Accuracy: 0.9971, Validation Accuracy: 0.9753, Loss: 0.0070
Epoch  39 Batch  220/269 - Train Accuracy: 0.9964, Validation Accuracy: 0.9756, Loss: 0.0057
Epoch  39 Batch  230/269 - Train Accuracy: 0.9977, Validation Accuracy: 0.9825, Loss: 0.0065
Epoch  39 Batch  240/269 - Train Accuracy: 0.9958, Validation Accuracy: 0.9777, Loss: 0.0077
Epoch  39 Batch  250/269 - Train Accuracy: 0.9981, Validation Accuracy: 0.9799, Loss: 0.0124
Epoch  39 Batch  260/269 - Train Accuracy: 0.9957, Validation Accuracy: 0.9814, Loss: 0.0070
Epoch  40 Batch   10/269 - Train Accuracy: 0.9979, Validation Accuracy: 0.9851, Loss: 0.0081
Epoch  40 Batch   20/269 - Train Accuracy: 0.9939, Validation Accuracy: 0.9862, Loss: 0.0088
Epoch  40 Batch   30/269 - Train Accuracy: 0.9956, Validation Accuracy: 0.9822, Loss: 0.0088
Epoch  40 Batch   40/269 - Train Accuracy: 0.9968, Validation Accuracy: 0.9866, Loss: 0.0089
Epoch  40 Batch   50/269 - Train Accuracy: 0.9944, Validation Accuracy: 0.9816, Loss: 0.0086
Epoch  40 Batch   60/269 - Train Accuracy: 0.9921, Validation Accuracy: 0.9885, Loss: 0.0100
Epoch  40 Batch   70/269 - Train Accuracy: 0.9947, Validation Accuracy: 0.9816, Loss: 0.0068
Epoch  40 Batch   80/269 - Train Accuracy: 0.9946, Validation Accuracy: 0.9846, Loss: 0.0051
Epoch  40 Batch   90/269 - Train Accuracy: 0.9972, Validation Accuracy: 0.9796, Loss: 0.0056
Epoch  40 Batch  100/269 - Train Accuracy: 0.9938, Validation Accuracy: 0.9808, Loss: 0.0050
Epoch  40 Batch  110/269 - Train Accuracy: 0.9980, Validation Accuracy: 0.9749, Loss: 0.0056
Epoch  40 Batch  120/269 - Train Accuracy: 0.9893, Validation Accuracy: 0.9789, Loss: 0.0093
Epoch  40 Batch  130/269 - Train Accuracy: 0.9952, Validation Accuracy: 0.9823, Loss: 0.0093
Epoch  40 Batch  140/269 - Train Accuracy: 0.9941, Validation Accuracy: 0.9812, Loss: 0.0058
Epoch  40 Batch  150/269 - Train Accuracy: 0.9933, Validation Accuracy: 0.9834, Loss: 0.0050
Epoch  40 Batch  160/269 - Train Accuracy: 0.9984, Validation Accuracy: 0.9806, Loss: 0.0050
Epoch  40 Batch  170/269 - Train Accuracy: 0.9963, Validation Accuracy: 0.9828, Loss: 0.0074
Epoch  40 Batch  180/269 - Train Accuracy: 0.9968, Validation Accuracy: 0.9815, Loss: 0.0061
Epoch  40 Batch  190/269 - Train Accuracy: 0.9959, Validation Accuracy: 0.9820, Loss: 0.0064
Epoch  40 Batch  200/269 - Train Accuracy: 0.9979, Validation Accuracy: 0.9830, Loss: 0.0040
Epoch  40 Batch  210/269 - Train Accuracy: 0.9934, Validation Accuracy: 0.9788, Loss: 0.0054
Epoch  40 Batch  220/269 - Train Accuracy: 0.9958, Validation Accuracy: 0.9807, Loss: 0.0058
Epoch  40 Batch  230/269 - Train Accuracy: 0.9961, Validation Accuracy: 0.9822, Loss: 0.0068
Epoch  40 Batch  240/269 - Train Accuracy: 0.9939, Validation Accuracy: 0.9817, Loss: 0.0057
Epoch  40 Batch  250/269 - Train Accuracy: 0.9955, Validation Accuracy: 0.9813, Loss: 0.0067
Epoch  40 Batch  260/269 - Train Accuracy: 0.9917, Validation Accuracy: 0.9833, Loss: 0.0071
Epoch  41 Batch   10/269 - Train Accuracy: 0.9965, Validation Accuracy: 0.9827, Loss: 0.0056
Epoch  41 Batch   20/269 - Train Accuracy: 0.9981, Validation Accuracy: 0.9775, Loss: 0.0056
Epoch  41 Batch   30/269 - Train Accuracy: 0.9982, Validation Accuracy: 0.9814, Loss: 0.0051
Epoch  41 Batch   40/269 - Train Accuracy: 0.9995, Validation Accuracy: 0.9823, Loss: 0.0050
Epoch  41 Batch   50/269 - Train Accuracy: 0.9972, Validation Accuracy: 0.9790, Loss: 0.0055
Epoch  41 Batch   60/269 - Train Accuracy: 0.9932, Validation Accuracy: 0.9802, Loss: 0.0047
Epoch  41 Batch   70/269 - Train Accuracy: 0.9980, Validation Accuracy: 0.9854, Loss: 0.0051
Epoch  41 Batch   80/269 - Train Accuracy: 0.9968, Validation Accuracy: 0.9837, Loss: 0.0056
Epoch  41 Batch   90/269 - Train Accuracy: 0.9983, Validation Accuracy: 0.9851, Loss: 0.0050
Epoch  41 Batch  100/269 - Train Accuracy: 0.9940, Validation Accuracy: 0.9824, Loss: 0.0060
Epoch  41 Batch  110/269 - Train Accuracy: 0.9999, Validation Accuracy: 0.9768, Loss: 0.0046
Epoch  41 Batch  120/269 - Train Accuracy: 0.9920, Validation Accuracy: 0.9859, Loss: 0.0086
Epoch  41 Batch  130/269 - Train Accuracy: 0.9936, Validation Accuracy: 0.9806, Loss: 0.0083
Epoch  41 Batch  140/269 - Train Accuracy: 0.9933, Validation Accuracy: 0.9751, Loss: 0.0056
Epoch  41 Batch  150/269 - Train Accuracy: 0.9933, Validation Accuracy: 0.9788, Loss: 0.0061
Epoch  41 Batch  160/269 - Train Accuracy: 0.9977, Validation Accuracy: 0.9821, Loss: 0.0040
Epoch  41 Batch  170/269 - Train Accuracy: 0.9954, Validation Accuracy: 0.9826, Loss: 0.0080
Epoch  41 Batch  180/269 - Train Accuracy: 0.9953, Validation Accuracy: 0.9806, Loss: 0.0052
Epoch  41 Batch  190/269 - Train Accuracy: 0.9940, Validation Accuracy: 0.9798, Loss: 0.0051
Epoch  41 Batch  200/269 - Train Accuracy: 0.9982, Validation Accuracy: 0.9831, Loss: 0.0043
Epoch  41 Batch  210/269 - Train Accuracy: 0.9965, Validation Accuracy: 0.9807, Loss: 0.0045
Epoch  41 Batch  220/269 - Train Accuracy: 0.9985, Validation Accuracy: 0.9775, Loss: 0.0052
Epoch  41 Batch  230/269 - Train Accuracy: 0.9993, Validation Accuracy: 0.9801, Loss: 0.0043
Epoch  41 Batch  240/269 - Train Accuracy: 0.9943, Validation Accuracy: 0.9810, Loss: 0.0046
Epoch  41 Batch  250/269 - Train Accuracy: 0.9981, Validation Accuracy: 0.9833, Loss: 0.0045
Epoch  41 Batch  260/269 - Train Accuracy: 0.9953, Validation Accuracy: 0.9832, Loss: 0.0052
Epoch  42 Batch   10/269 - Train Accuracy: 0.9942, Validation Accuracy: 0.9854, Loss: 0.0044
Epoch  42 Batch   20/269 - Train Accuracy: 0.9985, Validation Accuracy: 0.9804, Loss: 0.0031
Epoch  42 Batch   30/269 - Train Accuracy: 0.9942, Validation Accuracy: 0.9831, Loss: 0.0056
Epoch  42 Batch   40/269 - Train Accuracy: 0.9978, Validation Accuracy: 0.9814, Loss: 0.0044
Epoch  42 Batch   50/269 - Train Accuracy: 0.9969, Validation Accuracy: 0.9820, Loss: 0.0073
Epoch  42 Batch   60/269 - Train Accuracy: 0.9961, Validation Accuracy: 0.9817, Loss: 0.0046
Epoch  42 Batch   70/269 - Train Accuracy: 0.9965, Validation Accuracy: 0.9841, Loss: 0.0058
Epoch  42 Batch   80/269 - Train Accuracy: 0.9997, Validation Accuracy: 0.9826, Loss: 0.0041
Epoch  42 Batch   90/269 - Train Accuracy: 0.9959, Validation Accuracy: 0.9825, Loss: 0.0046
Epoch  42 Batch  100/269 - Train Accuracy: 0.9960, Validation Accuracy: 0.9820, Loss: 0.0053
Epoch  42 Batch  110/269 - Train Accuracy: 0.9957, Validation Accuracy: 0.9838, Loss: 0.0031
Epoch  42 Batch  120/269 - Train Accuracy: 0.9963, Validation Accuracy: 0.9791, Loss: 0.0055
Epoch  42 Batch  130/269 - Train Accuracy: 0.9979, Validation Accuracy: 0.9821, Loss: 0.0073
Epoch  42 Batch  140/269 - Train Accuracy: 0.9961, Validation Accuracy: 0.9819, Loss: 0.0053
Epoch  42 Batch  150/269 - Train Accuracy: 0.9919, Validation Accuracy: 0.9820, Loss: 0.0081
Epoch  42 Batch  160/269 - Train Accuracy: 0.9963, Validation Accuracy: 0.9822, Loss: 0.0050
Epoch  42 Batch  170/269 - Train Accuracy: 0.9947, Validation Accuracy: 0.9814, Loss: 0.0067
Epoch  42 Batch  180/269 - Train Accuracy: 0.9970, Validation Accuracy: 0.9805, Loss: 0.0036
Epoch  42 Batch  190/269 - Train Accuracy: 0.9951, Validation Accuracy: 0.9799, Loss: 0.0040
Epoch  42 Batch  200/269 - Train Accuracy: 0.9973, Validation Accuracy: 0.9764, Loss: 0.0035
Epoch  42 Batch  210/269 - Train Accuracy: 0.9974, Validation Accuracy: 0.9787, Loss: 0.0027
Epoch  42 Batch  220/269 - Train Accuracy: 0.9966, Validation Accuracy: 0.9797, Loss: 0.0045
Epoch  42 Batch  230/269 - Train Accuracy: 0.9947, Validation Accuracy: 0.9830, Loss: 0.0043
Epoch  42 Batch  240/269 - Train Accuracy: 0.9948, Validation Accuracy: 0.9853, Loss: 0.0058
Epoch  42 Batch  250/269 - Train Accuracy: 0.9978, Validation Accuracy: 0.9875, Loss: 0.0052
Epoch  42 Batch  260/269 - Train Accuracy: 0.9968, Validation Accuracy: 0.9862, Loss: 0.0050
Epoch  43 Batch   10/269 - Train Accuracy: 0.9986, Validation Accuracy: 0.9843, Loss: 0.0042
Epoch  43 Batch   20/269 - Train Accuracy: 0.9975, Validation Accuracy: 0.9826, Loss: 0.0051
Epoch  43 Batch   30/269 - Train Accuracy: 0.9966, Validation Accuracy: 0.9844, Loss: 0.0044
Epoch  43 Batch   40/269 - Train Accuracy: 0.9970, Validation Accuracy: 0.9821, Loss: 0.0058
Epoch  43 Batch   50/269 - Train Accuracy: 0.9973, Validation Accuracy: 0.9805, Loss: 0.0040
Epoch  43 Batch   60/269 - Train Accuracy: 0.9945, Validation Accuracy: 0.9818, Loss: 0.0055
Epoch  43 Batch   70/269 - Train Accuracy: 0.9995, Validation Accuracy: 0.9793, Loss: 0.0047
Epoch  43 Batch   80/269 - Train Accuracy: 0.9993, Validation Accuracy: 0.9782, Loss: 0.0035
Epoch  43 Batch   90/269 - Train Accuracy: 0.9971, Validation Accuracy: 0.9783, Loss: 0.0061
Epoch  43 Batch  100/269 - Train Accuracy: 0.9967, Validation Accuracy: 0.9766, Loss: 0.0060
Epoch  43 Batch  110/269 - Train Accuracy: 0.9995, Validation Accuracy: 0.9830, Loss: 0.0044
Epoch  43 Batch  120/269 - Train Accuracy: 0.9906, Validation Accuracy: 0.9831, Loss: 0.0065
Epoch  43 Batch  130/269 - Train Accuracy: 0.9958, Validation Accuracy: 0.9791, Loss: 0.0061
Epoch  43 Batch  140/269 - Train Accuracy: 0.9951, Validation Accuracy: 0.9824, Loss: 0.0038
Epoch  43 Batch  150/269 - Train Accuracy: 0.9966, Validation Accuracy: 0.9807, Loss: 0.0063
Epoch  43 Batch  160/269 - Train Accuracy: 0.9967, Validation Accuracy: 0.9830, Loss: 0.0049
Epoch  43 Batch  170/269 - Train Accuracy: 0.9971, Validation Accuracy: 0.9834, Loss: 0.0046
Epoch  43 Batch  180/269 - Train Accuracy: 0.9981, Validation Accuracy: 0.9827, Loss: 0.0043
Epoch  43 Batch  190/269 - Train Accuracy: 0.9982, Validation Accuracy: 0.9824, Loss: 0.0035
Epoch  43 Batch  200/269 - Train Accuracy: 0.9960, Validation Accuracy: 0.9806, Loss: 0.0024
Epoch  43 Batch  210/269 - Train Accuracy: 0.9992, Validation Accuracy: 0.9790, Loss: 0.0054
Epoch  43 Batch  220/269 - Train Accuracy: 0.9975, Validation Accuracy: 0.9773, Loss: 0.0041
Epoch  43 Batch  230/269 - Train Accuracy: 0.9984, Validation Accuracy: 0.9798, Loss: 0.0020
Epoch  43 Batch  240/269 - Train Accuracy: 0.9972, Validation Accuracy: 0.9802, Loss: 0.0041
Epoch  43 Batch  250/269 - Train Accuracy: 0.9995, Validation Accuracy: 0.9790, Loss: 0.0038
Epoch  43 Batch  260/269 - Train Accuracy: 0.9938, Validation Accuracy: 0.9820, Loss: 0.0066
Epoch  44 Batch   10/269 - Train Accuracy: 0.9986, Validation Accuracy: 0.9828, Loss: 0.0034
Epoch  44 Batch   20/269 - Train Accuracy: 0.9976, Validation Accuracy: 0.9830, Loss: 0.0047
Epoch  44 Batch   30/269 - Train Accuracy: 0.9972, Validation Accuracy: 0.9831, Loss: 0.0062
Epoch  44 Batch   40/269 - Train Accuracy: 0.9995, Validation Accuracy: 0.9822, Loss: 0.0065
Epoch  44 Batch   50/269 - Train Accuracy: 0.9983, Validation Accuracy: 0.9846, Loss: 0.0054
Epoch  44 Batch   60/269 - Train Accuracy: 0.9973, Validation Accuracy: 0.9854, Loss: 0.0043
Epoch  44 Batch   70/269 - Train Accuracy: 0.9983, Validation Accuracy: 0.9842, Loss: 0.0042
Epoch  44 Batch   80/269 - Train Accuracy: 0.9984, Validation Accuracy: 0.9855, Loss: 0.0039
Epoch  44 Batch   90/269 - Train Accuracy: 0.9987, Validation Accuracy: 0.9856, Loss: 0.0032
Epoch  44 Batch  100/269 - Train Accuracy: 0.9970, Validation Accuracy: 0.9814, Loss: 0.0039
Epoch  44 Batch  110/269 - Train Accuracy: 0.9993, Validation Accuracy: 0.9824, Loss: 0.0039
Epoch  44 Batch  120/269 - Train Accuracy: 0.9949, Validation Accuracy: 0.9824, Loss: 0.0066
Epoch  44 Batch  130/269 - Train Accuracy: 0.9962, Validation Accuracy: 0.9827, Loss: 0.0070
Epoch  44 Batch  140/269 - Train Accuracy: 0.9943, Validation Accuracy: 0.9859, Loss: 0.0048
Epoch  44 Batch  150/269 - Train Accuracy: 0.9971, Validation Accuracy: 0.9855, Loss: 0.0053
Epoch  44 Batch  160/269 - Train Accuracy: 0.9980, Validation Accuracy: 0.9840, Loss: 0.0039
Epoch  44 Batch  170/269 - Train Accuracy: 0.9968, Validation Accuracy: 0.9820, Loss: 0.0047
Epoch  44 Batch  180/269 - Train Accuracy: 0.9976, Validation Accuracy: 0.9785, Loss: 0.0036
Epoch  44 Batch  190/269 - Train Accuracy: 0.9960, Validation Accuracy: 0.9781, Loss: 0.0046
Epoch  44 Batch  200/269 - Train Accuracy: 0.9973, Validation Accuracy: 0.9791, Loss: 0.0027
Epoch  44 Batch  210/269 - Train Accuracy: 0.9960, Validation Accuracy: 0.9775, Loss: 0.0056
Epoch  44 Batch  220/269 - Train Accuracy: 0.9956, Validation Accuracy: 0.9828, Loss: 0.0043
Epoch  44 Batch  230/269 - Train Accuracy: 0.9986, Validation Accuracy: 0.9838, Loss: 0.0035
Epoch  44 Batch  240/269 - Train Accuracy: 0.9960, Validation Accuracy: 0.9848, Loss: 0.0048
Epoch  44 Batch  250/269 - Train Accuracy: 0.9977, Validation Accuracy: 0.9818, Loss: 0.0028
Epoch  44 Batch  260/269 - Train Accuracy: 0.9962, Validation Accuracy: 0.9830, Loss: 0.0048
Epoch  45 Batch   10/269 - Train Accuracy: 0.9971, Validation Accuracy: 0.9824, Loss: 0.0025
Epoch  45 Batch   20/269 - Train Accuracy: 0.9958, Validation Accuracy: 0.9815, Loss: 0.0053
Epoch  45 Batch   30/269 - Train Accuracy: 0.9954, Validation Accuracy: 0.9850, Loss: 0.0073
Epoch  45 Batch   40/269 - Train Accuracy: 0.9958, Validation Accuracy: 0.9835, Loss: 0.0062
Epoch  45 Batch   50/269 - Train Accuracy: 0.9964, Validation Accuracy: 0.9837, Loss: 0.0069
Epoch  45 Batch   60/269 - Train Accuracy: 0.9956, Validation Accuracy: 0.9826, Loss: 0.0060
Epoch  45 Batch   70/269 - Train Accuracy: 0.9949, Validation Accuracy: 0.9874, Loss: 0.0050
Epoch  45 Batch   80/269 - Train Accuracy: 0.9975, Validation Accuracy: 0.9834, Loss: 0.0054
Epoch  45 Batch   90/269 - Train Accuracy: 0.9974, Validation Accuracy: 0.9859, Loss: 0.0056
Epoch  45 Batch  100/269 - Train Accuracy: 0.9968, Validation Accuracy: 0.9792, Loss: 0.0041
Epoch  45 Batch  110/269 - Train Accuracy: 0.9987, Validation Accuracy: 0.9799, Loss: 0.0043
Epoch  45 Batch  120/269 - Train Accuracy: 0.9923, Validation Accuracy: 0.9835, Loss: 0.0068
Epoch  45 Batch  130/269 - Train Accuracy: 0.9974, Validation Accuracy: 0.9812, Loss: 0.0059
Epoch  45 Batch  140/269 - Train Accuracy: 0.9908, Validation Accuracy: 0.9828, Loss: 0.0051
Epoch  45 Batch  150/269 - Train Accuracy: 0.9979, Validation Accuracy: 0.9841, Loss: 0.0094
Epoch  45 Batch  160/269 - Train Accuracy: 0.9963, Validation Accuracy: 0.9898, Loss: 0.0036
Epoch  45 Batch  170/269 - Train Accuracy: 0.9965, Validation Accuracy: 0.9851, Loss: 0.0060
Epoch  45 Batch  180/269 - Train Accuracy: 0.9981, Validation Accuracy: 0.9806, Loss: 0.0046
Epoch  45 Batch  190/269 - Train Accuracy: 0.9961, Validation Accuracy: 0.9802, Loss: 0.0035
Epoch  45 Batch  200/269 - Train Accuracy: 0.9983, Validation Accuracy: 0.9826, Loss: 0.0051
Epoch  45 Batch  210/269 - Train Accuracy: 0.9978, Validation Accuracy: 0.9825, Loss: 0.0048
Epoch  45 Batch  220/269 - Train Accuracy: 0.9956, Validation Accuracy: 0.9813, Loss: 0.0043
Epoch  45 Batch  230/269 - Train Accuracy: 0.9944, Validation Accuracy: 0.9818, Loss: 0.0058
Epoch  45 Batch  240/269 - Train Accuracy: 0.9953, Validation Accuracy: 0.9832, Loss: 0.0061
Epoch  45 Batch  250/269 - Train Accuracy: 0.9975, Validation Accuracy: 0.9830, Loss: 0.0069
Epoch  45 Batch  260/269 - Train Accuracy: 0.9973, Validation Accuracy: 0.9833, Loss: 0.0066
Epoch  46 Batch   10/269 - Train Accuracy: 0.9964, Validation Accuracy: 0.9833, Loss: 0.0035
Epoch  46 Batch   20/269 - Train Accuracy: 0.9978, Validation Accuracy: 0.9830, Loss: 0.0050
Epoch  46 Batch   30/269 - Train Accuracy: 0.9962, Validation Accuracy: 0.9806, Loss: 0.0060
Epoch  46 Batch   40/269 - Train Accuracy: 0.9979, Validation Accuracy: 0.9834, Loss: 0.0033
Epoch  46 Batch   50/269 - Train Accuracy: 0.9979, Validation Accuracy: 0.9839, Loss: 0.0054
Epoch  46 Batch   60/269 - Train Accuracy: 0.9945, Validation Accuracy: 0.9815, Loss: 0.0058
Epoch  46 Batch   70/269 - Train Accuracy: 0.9985, Validation Accuracy: 0.9798, Loss: 0.0048
Epoch  46 Batch   80/269 - Train Accuracy: 0.9991, Validation Accuracy: 0.9861, Loss: 0.0054
Epoch  46 Batch   90/269 - Train Accuracy: 0.9993, Validation Accuracy: 0.9842, Loss: 0.0043
Epoch  46 Batch  100/269 - Train Accuracy: 0.9975, Validation Accuracy: 0.9849, Loss: 0.0031
Epoch  46 Batch  110/269 - Train Accuracy: 0.9999, Validation Accuracy: 0.9781, Loss: 0.0034
Epoch  46 Batch  120/269 - Train Accuracy: 0.9935, Validation Accuracy: 0.9798, Loss: 0.0037
Epoch  46 Batch  130/269 - Train Accuracy: 0.9979, Validation Accuracy: 0.9853, Loss: 0.0082
Epoch  46 Batch  140/269 - Train Accuracy: 0.9988, Validation Accuracy: 0.9866, Loss: 0.0033
Epoch  46 Batch  150/269 - Train Accuracy: 0.9988, Validation Accuracy: 0.9802, Loss: 0.0061
Epoch  46 Batch  160/269 - Train Accuracy: 0.9972, Validation Accuracy: 0.9840, Loss: 0.0046
Epoch  46 Batch  170/269 - Train Accuracy: 0.9957, Validation Accuracy: 0.9843, Loss: 0.0042
Epoch  46 Batch  180/269 - Train Accuracy: 0.9956, Validation Accuracy: 0.9853, Loss: 0.0047
Epoch  46 Batch  190/269 - Train Accuracy: 0.9966, Validation Accuracy: 0.9824, Loss: 0.0037
Epoch  46 Batch  200/269 - Train Accuracy: 0.9995, Validation Accuracy: 0.9823, Loss: 0.0048
Epoch  46 Batch  210/269 - Train Accuracy: 0.9982, Validation Accuracy: 0.9814, Loss: 0.0045
Epoch  46 Batch  220/269 - Train Accuracy: 0.9953, Validation Accuracy: 0.9827, Loss: 0.0052
Epoch  46 Batch  230/269 - Train Accuracy: 0.9960, Validation Accuracy: 0.9822, Loss: 0.0046
Epoch  46 Batch  240/269 - Train Accuracy: 0.9963, Validation Accuracy: 0.9856, Loss: 0.0058
Epoch  46 Batch  250/269 - Train Accuracy: 0.9977, Validation Accuracy: 0.9843, Loss: 0.0036
Epoch  46 Batch  260/269 - Train Accuracy: 0.9945, Validation Accuracy: 0.9860, Loss: 0.0072
Epoch  47 Batch   10/269 - Train Accuracy: 0.9986, Validation Accuracy: 0.9834, Loss: 0.0038
Epoch  47 Batch   20/269 - Train Accuracy: 0.9986, Validation Accuracy: 0.9835, Loss: 0.0020
Epoch  47 Batch   30/269 - Train Accuracy: 0.9972, Validation Accuracy: 0.9866, Loss: 0.0040
Epoch  47 Batch   40/269 - Train Accuracy: 0.9992, Validation Accuracy: 0.9852, Loss: 0.0034
Epoch  47 Batch   50/269 - Train Accuracy: 0.9953, Validation Accuracy: 0.9882, Loss: 0.0072
Epoch  47 Batch   60/269 - Train Accuracy: 0.9949, Validation Accuracy: 0.9832, Loss: 0.0052
Epoch  47 Batch   70/269 - Train Accuracy: 0.9971, Validation Accuracy: 0.9848, Loss: 0.0035
Epoch  47 Batch   80/269 - Train Accuracy: 0.9987, Validation Accuracy: 0.9834, Loss: 0.0036
Epoch  47 Batch   90/269 - Train Accuracy: 0.9992, Validation Accuracy: 0.9814, Loss: 0.0044
Epoch  47 Batch  100/269 - Train Accuracy: 0.9976, Validation Accuracy: 0.9841, Loss: 0.0050
Epoch  47 Batch  110/269 - Train Accuracy: 0.9997, Validation Accuracy: 0.9842, Loss: 0.0028
Epoch  47 Batch  120/269 - Train Accuracy: 0.9940, Validation Accuracy: 0.9790, Loss: 0.0069
Epoch  47 Batch  130/269 - Train Accuracy: 0.9982, Validation Accuracy: 0.9820, Loss: 0.0054
Epoch  47 Batch  140/269 - Train Accuracy: 0.9967, Validation Accuracy: 0.9860, Loss: 0.0047
Epoch  47 Batch  150/269 - Train Accuracy: 0.9945, Validation Accuracy: 0.9890, Loss: 0.0076
Epoch  47 Batch  160/269 - Train Accuracy: 0.9977, Validation Accuracy: 0.9799, Loss: 0.0046
Epoch  47 Batch  170/269 - Train Accuracy: 0.9957, Validation Accuracy: 0.9840, Loss: 0.0048
Epoch  47 Batch  180/269 - Train Accuracy: 0.9967, Validation Accuracy: 0.9854, Loss: 0.0037
Epoch  47 Batch  190/269 - Train Accuracy: 0.9950, Validation Accuracy: 0.9911, Loss: 0.0053
Epoch  47 Batch  200/269 - Train Accuracy: 0.9997, Validation Accuracy: 0.9862, Loss: 0.0023
Epoch  47 Batch  210/269 - Train Accuracy: 0.9990, Validation Accuracy: 0.9868, Loss: 0.0042
Epoch  47 Batch  220/269 - Train Accuracy: 0.9965, Validation Accuracy: 0.9828, Loss: 0.0049
Epoch  47 Batch  230/269 - Train Accuracy: 0.9970, Validation Accuracy: 0.9817, Loss: 0.0038
Epoch  47 Batch  240/269 - Train Accuracy: 0.9963, Validation Accuracy: 0.9873, Loss: 0.0049
Epoch  47 Batch  250/269 - Train Accuracy: 0.9977, Validation Accuracy: 0.9794, Loss: 0.0037
Epoch  47 Batch  260/269 - Train Accuracy: 0.9910, Validation Accuracy: 0.9812, Loss: 0.0087
Epoch  48 Batch   10/269 - Train Accuracy: 0.9964, Validation Accuracy: 0.9862, Loss: 0.0036
Epoch  48 Batch   20/269 - Train Accuracy: 0.9979, Validation Accuracy: 0.9882, Loss: 0.0054
Epoch  48 Batch   30/269 - Train Accuracy: 0.9930, Validation Accuracy: 0.9871, Loss: 0.0063
Epoch  48 Batch   40/269 - Train Accuracy: 0.9954, Validation Accuracy: 0.9835, Loss: 0.0045
Epoch  48 Batch   50/269 - Train Accuracy: 0.9955, Validation Accuracy: 0.9849, Loss: 0.0063
Epoch  48 Batch   60/269 - Train Accuracy: 0.9944, Validation Accuracy: 0.9763, Loss: 0.0081
Epoch  48 Batch   70/269 - Train Accuracy: 0.9980, Validation Accuracy: 0.9808, Loss: 0.0058
Epoch  48 Batch   80/269 - Train Accuracy: 0.9980, Validation Accuracy: 0.9780, Loss: 0.0029
Epoch  48 Batch   90/269 - Train Accuracy: 0.9959, Validation Accuracy: 0.9798, Loss: 0.0042
Epoch  48 Batch  100/269 - Train Accuracy: 0.9934, Validation Accuracy: 0.9742, Loss: 0.0066
Epoch  48 Batch  110/269 - Train Accuracy: 0.9999, Validation Accuracy: 0.9724, Loss: 0.0043
Epoch  48 Batch  120/269 - Train Accuracy: 0.9959, Validation Accuracy: 0.9801, Loss: 0.0057
Epoch  48 Batch  130/269 - Train Accuracy: 0.9972, Validation Accuracy: 0.9816, Loss: 0.0070
Epoch  48 Batch  140/269 - Train Accuracy: 0.9973, Validation Accuracy: 0.9854, Loss: 0.0051
Epoch  48 Batch  150/269 - Train Accuracy: 0.9970, Validation Accuracy: 0.9786, Loss: 0.0078
Epoch  48 Batch  160/269 - Train Accuracy: 0.9993, Validation Accuracy: 0.9792, Loss: 0.0031
Epoch  48 Batch  170/269 - Train Accuracy: 0.9954, Validation Accuracy: 0.9798, Loss: 0.0076
Epoch  48 Batch  180/269 - Train Accuracy: 0.9981, Validation Accuracy: 0.9753, Loss: 0.0044
Epoch  48 Batch  190/269 - Train Accuracy: 0.9966, Validation Accuracy: 0.9759, Loss: 0.0033
Epoch  48 Batch  200/269 - Train Accuracy: 0.9977, Validation Accuracy: 0.9811, Loss: 0.0050
Epoch  48 Batch  210/269 - Train Accuracy: 0.9979, Validation Accuracy: 0.9806, Loss: 0.0051
Epoch  48 Batch  220/269 - Train Accuracy: 0.9977, Validation Accuracy: 0.9807, Loss: 0.0047
Epoch  48 Batch  230/269 - Train Accuracy: 0.9988, Validation Accuracy: 0.9835, Loss: 0.0038
Epoch  48 Batch  240/269 - Train Accuracy: 0.9982, Validation Accuracy: 0.9865, Loss: 0.0037
Epoch  48 Batch  250/269 - Train Accuracy: 0.9967, Validation Accuracy: 0.9862, Loss: 0.0045
Epoch  48 Batch  260/269 - Train Accuracy: 0.9969, Validation Accuracy: 0.9845, Loss: 0.0058
Epoch  49 Batch   10/269 - Train Accuracy: 0.9957, Validation Accuracy: 0.9879, Loss: 0.0032
Epoch  49 Batch   20/269 - Train Accuracy: 0.9982, Validation Accuracy: 0.9858, Loss: 0.0034
Epoch  49 Batch   30/269 - Train Accuracy: 0.9963, Validation Accuracy: 0.9822, Loss: 0.0071
Epoch  49 Batch   40/269 - Train Accuracy: 0.9985, Validation Accuracy: 0.9840, Loss: 0.0052
Epoch  49 Batch   50/269 - Train Accuracy: 0.9965, Validation Accuracy: 0.9827, Loss: 0.0050
Epoch  49 Batch   60/269 - Train Accuracy: 0.9975, Validation Accuracy: 0.9868, Loss: 0.0057
Epoch  49 Batch   70/269 - Train Accuracy: 0.9992, Validation Accuracy: 0.9826, Loss: 0.0045
Epoch  49 Batch   80/269 - Train Accuracy: 0.9992, Validation Accuracy: 0.9812, Loss: 0.0041
Epoch  49 Batch   90/269 - Train Accuracy: 0.9955, Validation Accuracy: 0.9811, Loss: 0.0051
Epoch  49 Batch  100/269 - Train Accuracy: 0.9982, Validation Accuracy: 0.9773, Loss: 0.0064
Epoch  49 Batch  110/269 - Train Accuracy: 0.9999, Validation Accuracy: 0.9836, Loss: 0.0024
Epoch  49 Batch  120/269 - Train Accuracy: 0.9940, Validation Accuracy: 0.9791, Loss: 0.0056
Epoch  49 Batch  130/269 - Train Accuracy: 0.9930, Validation Accuracy: 0.9804, Loss: 0.0080
Epoch  49 Batch  140/269 - Train Accuracy: 0.9935, Validation Accuracy: 0.9864, Loss: 0.0050
Epoch  49 Batch  150/269 - Train Accuracy: 0.9992, Validation Accuracy: 0.9807, Loss: 0.0064
Epoch  49 Batch  160/269 - Train Accuracy: 0.9985, Validation Accuracy: 0.9801, Loss: 0.0025
Epoch  49 Batch  170/269 - Train Accuracy: 0.9948, Validation Accuracy: 0.9835, Loss: 0.0046
Epoch  49 Batch  180/269 - Train Accuracy: 0.9979, Validation Accuracy: 0.9771, Loss: 0.0044
Epoch  49 Batch  190/269 - Train Accuracy: 0.9956, Validation Accuracy: 0.9814, Loss: 0.0045
Epoch  49 Batch  200/269 - Train Accuracy: 0.9996, Validation Accuracy: 0.9804, Loss: 0.0028
Epoch  49 Batch  210/269 - Train Accuracy: 0.9970, Validation Accuracy: 0.9798, Loss: 0.0053
Epoch  49 Batch  220/269 - Train Accuracy: 0.9974, Validation Accuracy: 0.9818, Loss: 0.0038
Epoch  49 Batch  230/269 - Train Accuracy: 0.9970, Validation Accuracy: 0.9806, Loss: 0.0039
Epoch  49 Batch  240/269 - Train Accuracy: 0.9972, Validation Accuracy: 0.9822, Loss: 0.0049
Epoch  49 Batch  250/269 - Train Accuracy: 0.9998, Validation Accuracy: 0.9821, Loss: 0.0032
Epoch  49 Batch  260/269 - Train Accuracy: 0.9979, Validation Accuracy: 0.9824, Loss: 0.0051
Model Trained and Saved

In [18]:
# Save Model
saver = tf.train.Saver()
saver.save(sess, save_path)
print('Model Trained and Saved')


---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-18-73f229cb16c2> in <module>()
      1 # Save Model
----> 2 saver = tf.train.Saver()
      3 saver.save(sess, save_path)
      4 print('Model Trained and Saved')

C:\Users\julien\AppData\Local\conda\conda\envs\dlnd-tf-lab\lib\site-packages\tensorflow\python\training\saver.py in __init__(self, var_list, reshape, sharded, max_to_keep, keep_checkpoint_every_n_hours, name, restore_sequentially, saver_def, builder, defer_build, allow_empty, write_version, pad_step_number)
   1054     self._pad_step_number = pad_step_number
   1055     if not defer_build:
-> 1056       self.build()
   1057     if self.saver_def:
   1058       self._check_saver_def()

C:\Users\julien\AppData\Local\conda\conda\envs\dlnd-tf-lab\lib\site-packages\tensorflow\python\training\saver.py in build(self)
   1075           return
   1076         else:
-> 1077           raise ValueError("No variables to save")
   1078       self._is_empty = False
   1079       self.saver_def = self._builder.build(

ValueError: No variables to save
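
Note on the error above: the checkpoint had already been written inside the training cell (hence the "Model Trained and Saved" line in its output), so this failure is harmless here. It occurs because tf.train.Saver() in this cell is constructed against an empty default graph rather than the graph that holds the trained variables. If you ever need to re-save, a minimal sketch is shown below; it assumes train_graph, sess, and save_path are the objects from the training cell and that sess is still open.

import tensorflow as tf

# Sketch only (assumes train_graph, sess, and save_path from the training cell):
# build the Saver inside the graph that actually contains the trained variables,
# while the training session is still open.
with train_graph.as_default():
    saver = tf.train.Saver()      # now finds the graph's variables
saver.save(sess, save_path)       # sess must be the still-open training session
print('Model Trained and Saved')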

Save Parameters

Save the batch_size and save_path parameters for inference.


In [19]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
# Save parameters for checkpoint
helper.save_params(save_path)

Checkpoint


In [20]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import tensorflow as tf
import numpy as np
import helper
import problem_unittests as tests

_, (source_vocab_to_int, target_vocab_to_int), (source_int_to_vocab, target_int_to_vocab) = helper.load_preprocess()
load_path = helper.load_params()

Sentence to Sequence

To feed a sentence into the model for translation, you first need to preprocess it. Implement the function sentence_to_seq() to preprocess new sentences.

  • Convert the sentence to lowercase
  • Convert words into ids using vocab_to_int
    • Convert words not in the vocabulary to the <UNK> word id.

In [21]:
def sentence_to_seq(sentence, vocab_to_int):
    """
    Convert a sentence to a sequence of ids
    :param sentence: String
    :param vocab_to_int: Dictionary to go from the words to an id
    :return: List of word ids
    """
    
    # Lowercase the sentence, split it into words, and map each word to its id,
    # falling back to the <UNK> id for out-of-vocabulary words.
    return [vocab_to_int.get(w, vocab_to_int["<UNK>"]) for w in sentence.lower().split()]


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_sentence_to_seq(sentence_to_seq)


Tests Passed
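
As a quick, informal sanity check (not one of the graded cells), the snippet below calls sentence_to_seq with a made-up toy vocabulary; the word ids are purely illustrative and just demonstrate the <UNK> fallback for out-of-vocabulary words.

# Toy vocabulary for illustration only -- these ids are made up.
toy_vocab_to_int = {'<UNK>': 0, 'he': 1, 'saw': 2, 'a': 3, 'truck': 4, '.': 5}
print(sentence_to_seq('He saw a PURPLE truck .', toy_vocab_to_int))
# [1, 2, 3, 0, 4, 5]  -- 'purple' is not in the toy vocabulary, so it maps to <UNK>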

Translate

This cell will translate the sentence stored in translate_sentence from English to French.


In [22]:
translate_sentence = 'he saw a old yellow truck .'


"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
translate_sentence = sentence_to_seq(translate_sentence, source_vocab_to_int)

loaded_graph = tf.Graph()
with tf.Session(graph=loaded_graph) as sess:
    # Load saved model
    loader = tf.train.import_meta_graph(load_path + '.meta')
    loader.restore(sess, load_path)

    input_data = loaded_graph.get_tensor_by_name('input:0')
    logits = loaded_graph.get_tensor_by_name('predictions:0')
    target_sequence_length = loaded_graph.get_tensor_by_name('target_sequence_length:0')
    source_sequence_length = loaded_graph.get_tensor_by_name('source_sequence_length:0')
    keep_prob = loaded_graph.get_tensor_by_name('keep_prob:0')

    translate_logits = sess.run(logits, {input_data: [translate_sentence]*batch_size,
                                         target_sequence_length: [len(translate_sentence)*2]*batch_size,
                                         source_sequence_length: [len(translate_sentence)]*batch_size,
                                         keep_prob: 1.0})[0]

print('Input')
print('  Word Ids:      {}'.format([i for i in translate_sentence]))
print('  English Words: {}'.format([source_int_to_vocab[i] for i in translate_sentence]))

print('\nPrediction')
print('  Word Ids:      {}'.format([i for i in translate_logits]))
print('  French Words: {}'.format(" ".join([target_int_to_vocab[i] for i in translate_logits])))


INFO:tensorflow:Restoring parameters from checkpoints/dev
Input
  Word Ids:      [158, 80, 177, 10, 104, 139, 212]
  English Words: ['he', 'saw', 'a', 'old', 'yellow', 'truck', '.']

Prediction
  Word Ids:      [117, 345, 148, 155, 13, 312, 84, 352, 1]
  French Words: il a vu un vieux camion jaune . <EOS>
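
If you want to try a few more sentences at once, here is a small sketch; it is not part of the project template and simply reuses the restored checkpoint, the tensor names from the cell above, and the existing helpers (sentence_to_seq, load_path, batch_size, source_vocab_to_int, target_int_to_vocab). Any word outside the small vocabulary will fall back to <UNK>.

import tensorflow as tf

# Sketch: translate several sentences with one restored session, reusing the
# tensor names from the cell above. The example sentences are arbitrary.
more_sentences = ['he saw a shiny little green truck .',
                  'she dislikes limes and lemons .']

multi_graph = tf.Graph()
with tf.Session(graph=multi_graph) as sess:
    loader = tf.train.import_meta_graph(load_path + '.meta')
    loader.restore(sess, load_path)

    input_data = multi_graph.get_tensor_by_name('input:0')
    logits = multi_graph.get_tensor_by_name('predictions:0')
    target_sequence_length = multi_graph.get_tensor_by_name('target_sequence_length:0')
    source_sequence_length = multi_graph.get_tensor_by_name('source_sequence_length:0')
    keep_prob = multi_graph.get_tensor_by_name('keep_prob:0')

    for s in more_sentences:
        seq = sentence_to_seq(s, source_vocab_to_int)
        pred = sess.run(logits, {input_data: [seq] * batch_size,
                                 target_sequence_length: [len(seq) * 2] * batch_size,
                                 source_sequence_length: [len(seq)] * batch_size,
                                 keep_prob: 1.0})[0]
        print(s, '->', ' '.join(target_int_to_vocab[i] for i in pred))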

Imperfect Translation

You might notice that some sentences translate better than others. Since the dataset you're using has a vocabulary of only 227 English words, out of the thousands used in everyday English, you're only going to see good results for sentences built from those words. For this project, you don't need a perfect translation. However, if you want to create a better translation model, you'll need better data.

You can train on the WMT10 French-English corpus. This dataset has a larger vocabulary and covers a richer range of topics. However, it will take days to train, so make sure you have a GPU and that the neural network is performing well on the dataset we provided. Just make sure you play with the WMT10 corpus after you've submitted this project.

Submitting This Project

When submitting this project, make sure to run all the cells before saving the notebook. Save the notebook file as "dlnd_language_translation.ipynb" and save it as an HTML file under "File" -> "Download as". Include the "helper.py" and "problem_unittests.py" files in your submission.

