Image Classification

In this project, you'll classify images from the CIFAR-10 dataset. The dataset consists of airplanes, dogs, cats, and other objects. You'll preprocess the images, then train a convolutional neural network on all the samples. The images need to be normalized and the labels need to be one-hot encoded. You'll apply what you've learned to build convolutional, max pooling, dropout, and fully connected layers. At the end, you'll get to see your neural network's predictions on the sample images.

Get the Data

Run the following cell to download the CIFAR-10 dataset for Python.


In [18]:
"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
from urllib.request import urlretrieve
from os.path import isfile, isdir
from tqdm import tqdm
import problem_unittests as tests
import tarfile

cifar10_dataset_folder_path = 'cifar-10-batches-py'

# Use Floyd's cifar-10 dataset if present
floyd_cifar10_location = '/input/cifar-10/python.tar.gz'
if isfile(floyd_cifar10_location):
    tar_gz_path = floyd_cifar10_location
else:
    tar_gz_path = 'cifar-10-python.tar.gz'

class DLProgress(tqdm):
    last_block = 0

    def hook(self, block_num=1, block_size=1, total_size=None):
        self.total = total_size
        self.update((block_num - self.last_block) * block_size)
        self.last_block = block_num

if not isfile(tar_gz_path):
    with DLProgress(unit='B', unit_scale=True, miniters=1, desc='CIFAR-10 Dataset') as pbar:
        urlretrieve(
            'https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz',
            tar_gz_path,
            pbar.hook)

if not isdir(cifar10_dataset_folder_path):
    with tarfile.open(tar_gz_path) as tar:
        tar.extractall()
        tar.close()


tests.test_folder_path(cifar10_dataset_folder_path)


All files found!

Explore the Data

The dataset is broken into batches to prevent your machine from running out of memory. The CIFAR-10 dataset consists of 5 batches, named data_batch_1, data_batch_2, etc. Each batch contains the labels and images, where each image belongs to one of the following classes:

  • airplane
  • automobile
  • bird
  • cat
  • deer
  • dog
  • frog
  • horse
  • ship
  • truck

Understanding a dataset is part of making predictions on the data. Play around with the code cell below by changing the batch_id and sample_id. The batch_id is the id for a batch (1-5). The sample_id is the id for an image and label pair in the batch.

Ask yourself "What are all possible labels?", "What is the range of values for the image data?", "Are the labels in order or random?". Answers to questions like these will help you preprocess the data and end up with better predictions.


In [19]:
%matplotlib inline
%config InlineBackend.figure_format = 'retina'

import helper
import numpy as np

# Explore the dataset
batch_id = 1
sample_id = 5
helper.display_stats(cifar10_dataset_folder_path, batch_id, sample_id)


Stats of batch 1:
Samples: 10000
Label Counts: {0: 1005, 1: 974, 2: 1032, 3: 1016, 4: 999, 5: 937, 6: 1030, 7: 1001, 8: 1025, 9: 981}
First 20 Labels: [6, 9, 9, 4, 1, 1, 2, 7, 8, 3, 4, 7, 7, 2, 9, 9, 9, 3, 2, 6]

Example of Image 5:
Image - Min Value: 0 Max Value: 252
Image - Shape: (32, 32, 3)
Label - Label Id: 1 Name: automobile

Implement Preprocess Functions

Normalize

In the cell below, implement the normalize function to take in image data, x, and return it as a normalized Numpy array. The values should be in the range of 0 to 1, inclusive. The return object should be the same shape as x.
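
Since CIFAR-10 pixel values are 8-bit integers, min-max scaling x' = (x - x_min) / (x_max - x_min) with x_min = 0 and x_max = 255 reduces to simply dividing by 255; there is no need to compute per-image minima and maxima.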


In [20]:
def normalize(x):
    """
    Normalize a list of sample image data in the range of 0 to 1
    : x: List of image data.  The image shape is (32, 32, 3)
    : return: Numpy array of normalized data
    """
    # Pixel values are 8-bit (0-255), so min-max scaling is just division by 255
    return x / 255


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_normalize(normalize)


Tests Passed
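
As a quick sanity check (an optional snippet, not one of the graded cells), you can confirm that normalize preserves shape and maps 8-bit pixel values into [0, 1]:

sample_images = np.random.randint(0, 256, size=(10, 32, 32, 3))
normalized = normalize(sample_images)
print(normalized.shape)                    # (10, 32, 32, 3)
print(normalized.min(), normalized.max())  # both values fall inside [0.0, 1.0]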

One-hot encode

Just like the previous code cell, you'll be implementing a function for preprocessing. This time, you'll implement the one_hot_encode function. The input, x, is a list of labels. Implement the function to return the list of labels as a one-hot encoded Numpy array. The possible values for labels are 0 to 9. The one-hot encoding function should return the same encoding for each value between calls to one_hot_encode. Make sure to save the map of encodings outside the function.

Hint: Don't reinvent the wheel.


In [21]:
from sklearn.preprocessing import OneHotEncoder


def one_hot_encode(x):
    """
    One hot encode a list of sample labels. Return a one-hot encoded vector for each label.
    : x: List of sample Labels
    : return: Numpy array of one-hot encoded labels
    """
    labels = np.array(x).reshape(-1, 1)
    # Fixing n_values at 10 keeps the encoding identical across calls (labels are always 0-9)
    enc = OneHotEncoder(n_values=10, sparse=False)
    enc.fit(labels)
    encoded_labels = enc.transform(labels)
    return encoded_labels


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_one_hot_encode(one_hot_encode)


Tests Passed
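
Since the labels are always the integers 0-9, a simpler alternative that skips scikit-learn entirely is to index into an identity matrix. A sketch (one_hot_encode_alt is a hypothetical name, not required by the tests):

def one_hot_encode_alt(x):
    # Row i of the 10x10 identity matrix is the one-hot vector for label i
    return np.eye(10)[np.array(x)]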

Randomize Data

As you saw from exploring the data above, the order of the samples is randomized. It doesn't hurt to randomize it again, but you don't need to for this dataset. If you ever did want to reshuffle, see the sketch below.
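
A minimal reshuffling sketch with NumPy (features and labels are hypothetical arrays of equal length) would apply the same permutation to both:

permutation = np.random.permutation(len(features))
features, labels = features[permutation], labels[permutation]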

Preprocess all the data and save it

Running the code cell below will preprocess all the CIFAR-10 data and save it to file. The code below also uses 10% of the training data for validation.


In [22]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
# Preprocess Training, Validation, and Testing Data
helper.preprocess_and_save_data(cifar10_dataset_folder_path, normalize, one_hot_encode)

Check Point

This is your first checkpoint. If you ever decide to come back to this notebook or have to restart the notebook, you can start from here. The preprocessed data has been saved to disk.


In [23]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import pickle
import problem_unittests as tests
import helper

# Load the Preprocessed Validation data
valid_features, valid_labels = pickle.load(open('preprocess_validation.p', mode='rb'))

Build the network

For the neural network, you'll build each layer into a function. Most of the code you've seen has been outside of functions. To test your code more thoroughly, we require that you put each layer in a function. This allows us to give you better feedback and test for simple mistakes using our unit tests before you submit your project.

Note: If you're finding it hard to dedicate enough time for this course each week, we've provided a small shortcut to this part of the project. In the next couple of problems, you'll have the option to use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages to build each layer, except the layers you build in the "Convolutional and Max Pooling Layer" section. TF Layers is similar to Keras's and TFLearn's abstraction of layers, so it's easy to pick up.

However, if you would like to get the most out of this course, try to solve all the problems without using anything from the TF Layers packages. You can still use classes from other packages that happen to have the same name as ones you find in TF Layers! For example, instead of using the TF Layers version of the conv2d class, tf.layers.conv2d, you would want to use the TF Neural Network version of conv2d, tf.nn.conv2d.

Let's begin!

Input

The neural network needs to read the image data, one-hot encoded labels, and dropout keep probability. Implement the following functions:

  • Implement neural_net_image_input
    • Return a TF Placeholder
    • Set the shape using image_shape with batch size set to None.
    • Name the TensorFlow placeholder "x" using the TensorFlow name parameter in the TF Placeholder.
  • Implement neural_net_label_input
    • Return a TF Placeholder
    • Set the shape using n_classes with batch size set to None.
    • Name the TensorFlow placeholder "y" using the TensorFlow name parameter in the TF Placeholder.
  • Implement neural_net_keep_prob_input
    • Return a TF Placeholder for dropout keep probability.
    • Name the TensorFlow placeholder "keep_prob" using the TensorFlow name parameter in the TF Placeholder.

These names will be used at the end of the project to load your saved model.

Note: None for shapes in TensorFlow allows for a dynamic size.


In [24]:
import tensorflow as tf

def neural_net_image_input(image_shape):
    """
    Return a Tensor for a batch of image input
    : image_shape: Shape of the images
    : return: Tensor for image input.
    """
    (h, w, color_channels) = image_shape
    return tf.placeholder(tf.float32, shape=(None, h, w, color_channels), name="x")


def neural_net_label_input(n_classes):
    """
    Return a Tensor for a batch of label input
    : n_classes: Number of classes
    : return: Tensor for label input.
    """
    
    return tf.placeholder(tf.float32, shape=(None, n_classes), name="y")


def neural_net_keep_prob_input():
    """
    Return a Tensor for keep probability
    : return: Tensor for keep probability.
    """
    return tf.placeholder(tf.float32, name="keep_prob")


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tf.reset_default_graph()
tests.test_nn_image_inputs(neural_net_image_input)
tests.test_nn_label_inputs(neural_net_label_input)
tests.test_nn_keep_prob_inputs(neural_net_keep_prob_input)


Image Input Tests Passed.
Label Input Tests Passed.
Keep Prob Tests Passed.
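
The reason the names matter: after training, the saved graph is reloaded and its tensors are fetched by name. A rough sketch, assuming loaded_graph is a tf.Graph restored with tf.train.import_meta_graph:

loaded_x = loaded_graph.get_tensor_by_name('x:0')
loaded_y = loaded_graph.get_tensor_by_name('y:0')
loaded_keep_prob = loaded_graph.get_tensor_by_name('keep_prob:0')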

Convolution and Max Pooling Layer

Convolutional layers have had a lot of success with images. For this code cell, you should implement the function conv2d_maxpool to apply convolution then max pooling:

  • Create the weight and bias using conv_ksize, conv_num_outputs and the shape of x_tensor.
  • Apply a convolution to x_tensor using weight and conv_strides.
    • We recommend you use same padding, but you're welcome to use any padding.
  • Add bias
  • Add a nonlinear activation to the convolution.
  • Apply Max Pooling using pool_ksize and pool_strides.
    • We recommend you use same padding, but you're welcome to use any padding.

Note: You can't use TensorFlow Layers or TensorFlow Layers (contrib) for this layer, but you can still use TensorFlow's Neural Network package. You may still use the shortcut option for all the other layers.


In [25]:
def conv2d_maxpool(x_tensor, conv_num_outputs, conv_ksize, conv_strides, pool_ksize, pool_strides):
    """
    Apply convolution then max pooling to x_tensor
    :param x_tensor: TensorFlow Tensor
    :param conv_num_outputs: Number of outputs for the convolutional layer
    :param conv_ksize: kernel size 2-D Tuple for the convolutional layer
    :param conv_strides: Stride 2-D Tuple for convolution
    :param pool_ksize: kernel size 2-D Tuple for pool
    :param pool_strides: Stride 2-D Tuple for pool
    : return: A tensor that represents convolution and max pooling of x_tensor
    """

    [_, _, _, in_channels] = x_tensor.get_shape().as_list()
    output_channels = conv_num_outputs
    (conv_k_height, conv_k_width) = conv_ksize
    (conv_stride_height, conv_stride_width) = conv_strides 
    (pool_k_height, pool_k_width) = pool_ksize
    (pool_stride_height, pool_stride_width) = pool_strides
    
    stddev = 0.1
    conv2d_filter = tf.Variable(tf.truncated_normal([conv_k_height, conv_k_width, in_channels, int(output_channels)], stddev=stddev))
    conv2d_strides = [1, conv_stride_height, conv_stride_width, 1]
    bias = tf.Variable(tf.truncated_normal([conv_num_outputs], stddev=stddev))
    pool_ksize = [1, pool_k_height, pool_k_width, 1]
    pool_strides = [1, pool_stride_height, pool_stride_width, 1]
    padding = 'SAME'
    
    x_conv2d_maxpool = tf.nn.conv2d(x_tensor, conv2d_filter, strides=conv2d_strides, padding=padding)
    x_conv2d_maxpool = tf.nn.bias_add(x_conv2d_maxpool, bias)
    x_conv2d_maxpool = tf.nn.relu(x_conv2d_maxpool)
    x_conv2d_maxpool = tf.nn.max_pool(x_conv2d_maxpool, pool_ksize, pool_strides, padding=padding)
    return x_conv2d_maxpool


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_con_pool(conv2d_maxpool)


Tests Passed
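
To see what SAME padding does to the shapes, here is an optional throwaway check (not a graded cell) on a CIFAR-sized input:

with tf.Graph().as_default():
    sample_input = tf.placeholder(tf.float32, shape=(None, 32, 32, 3))
    sample_output = conv2d_maxpool(sample_input, 64, (5, 5), (1, 1), (3, 3), (2, 2))
    # SAME convolution with stride 1 keeps 32x32; SAME pooling with stride 2 gives ceil(32/2) = 16
    print(sample_output.get_shape().as_list())  # [None, 16, 16, 64]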

Flatten Layer

Implement the flatten function to change the dimension of x_tensor from a 4-D tensor to a 2-D tensor. The output should be the shape (Batch Size, Flattened Image Size). Shortcut option: you can use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages for this layer. For more of a challenge, only use other TensorFlow packages.


In [26]:
def flatten(x_tensor):
    """
    Flatten x_tensor to (Batch Size, Flattened Image Size)
    : x_tensor: A tensor of size (Batch Size, ...), where ... are the image dimensions.
    : return: A tensor of size (Batch Size, Flattened Image Size).
    """
    [batch_size, height, width, channels] = x_tensor.get_shape().as_list()

    flatten_x_tensor = tf.reshape(x_tensor, shape=[-1, height * width * channels])
    return flatten_x_tensor


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_flatten(flatten)


Tests Passed
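
If you take the shortcut option instead, the contrib package does this in one line. A sketch (flatten_shortcut is a hypothetical name):

def flatten_shortcut(x_tensor):
    # Collapses every dimension except the batch dimension
    return tf.contrib.layers.flatten(x_tensor)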

Fully-Connected Layer

Implement the fully_conn function to apply a fully connected layer to x_tensor with the shape (Batch Size, num_outputs). Shortcut option: you can use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages for this layer. For more of a challenge, only use other TensorFlow packages.


In [27]:
def fully_conn(x_tensor, num_outputs):
    """
    Apply a fully connected layer to x_tensor using weight and bias
    : x_tensor: A 2-D tensor where the first dimension is batch size.
    : num_outputs: The number of outputs that the new tensor should have.
    : return: A 2-D tensor where the second dimension is num_outputs.
    """
    stddev = 0.1
    [_, num_inputs] = x_tensor.get_shape().as_list()
    weight = tf.Variable(tf.truncated_normal([num_inputs, num_outputs], stddev=stddev))
    bias = tf.Variable(tf.truncated_normal([num_outputs], stddev=stddev))
    conn_layer = tf.add(tf.matmul(x_tensor, weight), bias)
    conn_layer = tf.nn.relu(conn_layer)
    return conn_layer


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_fully_conn(fully_conn)


Tests Passed

Output Layer

Implement the output function to apply a fully connected layer to x_tensor with the shape (Batch Size, num_outputs). Shortcut option: you can use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages for this layer. For more of a challenge, only use other TensorFlow packages.

Note: Activation, softmax, or cross entropy should not be applied to this.


In [28]:
def output(x_tensor, num_outputs):
    """
    Apply an output layer to x_tensor using weight and bias
    : x_tensor: A 2-D tensor where the first dimension is batch size.
    : num_outputs: The number of outputs that the new tensor should have.
    : return: A 2-D tensor where the second dimension is num_outputs.
    """
    stddev = 0.1
    [_, num_inputs] = x_tensor.get_shape().as_list()
    weight = tf.Variable(tf.truncated_normal([num_inputs, num_outputs], stddev=stddev))
    bias = tf.Variable(tf.truncated_normal([num_outputs], stddev=stddev))
    output_layer = tf.add(tf.matmul(x_tensor, weight), bias)
    return output_layer


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_output(output)


Tests Passed
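
For both the fully connected and output layers, the TF Layers shortcut is tf.layers.dense. A sketch (the _shortcut names are hypothetical); note that the output version applies no activation, since the logits go straight into softmax_cross_entropy_with_logits:

def fully_conn_shortcut(x_tensor, num_outputs):
    return tf.layers.dense(x_tensor, num_outputs, activation=tf.nn.relu)

def output_shortcut(x_tensor, num_outputs):
    # No activation: these are the raw logits
    return tf.layers.dense(x_tensor, num_outputs, activation=None)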

Create Convolutional Model

Implement the function conv_net to create a convolutional neural network model. The function takes in a batch of images, x, and outputs logits. Use the layers you created above to create this model:

  • Apply 1, 2, or 3 Convolution and Max Pool layers
  • Apply a Flatten Layer
  • Apply 1, 2, or 3 Fully Connected Layers
  • Apply an Output Layer
  • Return the output
  • Apply TensorFlow's Dropout to one or more layers in the model using keep_prob.
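
With the SAME padding used in conv2d_maxpool above, each stride-2 pooling layer halves the spatial dimensions, so the model below goes 32x32 -> 16x16 -> 8x8 -> 4x4 before flattening, leaving 4 * 4 * 128 = 2048 features for the fully connected layers.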

In [77]:
def conv_net(x, keep_prob):
    """
    Create a convolutional neural network model
    : x: Placeholder tensor that holds image data.
    : keep_prob: Placeholder tensor that holds dropout keep probability.
    : return: Tensor that represents logits
    """

    net = conv2d_maxpool(x, 64, (5, 5), (1, 1), (3, 3), (2, 2))
    net = conv2d_maxpool(net, 128, (3, 3), (1, 1), (3, 3), (2, 2))
    net = conv2d_maxpool(net, 128, (2, 2), (1, 1), (2, 2), (2, 2))
    net = tf.nn.dropout(net, keep_prob)
    
    
    net = flatten(net)

    net = fully_conn(net, 512)
    net = tf.nn.dropout(net, keep_prob)
    net = fully_conn(net, 1024)
    net = tf.nn.dropout(net, keep_prob)

    net = output(net, 10)

    return net


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""

##############################
## Build the Neural Network ##
##############################

# Remove previous weights, bias, inputs, etc..
tf.reset_default_graph()

# Inputs
x = neural_net_image_input((32, 32, 3))
y = neural_net_label_input(10)
keep_prob = neural_net_keep_prob_input()

# Model
logits = conv_net(x, keep_prob)

# Name logits Tensor, so that it can be loaded from disk after training
logits = tf.identity(logits, name='logits')

# Loss and Optimizer
cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=y))
optimizer = tf.train.AdamOptimizer().minimize(cost)

# Accuracy
correct_pred = tf.equal(tf.argmax(logits, 1), tf.argmax(y, 1))
accuracy = tf.reduce_mean(tf.cast(correct_pred, tf.float32), name='accuracy')

tests.test_conv_net(conv_net)


Neural Network Built!

Train the Neural Network

Single Optimization

Implement the function train_neural_network to do a single optimization. The optimization should use optimizer to optimize in session with a feed_dict of the following:

  • x for image input
  • y for labels
  • keep_prob for keep probability for dropout

This function will be called for each batch, so tf.global_variables_initializer() has already been called.

Note: Nothing needs to be returned. This function is only optimizing the neural network.


In [78]:
def train_neural_network(session, optimizer, keep_probability, feature_batch, label_batch):
    """
    Optimize the session on a batch of images and labels
    : session: Current TensorFlow session
    : optimizer: TensorFlow optimizer function
    : keep_probability: keep probability
    : feature_batch: Batch of Numpy image data
    : label_batch: Batch of Numpy label data
    """
    session.run(optimizer, feed_dict={x: feature_batch, y: label_batch, keep_prob: keep_probability})

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_train_nn(train_neural_network)


Tests Passed

Show Stats

Implement the function print_stats to print loss and validation accuracy. Use the global variables valid_features and valid_labels to calculate validation accuracy. Use a keep probability of 1.0 to calculate the loss and validation accuracy.


In [79]:
def print_stats(session, feature_batch, label_batch, cost, accuracy):
    """
    Print information about loss and validation accuracy
    : session: Current TensorFlow session
    : feature_batch: Batch of Numpy image data
    : label_batch: Batch of Numpy label data
    : cost: TensorFlow cost function
    : accuracy: TensorFlow accuracy function
    """
    loss           = session.run(cost, feed_dict={x: feature_batch, y: label_batch, keep_prob: 1})
    validation_acc = session.run(accuracy, feed_dict={x: valid_features, y: valid_labels, keep_prob: 1})
    print('Loss:', "%.12f" % loss,'Validation Accuracy: ',"%.12f" % (validation_acc))

Hyperparameters

Tune the following parameters:

  • Set epochs to the number of iterations until the network stops learning or starts overfitting
  • Set batch_size to the highest number that your machine has memory for. Most people pick a common size such as:
    • 64
    • 128
    • 256
    • ...
  • Set keep_probability to the probability of keeping a node using dropout

In [82]:
# TODO: Tune Parameters
epochs = 200
batch_size = 300
keep_probability = 0.5

Train on a Single CIFAR-10 Batch

Instead of training the neural network on all the CIFAR-10 batches of data, let's use a single batch. This should save time while you iterate on the model to get a better accuracy. Once the final validation accuracy is 50% or greater, run the model on all the data in the next section.


In [83]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
print('Checking the Training on a Single Batch...')
with tf.Session() as sess:
    # Initializing the variables
    sess.run(tf.global_variables_initializer())
    
    # Training cycle
    for epoch in range(epochs):
        batch_i = 1
        for batch_features, batch_labels in helper.load_preprocess_training_batch(batch_i, batch_size):
            train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)
        print('Epoch {:>2}, CIFAR-10 Batch {}:  '.format(epoch + 1, batch_i), end='')
        print_stats(sess, batch_features, batch_labels, cost, accuracy)


Checking the Training on a Single Batch...
Epoch  1, CIFAR-10 Batch 1:  Loss: 2.277827978134 Validation Accuracy:  0.163599982858
Epoch  2, CIFAR-10 Batch 1:  Loss: 2.185697078705 Validation Accuracy:  0.226799994707
Epoch  3, CIFAR-10 Batch 1:  Loss: 2.110402107239 Validation Accuracy:  0.264999985695
Epoch  4, CIFAR-10 Batch 1:  Loss: 2.019529342651 Validation Accuracy:  0.304999977350
Epoch  5, CIFAR-10 Batch 1:  Loss: 1.917164564133 Validation Accuracy:  0.356000006199
Epoch  6, CIFAR-10 Batch 1:  Loss: 1.862139225006 Validation Accuracy:  0.350199997425
Epoch  7, CIFAR-10 Batch 1:  Loss: 1.799019575119 Validation Accuracy:  0.364399999380
Epoch  8, CIFAR-10 Batch 1:  Loss: 1.715746760368 Validation Accuracy:  0.397399991751
Epoch  9, CIFAR-10 Batch 1:  Loss: 1.703898191452 Validation Accuracy:  0.400999963284
Epoch 10, CIFAR-10 Batch 1:  Loss: 1.639391422272 Validation Accuracy:  0.418799996376
Epoch 11, CIFAR-10 Batch 1:  Loss: 1.616883277893 Validation Accuracy:  0.421400010586
Epoch 12, CIFAR-10 Batch 1:  Loss: 1.538308143616 Validation Accuracy:  0.455399960279
Epoch 13, CIFAR-10 Batch 1:  Loss: 1.500655412674 Validation Accuracy:  0.462999969721
Epoch 14, CIFAR-10 Batch 1:  Loss: 1.513589382172 Validation Accuracy:  0.461399942636
Epoch 15, CIFAR-10 Batch 1:  Loss: 1.475956797600 Validation Accuracy:  0.469399958849
Epoch 16, CIFAR-10 Batch 1:  Loss: 1.424783706665 Validation Accuracy:  0.479399949312
Epoch 17, CIFAR-10 Batch 1:  Loss: 1.416352152824 Validation Accuracy:  0.484399974346
Epoch 18, CIFAR-10 Batch 1:  Loss: 1.423943996429 Validation Accuracy:  0.481999993324
Epoch 19, CIFAR-10 Batch 1:  Loss: 1.364701628685 Validation Accuracy:  0.503199934959
Epoch 20, CIFAR-10 Batch 1:  Loss: 1.378351688385 Validation Accuracy:  0.484599947929
Epoch 21, CIFAR-10 Batch 1:  Loss: 1.332377791405 Validation Accuracy:  0.507999956608
Epoch 22, CIFAR-10 Batch 1:  Loss: 1.292882919312 Validation Accuracy:  0.509599924088
Epoch 23, CIFAR-10 Batch 1:  Loss: 1.265131235123 Validation Accuracy:  0.522399902344
Epoch 24, CIFAR-10 Batch 1:  Loss: 1.247468709946 Validation Accuracy:  0.520599961281
Epoch 25, CIFAR-10 Batch 1:  Loss: 1.233933448792 Validation Accuracy:  0.525999903679
Epoch 26, CIFAR-10 Batch 1:  Loss: 1.231987833977 Validation Accuracy:  0.534199953079
Epoch 27, CIFAR-10 Batch 1:  Loss: 1.241397023201 Validation Accuracy:  0.531399905682
Epoch 28, CIFAR-10 Batch 1:  Loss: 1.178035616875 Validation Accuracy:  0.539199948311
Epoch 29, CIFAR-10 Batch 1:  Loss: 1.186913013458 Validation Accuracy:  0.536599934101
Epoch 30, CIFAR-10 Batch 1:  Loss: 1.114134907722 Validation Accuracy:  0.552799940109
Epoch 31, CIFAR-10 Batch 1:  Loss: 1.079064369202 Validation Accuracy:  0.553199946880
Epoch 32, CIFAR-10 Batch 1:  Loss: 1.132760405540 Validation Accuracy:  0.547799944878
Epoch 33, CIFAR-10 Batch 1:  Loss: 1.078297376633 Validation Accuracy:  0.556599915028
Epoch 34, CIFAR-10 Batch 1:  Loss: 1.076505541801 Validation Accuracy:  0.560799956322
Epoch 35, CIFAR-10 Batch 1:  Loss: 1.051599144936 Validation Accuracy:  0.559399962425
Epoch 36, CIFAR-10 Batch 1:  Loss: 0.990880429745 Validation Accuracy:  0.573399960995
Epoch 37, CIFAR-10 Batch 1:  Loss: 1.032642602921 Validation Accuracy:  0.563199877739
Epoch 38, CIFAR-10 Batch 1:  Loss: 1.023866295815 Validation Accuracy:  0.559199929237
Epoch 39, CIFAR-10 Batch 1:  Loss: 0.995983004570 Validation Accuracy:  0.566599905491
Epoch 40, CIFAR-10 Batch 1:  Loss: 0.971346855164 Validation Accuracy:  0.577799916267
Epoch 41, CIFAR-10 Batch 1:  Loss: 0.937937438488 Validation Accuracy:  0.578399896622
Epoch 42, CIFAR-10 Batch 1:  Loss: 0.906470060349 Validation Accuracy:  0.580999970436
Epoch 43, CIFAR-10 Batch 1:  Loss: 0.885336577892 Validation Accuracy:  0.569799900055
Epoch 44, CIFAR-10 Batch 1:  Loss: 0.891027212143 Validation Accuracy:  0.576599955559
Epoch 45, CIFAR-10 Batch 1:  Loss: 0.833446979523 Validation Accuracy:  0.584999978542
Epoch 46, CIFAR-10 Batch 1:  Loss: 0.823576748371 Validation Accuracy:  0.590999901295
Epoch 47, CIFAR-10 Batch 1:  Loss: 0.808272242546 Validation Accuracy:  0.590399920940
Epoch 48, CIFAR-10 Batch 1:  Loss: 0.791549623013 Validation Accuracy:  0.589999914169
Epoch 49, CIFAR-10 Batch 1:  Loss: 0.831613779068 Validation Accuracy:  0.573799967766
Epoch 50, CIFAR-10 Batch 1:  Loss: 0.795499444008 Validation Accuracy:  0.570399940014
Epoch 51, CIFAR-10 Batch 1:  Loss: 0.762686550617 Validation Accuracy:  0.594399929047
Epoch 52, CIFAR-10 Batch 1:  Loss: 0.733068048954 Validation Accuracy:  0.598199844360
Epoch 53, CIFAR-10 Batch 1:  Loss: 0.698002636433 Validation Accuracy:  0.597999930382
Epoch 54, CIFAR-10 Batch 1:  Loss: 0.697779119015 Validation Accuracy:  0.598799884319
Epoch 55, CIFAR-10 Batch 1:  Loss: 0.724438309669 Validation Accuracy:  0.587999880314
Epoch 56, CIFAR-10 Batch 1:  Loss: 0.733010649681 Validation Accuracy:  0.590799927711
Epoch 57, CIFAR-10 Batch 1:  Loss: 0.714329123497 Validation Accuracy:  0.601399898529
Epoch 58, CIFAR-10 Batch 1:  Loss: 0.663603663445 Validation Accuracy:  0.611799895763
Epoch 59, CIFAR-10 Batch 1:  Loss: 0.656442403793 Validation Accuracy:  0.603799819946
Epoch 60, CIFAR-10 Batch 1:  Loss: 0.643716037273 Validation Accuracy:  0.601199924946
Epoch 61, CIFAR-10 Batch 1:  Loss: 0.585990905762 Validation Accuracy:  0.611199915409
Epoch 62, CIFAR-10 Batch 1:  Loss: 0.615168869495 Validation Accuracy:  0.611599922180
Epoch 63, CIFAR-10 Batch 1:  Loss: 0.628502190113 Validation Accuracy:  0.608999907970
Epoch 64, CIFAR-10 Batch 1:  Loss: 0.607718884945 Validation Accuracy:  0.607799947262
Epoch 65, CIFAR-10 Batch 1:  Loss: 0.554849386215 Validation Accuracy:  0.626199960709
Epoch 66, CIFAR-10 Batch 1:  Loss: 0.552200436592 Validation Accuracy:  0.619199931622
Epoch 67, CIFAR-10 Batch 1:  Loss: 0.515510559082 Validation Accuracy:  0.622799932957
Epoch 68, CIFAR-10 Batch 1:  Loss: 0.599810302258 Validation Accuracy:  0.599599957466
Epoch 69, CIFAR-10 Batch 1:  Loss: 0.511920750141 Validation Accuracy:  0.628999948502
Epoch 70, CIFAR-10 Batch 1:  Loss: 0.538960933685 Validation Accuracy:  0.613999962807
Epoch 71, CIFAR-10 Batch 1:  Loss: 0.518139302731 Validation Accuracy:  0.620399892330
Epoch 72, CIFAR-10 Batch 1:  Loss: 0.441350579262 Validation Accuracy:  0.631799936295
Epoch 73, CIFAR-10 Batch 1:  Loss: 0.455220758915 Validation Accuracy:  0.622399926186
Epoch 74, CIFAR-10 Batch 1:  Loss: 0.467731893063 Validation Accuracy:  0.628399968147
Epoch 75, CIFAR-10 Batch 1:  Loss: 0.414612710476 Validation Accuracy:  0.629599869251
Epoch 76, CIFAR-10 Batch 1:  Loss: 0.401874899864 Validation Accuracy:  0.634599924088
Epoch 77, CIFAR-10 Batch 1:  Loss: 0.446956604719 Validation Accuracy:  0.621199905872
Epoch 78, CIFAR-10 Batch 1:  Loss: 0.401334464550 Validation Accuracy:  0.633999884129
Epoch 79, CIFAR-10 Batch 1:  Loss: 0.428744852543 Validation Accuracy:  0.624999940395
Epoch 80, CIFAR-10 Batch 1:  Loss: 0.376083463430 Validation Accuracy:  0.630199909210
Epoch 81, CIFAR-10 Batch 1:  Loss: 0.383894383907 Validation Accuracy:  0.624399900436
Epoch 82, CIFAR-10 Batch 1:  Loss: 0.370399594307 Validation Accuracy:  0.629999876022
Epoch 83, CIFAR-10 Batch 1:  Loss: 0.340559244156 Validation Accuracy:  0.636399924755
Epoch 84, CIFAR-10 Batch 1:  Loss: 0.343039423227 Validation Accuracy:  0.629399895668
Epoch 85, CIFAR-10 Batch 1:  Loss: 0.319438517094 Validation Accuracy:  0.631399929523
Epoch 86, CIFAR-10 Batch 1:  Loss: 0.349270045757 Validation Accuracy:  0.627799868584
Epoch 87, CIFAR-10 Batch 1:  Loss: 0.319135516882 Validation Accuracy:  0.626399934292
Epoch 88, CIFAR-10 Batch 1:  Loss: 0.291644513607 Validation Accuracy:  0.637999892235
Epoch 89, CIFAR-10 Batch 1:  Loss: 0.286276578903 Validation Accuracy:  0.638199925423
Epoch 90, CIFAR-10 Batch 1:  Loss: 0.288961708546 Validation Accuracy:  0.631199896336
Epoch 91, CIFAR-10 Batch 1:  Loss: 0.289911955595 Validation Accuracy:  0.636599898338
Epoch 92, CIFAR-10 Batch 1:  Loss: 0.284311473370 Validation Accuracy:  0.624399900436
Epoch 93, CIFAR-10 Batch 1:  Loss: 0.270060420036 Validation Accuracy:  0.632799923420
Epoch 94, CIFAR-10 Batch 1:  Loss: 0.296497076750 Validation Accuracy:  0.626599907875
Epoch 95, CIFAR-10 Batch 1:  Loss: 0.261838138103 Validation Accuracy:  0.633599877357
Epoch 96, CIFAR-10 Batch 1:  Loss: 0.240870803595 Validation Accuracy:  0.638999879360
Epoch 97, CIFAR-10 Batch 1:  Loss: 0.243485778570 Validation Accuracy:  0.633999943733
Epoch 98, CIFAR-10 Batch 1:  Loss: 0.241105943918 Validation Accuracy:  0.627399921417
Epoch 99, CIFAR-10 Batch 1:  Loss: 0.239797741175 Validation Accuracy:  0.630399882793
Epoch 100, CIFAR-10 Batch 1:  Loss: 0.212632611394 Validation Accuracy:  0.647199928761
Epoch 101, CIFAR-10 Batch 1:  Loss: 0.225571647286 Validation Accuracy:  0.636599898338
Epoch 102, CIFAR-10 Batch 1:  Loss: 0.203853413463 Validation Accuracy:  0.632399916649
Epoch 103, CIFAR-10 Batch 1:  Loss: 0.223492011428 Validation Accuracy:  0.628999888897
Epoch 104, CIFAR-10 Batch 1:  Loss: 0.198619365692 Validation Accuracy:  0.632999956608
Epoch 105, CIFAR-10 Batch 1:  Loss: 0.200589209795 Validation Accuracy:  0.638799905777
Epoch 106, CIFAR-10 Batch 1:  Loss: 0.191073670983 Validation Accuracy:  0.640199899673
Epoch 107, CIFAR-10 Batch 1:  Loss: 0.194081291556 Validation Accuracy:  0.625599861145
Epoch 108, CIFAR-10 Batch 1:  Loss: 0.179161354899 Validation Accuracy:  0.635999917984
Epoch 109, CIFAR-10 Batch 1:  Loss: 0.204759508371 Validation Accuracy:  0.630799949169
Epoch 110, CIFAR-10 Batch 1:  Loss: 0.156247794628 Validation Accuracy:  0.639399886131
Epoch 111, CIFAR-10 Batch 1:  Loss: 0.163065239787 Validation Accuracy:  0.640999913216
Epoch 112, CIFAR-10 Batch 1:  Loss: 0.189815431833 Validation Accuracy:  0.625799894333
Epoch 113, CIFAR-10 Batch 1:  Loss: 0.175352126360 Validation Accuracy:  0.636999905109
Epoch 114, CIFAR-10 Batch 1:  Loss: 0.160932764411 Validation Accuracy:  0.637199938297
Epoch 115, CIFAR-10 Batch 1:  Loss: 0.147526472807 Validation Accuracy:  0.635599911213
Epoch 116, CIFAR-10 Batch 1:  Loss: 0.148957550526 Validation Accuracy:  0.622199892998
Epoch 117, CIFAR-10 Batch 1:  Loss: 0.138601362705 Validation Accuracy:  0.637599825859
Epoch 118, CIFAR-10 Batch 1:  Loss: 0.129868596792 Validation Accuracy:  0.626599907875
Epoch 119, CIFAR-10 Batch 1:  Loss: 0.138098657131 Validation Accuracy:  0.635799944401
Epoch 120, CIFAR-10 Batch 1:  Loss: 0.150127097964 Validation Accuracy:  0.635999917984
Epoch 121, CIFAR-10 Batch 1:  Loss: 0.124429613352 Validation Accuracy:  0.637399911880
Epoch 122, CIFAR-10 Batch 1:  Loss: 0.130800962448 Validation Accuracy:  0.640999972820
Epoch 123, CIFAR-10 Batch 1:  Loss: 0.131523802876 Validation Accuracy:  0.640399932861
Epoch 124, CIFAR-10 Batch 1:  Loss: 0.122954353690 Validation Accuracy:  0.649199903011
Epoch 125, CIFAR-10 Batch 1:  Loss: 0.097235687077 Validation Accuracy:  0.646399915218
Epoch 126, CIFAR-10 Batch 1:  Loss: 0.098432198167 Validation Accuracy:  0.636799931526
Epoch 127, CIFAR-10 Batch 1:  Loss: 0.124148517847 Validation Accuracy:  0.638399839401
Epoch 128, CIFAR-10 Batch 1:  Loss: 0.108373895288 Validation Accuracy:  0.646199882030
Epoch 129, CIFAR-10 Batch 1:  Loss: 0.102567330003 Validation Accuracy:  0.645799875259
Epoch 130, CIFAR-10 Batch 1:  Loss: 0.084180489182 Validation Accuracy:  0.645999848843
Epoch 131, CIFAR-10 Batch 1:  Loss: 0.076597012579 Validation Accuracy:  0.649199962616
Epoch 132, CIFAR-10 Batch 1:  Loss: 0.079981550574 Validation Accuracy:  0.646799921989
Epoch 133, CIFAR-10 Batch 1:  Loss: 0.072555750608 Validation Accuracy:  0.646399855614
Epoch 134, CIFAR-10 Batch 1:  Loss: 0.100577868521 Validation Accuracy:  0.639999866486
Epoch 135, CIFAR-10 Batch 1:  Loss: 0.088636115193 Validation Accuracy:  0.644799947739
Epoch 136, CIFAR-10 Batch 1:  Loss: 0.095371648669 Validation Accuracy:  0.641199946404
Epoch 137, CIFAR-10 Batch 1:  Loss: 0.077361680567 Validation Accuracy:  0.642399907112
Epoch 138, CIFAR-10 Batch 1:  Loss: 0.071356728673 Validation Accuracy:  0.635999858379
Epoch 139, CIFAR-10 Batch 1:  Loss: 0.070774272084 Validation Accuracy:  0.642999887466
Epoch 140, CIFAR-10 Batch 1:  Loss: 0.073729485273 Validation Accuracy:  0.637999892235
Epoch 141, CIFAR-10 Batch 1:  Loss: 0.077184036374 Validation Accuracy:  0.632399916649
Epoch 142, CIFAR-10 Batch 1:  Loss: 0.085913330317 Validation Accuracy:  0.647799849510
Epoch 143, CIFAR-10 Batch 1:  Loss: 0.068347036839 Validation Accuracy:  0.627999901772
Epoch 144, CIFAR-10 Batch 1:  Loss: 0.062593728304 Validation Accuracy:  0.637599885464
Epoch 145, CIFAR-10 Batch 1:  Loss: 0.055745199323 Validation Accuracy:  0.640399873257
Epoch 146, CIFAR-10 Batch 1:  Loss: 0.043993815780 Validation Accuracy:  0.644599914551
Epoch 147, CIFAR-10 Batch 1:  Loss: 0.072531446815 Validation Accuracy:  0.630199849606
Epoch 148, CIFAR-10 Batch 1:  Loss: 0.061888821423 Validation Accuracy:  0.627199947834
Epoch 149, CIFAR-10 Batch 1:  Loss: 0.057701304555 Validation Accuracy:  0.629799842834
Epoch 150, CIFAR-10 Batch 1:  Loss: 0.061230428517 Validation Accuracy:  0.628199934959
Epoch 151, CIFAR-10 Batch 1:  Loss: 0.052004113793 Validation Accuracy:  0.640999853611
Epoch 152, CIFAR-10 Batch 1:  Loss: 0.047175817192 Validation Accuracy:  0.643799841404
Epoch 153, CIFAR-10 Batch 1:  Loss: 0.039681036025 Validation Accuracy:  0.640999853611
Epoch 154, CIFAR-10 Batch 1:  Loss: 0.049577828497 Validation Accuracy:  0.632399916649
Epoch 155, CIFAR-10 Batch 1:  Loss: 0.036962199956 Validation Accuracy:  0.646399915218
Epoch 156, CIFAR-10 Batch 1:  Loss: 0.044233195484 Validation Accuracy:  0.638399899006
Epoch 157, CIFAR-10 Batch 1:  Loss: 0.043263137341 Validation Accuracy:  0.637999892235
Epoch 158, CIFAR-10 Batch 1:  Loss: 0.040880899876 Validation Accuracy:  0.636199891567
Epoch 159, CIFAR-10 Batch 1:  Loss: 0.037370998412 Validation Accuracy:  0.642999887466
Epoch 160, CIFAR-10 Batch 1:  Loss: 0.043514795601 Validation Accuracy:  0.647999882698
Epoch 161, CIFAR-10 Batch 1:  Loss: 0.035453949124 Validation Accuracy:  0.642999827862
Epoch 162, CIFAR-10 Batch 1:  Loss: 0.045587658882 Validation Accuracy:  0.637399911880
Epoch 163, CIFAR-10 Batch 1:  Loss: 0.037834968418 Validation Accuracy:  0.641399860382
Epoch 164, CIFAR-10 Batch 1:  Loss: 0.029362358153 Validation Accuracy:  0.648599922657
Epoch 165, CIFAR-10 Batch 1:  Loss: 0.031009254977 Validation Accuracy:  0.640599906445
Epoch 166, CIFAR-10 Batch 1:  Loss: 0.037879336625 Validation Accuracy:  0.634199917316
Epoch 167, CIFAR-10 Batch 1:  Loss: 0.039432294667 Validation Accuracy:  0.640399932861
Epoch 168, CIFAR-10 Batch 1:  Loss: 0.026009880006 Validation Accuracy:  0.636399865150
Epoch 169, CIFAR-10 Batch 1:  Loss: 0.028575845063 Validation Accuracy:  0.642399847507
Epoch 170, CIFAR-10 Batch 1:  Loss: 0.034617062658 Validation Accuracy:  0.648799896240
Epoch 171, CIFAR-10 Batch 1:  Loss: 0.036434445530 Validation Accuracy:  0.639999926090
Epoch 172, CIFAR-10 Batch 1:  Loss: 0.028353631496 Validation Accuracy:  0.631399929523
Epoch 173, CIFAR-10 Batch 1:  Loss: 0.029510181397 Validation Accuracy:  0.641799926758
Epoch 174, CIFAR-10 Batch 1:  Loss: 0.026453666389 Validation Accuracy:  0.636199891567
Epoch 175, CIFAR-10 Batch 1:  Loss: 0.026797231287 Validation Accuracy:  0.639399886131
Epoch 176, CIFAR-10 Batch 1:  Loss: 0.037223942578 Validation Accuracy:  0.649999916553
Epoch 177, CIFAR-10 Batch 1:  Loss: 0.031708784401 Validation Accuracy:  0.640999913216
Epoch 178, CIFAR-10 Batch 1:  Loss: 0.031505286694 Validation Accuracy:  0.636399865150
Epoch 179, CIFAR-10 Batch 1:  Loss: 0.023675017059 Validation Accuracy:  0.638399899006
Epoch 180, CIFAR-10 Batch 1:  Loss: 0.025420922786 Validation Accuracy:  0.643599867821
Epoch 181, CIFAR-10 Batch 1:  Loss: 0.030086964369 Validation Accuracy:  0.638399899006
Epoch 182, CIFAR-10 Batch 1:  Loss: 0.027017189190 Validation Accuracy:  0.649399876595
Epoch 183, CIFAR-10 Batch 1:  Loss: 0.022553581744 Validation Accuracy:  0.645599901676
Epoch 184, CIFAR-10 Batch 1:  Loss: 0.023203127086 Validation Accuracy:  0.646999895573
Epoch 185, CIFAR-10 Batch 1:  Loss: 0.021133799106 Validation Accuracy:  0.649999856949
Epoch 186, CIFAR-10 Batch 1:  Loss: 0.021468723193 Validation Accuracy:  0.645599901676
Epoch 187, CIFAR-10 Batch 1:  Loss: 0.022129382938 Validation Accuracy:  0.641799867153
Epoch 188, CIFAR-10 Batch 1:  Loss: 0.025054179132 Validation Accuracy:  0.637399852276
Epoch 189, CIFAR-10 Batch 1:  Loss: 0.016730416566 Validation Accuracy:  0.645999908447
Epoch 190, CIFAR-10 Batch 1:  Loss: 0.024481043220 Validation Accuracy:  0.651799857616
Epoch 191, CIFAR-10 Batch 1:  Loss: 0.019564911723 Validation Accuracy:  0.642799854279
Epoch 192, CIFAR-10 Batch 1:  Loss: 0.016990596429 Validation Accuracy:  0.650599896908
Epoch 193, CIFAR-10 Batch 1:  Loss: 0.023537382483 Validation Accuracy:  0.640199899673
Epoch 194, CIFAR-10 Batch 1:  Loss: 0.021219838411 Validation Accuracy:  0.649199903011
Epoch 195, CIFAR-10 Batch 1:  Loss: 0.017503134906 Validation Accuracy:  0.648199856281
Epoch 196, CIFAR-10 Batch 1:  Loss: 0.012322116643 Validation Accuracy:  0.652399837971
Epoch 197, CIFAR-10 Batch 1:  Loss: 0.013115489855 Validation Accuracy:  0.646999835968
Epoch 198, CIFAR-10 Batch 1:  Loss: 0.015028360300 Validation Accuracy:  0.648399889469
Epoch 199, CIFAR-10 Batch 1:  Loss: 0.015583045781 Validation Accuracy:  0.646999955177
Epoch 200, CIFAR-10 Batch 1:  Loss: 0.014725737274 Validation Accuracy:  0.645199894905

Fully Train the Model

Now that you've got good accuracy with a single CIFAR-10 batch, try it with all five batches.


In [ ]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
save_model_path = './image_classification'

print('Training...')
with tf.Session() as sess:
    # Initializing the variables
    sess.run(tf.global_variables_initializer())
    
    # Training cycle
    for epoch in range(epochs):
        # Loop over all batches
        n_batches = 5
        for batch_i in range(1, n_batches + 1):
            for batch_features, batch_labels in helper.load_preprocess_training_batch(batch_i, batch_size):
                train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)
            print('Epoch {:>2}, CIFAR-10 Batch {}:  '.format(epoch + 1, batch_i), end='')
            print_stats(sess, batch_features, batch_labels, cost, accuracy)
            
    # Save Model
    saver = tf.train.Saver()
    save_path = saver.save(sess, save_model_path)


Training...
Epoch  1, CIFAR-10 Batch 1:  Loss: 2.281920194626 Validation Accuracy:  0.137199997902
Epoch  1, CIFAR-10 Batch 2:  Loss: 2.222514390945 Validation Accuracy:  0.251199990511
Epoch  1, CIFAR-10 Batch 3:  Loss: 2.035071611404 Validation Accuracy:  0.284399986267
Epoch  1, CIFAR-10 Batch 4:  Loss: 1.910905599594 Validation Accuracy:  0.268999993801
Epoch  1, CIFAR-10 Batch 5:  Loss: 1.837851524353 Validation Accuracy:  0.339599967003
Epoch  2, CIFAR-10 Batch 1:  Loss: 1.844520807266 Validation Accuracy:  0.370599985123
Epoch  2, CIFAR-10 Batch 2:  Loss: 1.735260725021 Validation Accuracy:  0.384199947119
Epoch  2, CIFAR-10 Batch 3:  Loss: 1.610825657845 Validation Accuracy:  0.374799966812
Epoch  2, CIFAR-10 Batch 4:  Loss: 1.549785375595 Validation Accuracy:  0.425199925900
Epoch  2, CIFAR-10 Batch 5:  Loss: 1.585264205933 Validation Accuracy:  0.427999973297
Epoch  3, CIFAR-10 Batch 1:  Loss: 1.679112195969 Validation Accuracy:  0.435599923134
Epoch  3, CIFAR-10 Batch 2:  Loss: 1.535599827766 Validation Accuracy:  0.454600006342
Epoch  3, CIFAR-10 Batch 3:  Loss: 1.396819353104 Validation Accuracy:  0.444199919701
Epoch  3, CIFAR-10 Batch 4:  Loss: 1.409455299377 Validation Accuracy:  0.467199951410
Epoch  3, CIFAR-10 Batch 5:  Loss: 1.467997908592 Validation Accuracy:  0.451599985361
Epoch  4, CIFAR-10 Batch 1:  Loss: 1.540534377098 Validation Accuracy:  0.479399949312
Epoch  4, CIFAR-10 Batch 2:  Loss: 1.476257324219 Validation Accuracy:  0.453399956226
Epoch  4, CIFAR-10 Batch 3:  Loss: 1.352427244186 Validation Accuracy:  0.458399981260
Epoch  4, CIFAR-10 Batch 4:  Loss: 1.331018805504 Validation Accuracy:  0.501199960709
Epoch  4, CIFAR-10 Batch 5:  Loss: 1.386432170868 Validation Accuracy:  0.470599949360
Epoch  5, CIFAR-10 Batch 1:  Loss: 1.447888493538 Validation Accuracy:  0.511999964714
Epoch  5, CIFAR-10 Batch 2:  Loss: 1.366274356842 Validation Accuracy:  0.505199968815
Epoch  5, CIFAR-10 Batch 3:  Loss: 1.243806719780 Validation Accuracy:  0.510999977589
Epoch  5, CIFAR-10 Batch 4:  Loss: 1.248592853546 Validation Accuracy:  0.530399918556
Epoch  5, CIFAR-10 Batch 5:  Loss: 1.333970069885 Validation Accuracy:  0.496399939060
Epoch  6, CIFAR-10 Batch 1:  Loss: 1.368748664856 Validation Accuracy:  0.539799928665
Epoch  6, CIFAR-10 Batch 2:  Loss: 1.292823076248 Validation Accuracy:  0.526399910450
Epoch  6, CIFAR-10 Batch 3:  Loss: 1.182191491127 Validation Accuracy:  0.543199956417
Epoch  6, CIFAR-10 Batch 4:  Loss: 1.213813424110 Validation Accuracy:  0.550399899483
Epoch  6, CIFAR-10 Batch 5:  Loss: 1.244578361511 Validation Accuracy:  0.542799949646
Epoch  7, CIFAR-10 Batch 1:  Loss: 1.324422717094 Validation Accuracy:  0.542199969292
Epoch  7, CIFAR-10 Batch 2:  Loss: 1.244325876236 Validation Accuracy:  0.542999923229
Epoch  7, CIFAR-10 Batch 3:  Loss: 1.174799442291 Validation Accuracy:  0.555599868298
Epoch  7, CIFAR-10 Batch 4:  Loss: 1.159662723541 Validation Accuracy:  0.565599918365
Epoch  7, CIFAR-10 Batch 5:  Loss: 1.157836079597 Validation Accuracy:  0.558599948883
Epoch  8, CIFAR-10 Batch 1:  Loss: 1.262043118477 Validation Accuracy:  0.567599952221
Epoch  8, CIFAR-10 Batch 2:  Loss: 1.197713732719 Validation Accuracy:  0.560599923134
Epoch  8, CIFAR-10 Batch 3:  Loss: 1.103801727295 Validation Accuracy:  0.572199940681
Epoch  8, CIFAR-10 Batch 4:  Loss: 1.093108057976 Validation Accuracy:  0.589199900627
Epoch  8, CIFAR-10 Batch 5:  Loss: 1.125655651093 Validation Accuracy:  0.576999902725
Epoch  9, CIFAR-10 Batch 1:  Loss: 1.200019359589 Validation Accuracy:  0.587799966335
Epoch  9, CIFAR-10 Batch 2:  Loss: 1.152778029442 Validation Accuracy:  0.593399941921
Epoch  9, CIFAR-10 Batch 3:  Loss: 1.075111746788 Validation Accuracy:  0.588799953461
Epoch  9, CIFAR-10 Batch 4:  Loss: 1.072599291801 Validation Accuracy:  0.598599910736
Epoch  9, CIFAR-10 Batch 5:  Loss: 1.078730583191 Validation Accuracy:  0.589599907398
Epoch 10, CIFAR-10 Batch 1:  Loss: 1.150241017342 Validation Accuracy:  0.610399961472
Epoch 10, CIFAR-10 Batch 2:  Loss: 1.103882908821 Validation Accuracy:  0.607599854469
Epoch 10, CIFAR-10 Batch 3:  Loss: 1.029607057571 Validation Accuracy:  0.594999909401
Epoch 10, CIFAR-10 Batch 4:  Loss: 0.997128546238 Validation Accuracy:  0.628999888897
Epoch 10, CIFAR-10 Batch 5:  Loss: 1.048191189766 Validation Accuracy:  0.606399893761
Epoch 11, CIFAR-10 Batch 1:  Loss: 1.109799265862 Validation Accuracy:  0.632999897003
Epoch 11, CIFAR-10 Batch 2:  Loss: 1.081212997437 Validation Accuracy:  0.618199944496
Epoch 11, CIFAR-10 Batch 3:  Loss: 0.946589589119 Validation Accuracy:  0.625999867916
Epoch 11, CIFAR-10 Batch 4:  Loss: 0.987298130989 Validation Accuracy:  0.639799952507
Epoch 11, CIFAR-10 Batch 5:  Loss: 0.990449726582 Validation Accuracy:  0.630599915981
Epoch 12, CIFAR-10 Batch 1:  Loss: 1.048164606094 Validation Accuracy:  0.617799878120
Epoch 12, CIFAR-10 Batch 2:  Loss: 1.051651358604 Validation Accuracy:  0.623399913311
Epoch 12, CIFAR-10 Batch 3:  Loss: 0.955022633076 Validation Accuracy:  0.627999901772
Epoch 12, CIFAR-10 Batch 4:  Loss: 0.933081388474 Validation Accuracy:  0.644799888134
Epoch 12, CIFAR-10 Batch 5:  Loss: 0.958672821522 Validation Accuracy:  0.627799928188
Epoch 13, CIFAR-10 Batch 1:  Loss: 1.000002145767 Validation Accuracy:  0.649199843407
Epoch 13, CIFAR-10 Batch 2:  Loss: 0.966631114483 Validation Accuracy:  0.636599838734
Epoch 13, CIFAR-10 Batch 3:  Loss: 0.909235835075 Validation Accuracy:  0.643399894238
Epoch 13, CIFAR-10 Batch 4:  Loss: 0.864212989807 Validation Accuracy:  0.659999907017
Epoch 13, CIFAR-10 Batch 5:  Loss: 0.909489631653 Validation Accuracy:  0.642199873924
Epoch 14, CIFAR-10 Batch 1:  Loss: 0.999900817871 Validation Accuracy:  0.651599884033
Epoch 14, CIFAR-10 Batch 2:  Loss: 0.954944550991 Validation Accuracy:  0.629199862480
Epoch 14, CIFAR-10 Batch 3:  Loss: 0.910835027695 Validation Accuracy:  0.649599909782
Epoch 14, CIFAR-10 Batch 4:  Loss: 0.839717268944 Validation Accuracy:  0.669399857521
Epoch 14, CIFAR-10 Batch 5:  Loss: 0.908296883106 Validation Accuracy:  0.653199911118
Epoch 15, CIFAR-10 Batch 1:  Loss: 0.958137631416 Validation Accuracy:  0.652599871159
Epoch 15, CIFAR-10 Batch 2:  Loss: 0.943035483360 Validation Accuracy:  0.643999934196
Epoch 15, CIFAR-10 Batch 3:  Loss: 0.883057713509 Validation Accuracy:  0.653599917889
Epoch 15, CIFAR-10 Batch 4:  Loss: 0.816228449345 Validation Accuracy:  0.670199871063
Epoch 15, CIFAR-10 Batch 5:  Loss: 0.898570656776 Validation Accuracy:  0.639599919319
Epoch 16, CIFAR-10 Batch 1:  Loss: 0.948793530464 Validation Accuracy:  0.667199909687
Epoch 16, CIFAR-10 Batch 2:  Loss: 0.887528300285 Validation Accuracy:  0.652999877930
Epoch 16, CIFAR-10 Batch 3:  Loss: 0.804571807384 Validation Accuracy:  0.667399823666
Epoch 16, CIFAR-10 Batch 4:  Loss: 0.814435660839 Validation Accuracy:  0.681399822235
Epoch 16, CIFAR-10 Batch 5:  Loss: 0.844196677208 Validation Accuracy:  0.671199858189
Epoch 17, CIFAR-10 Batch 1:  Loss: 0.919843614101 Validation Accuracy:  0.650999844074
Epoch 17, CIFAR-10 Batch 2:  Loss: 0.832394778728 Validation Accuracy:  0.670999884605
Epoch 17, CIFAR-10 Batch 3:  Loss: 0.771405875683 Validation Accuracy:  0.679599881172
Epoch 17, CIFAR-10 Batch 4:  Loss: 0.748958051205 Validation Accuracy:  0.676399886608
Epoch 17, CIFAR-10 Batch 5:  Loss: 0.835650026798 Validation Accuracy:  0.667799830437
Epoch 18, CIFAR-10 Batch 1:  Loss: 0.852792859077 Validation Accuracy:  0.671999871731
Epoch 18, CIFAR-10 Batch 2:  Loss: 0.804411113262 Validation Accuracy:  0.693799853325
Epoch 18, CIFAR-10 Batch 3:  Loss: 0.761398494244 Validation Accuracy:  0.687599837780
Epoch 18, CIFAR-10 Batch 4:  Loss: 0.724012851715 Validation Accuracy:  0.700999855995
Epoch 18, CIFAR-10 Batch 5:  Loss: 0.792735219002 Validation Accuracy:  0.667199850082
Epoch 19, CIFAR-10 Batch 1:  Loss: 0.877308249474 Validation Accuracy:  0.675799846649
Epoch 19, CIFAR-10 Batch 2:  Loss: 0.796249687672 Validation Accuracy:  0.697999835014
Epoch 19, CIFAR-10 Batch 3:  Loss: 0.735916078091 Validation Accuracy:  0.687599837780
Epoch 19, CIFAR-10 Batch 4:  Loss: 0.735692977905 Validation Accuracy:  0.698999881744
Epoch 19, CIFAR-10 Batch 5:  Loss: 0.738967061043 Validation Accuracy:  0.695799887180
Epoch 20, CIFAR-10 Batch 1:  Loss: 0.821990072727 Validation Accuracy:  0.702799916267
Epoch 20, CIFAR-10 Batch 2:  Loss: 0.782245159149 Validation Accuracy:  0.666599929333
Epoch 20, CIFAR-10 Batch 3:  Loss: 0.721034765244 Validation Accuracy:  0.700399875641
Epoch 20, CIFAR-10 Batch 4:  Loss: 0.676852405071 Validation Accuracy:  0.702799856663
Epoch 20, CIFAR-10 Batch 5:  Loss: 0.760889053345 Validation Accuracy:  0.686199843884
Epoch 21, CIFAR-10 Batch 1:  Loss: 0.762805223465 Validation Accuracy:  0.703599870205
Epoch 21, CIFAR-10 Batch 2:  Loss: 0.771231293678 Validation Accuracy:  0.683399856091
Epoch 21, CIFAR-10 Batch 3:  Loss: 0.718527495861 Validation Accuracy:  0.678199887276
Epoch 21, CIFAR-10 Batch 4:  Loss: 0.654804229736 Validation Accuracy:  0.711799860001
Epoch 21, CIFAR-10 Batch 5:  Loss: 0.682554185390 Validation Accuracy:  0.706399917603
Epoch 22, CIFAR-10 Batch 1:  Loss: 0.780891835690 Validation Accuracy:  0.689599812031
Epoch 22, CIFAR-10 Batch 2:  Loss: 0.684993863106 Validation Accuracy:  0.692599892616
Epoch 22, CIFAR-10 Batch 3:  Loss: 0.653582632542 Validation Accuracy:  0.722799956799
Epoch 22, CIFAR-10 Batch 4:  Loss: 0.633037745953 Validation Accuracy:  0.721799910069
Epoch 22, CIFAR-10 Batch 5:  Loss: 0.654444694519 Validation Accuracy:  0.710199832916
Epoch 23, CIFAR-10 Batch 1:  Loss: 0.731940746307 Validation Accuracy:  0.716599822044
Epoch 23, CIFAR-10 Batch 2:  Loss: 0.679237127304 Validation Accuracy:  0.721399843693
Epoch 23, CIFAR-10 Batch 3:  Loss: 0.652090549469 Validation Accuracy:  0.713199853897
Epoch 23, CIFAR-10 Batch 4:  Loss: 0.631657779217 Validation Accuracy:  0.718999922276
Epoch 23, CIFAR-10 Batch 5:  Loss: 0.676335215569 Validation Accuracy:  0.705599844456
Epoch 24, CIFAR-10 Batch 1:  Loss: 0.717787027359 Validation Accuracy:  0.725599825382
Epoch 24, CIFAR-10 Batch 2:  Loss: 0.676185071468 Validation Accuracy:  0.709199845791
Epoch 24, CIFAR-10 Batch 3:  Loss: 0.638724207878 Validation Accuracy:  0.721199870110
Epoch 24, CIFAR-10 Batch 4:  Loss: 0.609054028988 Validation Accuracy:  0.722199797630
Epoch 24, CIFAR-10 Batch 5:  Loss: 0.630760610104 Validation Accuracy:  0.701799809933
Epoch 25, CIFAR-10 Batch 1:  Loss: 0.687442481518 Validation Accuracy:  0.723199844360
Epoch 25, CIFAR-10 Batch 2:  Loss: 0.652903437614 Validation Accuracy:  0.715199887753
Epoch 25, CIFAR-10 Batch 3:  Loss: 0.607273876667 Validation Accuracy:  0.721599817276
Epoch 25, CIFAR-10 Batch 4:  Loss: 0.574273169041 Validation Accuracy:  0.721399843693
Epoch 25, CIFAR-10 Batch 5:  Loss: 0.620008528233 Validation Accuracy:  0.718799829483
Epoch 26, CIFAR-10 Batch 1:  Loss: 0.684665560722 Validation Accuracy:  0.734199881554
Epoch 26, CIFAR-10 Batch 2:  Loss: 0.636759817600 Validation Accuracy:  0.722999870777
Epoch 26, CIFAR-10 Batch 3:  Loss: 0.587698459625 Validation Accuracy:  0.720999836922
Epoch 26, CIFAR-10 Batch 4:  Loss: 0.568066835403 Validation Accuracy:  0.738599896431
Epoch 26, CIFAR-10 Batch 5:  Loss: 0.579418778419 Validation Accuracy:  0.726799786091
Epoch 27, CIFAR-10 Batch 1:  Loss: 0.663402855396 Validation Accuracy:  0.735799849033
Epoch 27, CIFAR-10 Batch 2:  Loss: 0.626467823982 Validation Accuracy:  0.727599859238
Epoch 27, CIFAR-10 Batch 3:  Loss: 0.587232232094 Validation Accuracy:  0.701599836349
Epoch 27, CIFAR-10 Batch 4:  Loss: 0.568675875664 Validation Accuracy:  0.727799773216
Epoch 27, CIFAR-10 Batch 5:  Loss: 0.568577051163 Validation Accuracy:  0.730999827385
Epoch 28, CIFAR-10 Batch 1:  Loss: 0.619148910046 Validation Accuracy:  0.734799861908
Epoch 28, CIFAR-10 Batch 2:  Loss: 0.583574175835 Validation Accuracy:  0.732799887657
Epoch 28, CIFAR-10 Batch 3:  Loss: 0.525461196899 Validation Accuracy:  0.729799866676
Epoch 28, CIFAR-10 Batch 4:  Loss: 0.560906291008 Validation Accuracy:  0.729799866676
Epoch 28, CIFAR-10 Batch 5:  Loss: 0.570439696312 Validation Accuracy:  0.726599872112
Epoch 29, CIFAR-10 Batch 1:  Loss: 0.596171140671 Validation Accuracy:  0.737599849701
Epoch 29, CIFAR-10 Batch 2:  Loss: 0.570169568062 Validation Accuracy:  0.733399868011
Epoch 29, CIFAR-10 Batch 3:  Loss: 0.548944115639 Validation Accuracy:  0.729799866676
Epoch 29, CIFAR-10 Batch 4:  Loss: 0.563496291637 Validation Accuracy:  0.721199810505
Epoch 29, CIFAR-10 Batch 5:  Loss: 0.559679090977 Validation Accuracy:  0.723799884319
Epoch 30, CIFAR-10 Batch 1:  Loss: 0.634495258331 Validation Accuracy:  0.729599833488
Epoch 30, CIFAR-10 Batch 2:  Loss: 0.565383434296 Validation Accuracy:  0.737999796867
Epoch 30, CIFAR-10 Batch 3:  Loss: 0.518739342690 Validation Accuracy:  0.741999804974
Epoch 30, CIFAR-10 Batch 4:  Loss: 0.521404266357 Validation Accuracy:  0.742599844933
Epoch 30, CIFAR-10 Batch 5:  Loss: 0.511966347694 Validation Accuracy:  0.743199884892
Epoch 31, CIFAR-10 Batch 1:  Loss: 0.599913358688 Validation Accuracy:  0.737599909306
Epoch 31, CIFAR-10 Batch 2:  Loss: 0.547197043896 Validation Accuracy:  0.732199847698
Epoch 31, CIFAR-10 Batch 3:  Loss: 0.527095794678 Validation Accuracy:  0.725399851799
Epoch 31, CIFAR-10 Batch 4:  Loss: 0.534137547016 Validation Accuracy:  0.744199872017
Epoch 31, CIFAR-10 Batch 5:  Loss: 0.515607595444 Validation Accuracy:  0.735999822617
Epoch 32, CIFAR-10 Batch 1:  Loss: 0.594542264938 Validation Accuracy:  0.745399832726
Epoch 32, CIFAR-10 Batch 2:  Loss: 0.515901982784 Validation Accuracy:  0.741999745369
Epoch 32, CIFAR-10 Batch 3:  Loss: 0.493417620659 Validation Accuracy:  0.747199833393
Epoch 32, CIFAR-10 Batch 4:  Loss: 0.522760212421 Validation Accuracy:  0.745399892330
Epoch 32, CIFAR-10 Batch 5:  Loss: 0.533230304718 Validation Accuracy:  0.725599884987
Epoch 33, CIFAR-10 Batch 1:  Loss: 0.554080128670 Validation Accuracy:  0.750799894333
Epoch 33, CIFAR-10 Batch 2:  Loss: 0.543783009052 Validation Accuracy:  0.728799879551
Epoch 33, CIFAR-10 Batch 3:  Loss: 0.504774332047 Validation Accuracy:  0.737999796867
Epoch 33, CIFAR-10 Batch 4:  Loss: 0.474961578846 Validation Accuracy:  0.748399853706
Epoch 33, CIFAR-10 Batch 5:  Loss: 0.484302103519 Validation Accuracy:  0.749599814415
Epoch 34, CIFAR-10 Batch 1:  Loss: 0.577329397202 Validation Accuracy:  0.749599814415
Epoch 34, CIFAR-10 Batch 2:  Loss: 0.517197906971 Validation Accuracy:  0.744399845600
Epoch 34, CIFAR-10 Batch 3:  Loss: 0.480864942074 Validation Accuracy:  0.747599840164
Epoch 34, CIFAR-10 Batch 4:  Loss: 0.459200084209 Validation Accuracy:  0.753799855709
Epoch 34, CIFAR-10 Batch 5:  Loss: 0.474791646004 Validation Accuracy:  0.760999858379
Epoch 35, CIFAR-10 Batch 1:  Loss: 0.543469130993 Validation Accuracy:  0.752599895000
Epoch 35, CIFAR-10 Batch 2:  Loss: 0.495337963104 Validation Accuracy:  0.749999821186
Epoch 35, CIFAR-10 Batch 3:  Loss: 0.474299460649 Validation Accuracy:  0.743799805641
Epoch 35, CIFAR-10 Batch 4:  Loss: 0.446195006371 Validation Accuracy:  0.750999808311
Epoch 35, CIFAR-10 Batch 5:  Loss: 0.456579595804 Validation Accuracy:  0.749199807644
Epoch 36, CIFAR-10 Batch 1:  Loss: 0.536421775818 Validation Accuracy:  0.744399845600
Epoch 36, CIFAR-10 Batch 2:  Loss: 0.511101067066 Validation Accuracy:  0.741999864578
Epoch 36, CIFAR-10 Batch 3:  Loss: 0.459531724453 Validation Accuracy:  0.747999787331
Epoch 36, CIFAR-10 Batch 4:  Loss: 0.454678177834 Validation Accuracy:  0.758199810982
Epoch 36, CIFAR-10 Batch 5:  Loss: 0.446378290653 Validation Accuracy:  0.752199769020
Epoch 37, CIFAR-10 Batch 1:  Loss: 0.500831186771 Validation Accuracy:  0.755199909210
Epoch 37, CIFAR-10 Batch 2:  Loss: 0.461420416832 Validation Accuracy:  0.750199794769
Epoch 37, CIFAR-10 Batch 3:  Loss: 0.424908965826 Validation Accuracy:  0.765799820423
Epoch 37, CIFAR-10 Batch 4:  Loss: 0.424764394760 Validation Accuracy:  0.757999777794
Epoch 37, CIFAR-10 Batch 5:  Loss: 0.432287782431 Validation Accuracy:  0.755399823189
Epoch 38, CIFAR-10 Batch 1:  Loss: 0.486989140511 Validation Accuracy:  0.766999781132
Epoch 38, CIFAR-10 Batch 2:  Loss: 0.450455367565 Validation Accuracy:  0.743399858475
Epoch 38, CIFAR-10 Batch 3:  Loss: 0.445458710194 Validation Accuracy:  0.746199786663
Epoch 38, CIFAR-10 Batch 4:  Loss: 0.400366783142 Validation Accuracy:  0.761199831963
Epoch 38, CIFAR-10 Batch 5:  Loss: 0.419298619032 Validation Accuracy:  0.751799821854
Epoch 39, CIFAR-10 Batch 1:  Loss: 0.477476537228 Validation Accuracy:  0.766399800777
Epoch 39, CIFAR-10 Batch 2:  Loss: 0.424531877041 Validation Accuracy:  0.759999811649
Epoch 39, CIFAR-10 Batch 3:  Loss: 0.392471432686 Validation Accuracy:  0.751399815083
Epoch 39, CIFAR-10 Batch 4:  Loss: 0.401588350534 Validation Accuracy:  0.760799825191
Epoch 39, CIFAR-10 Batch 5:  Loss: 0.400111377239 Validation Accuracy:  0.759999811649
Epoch 40, CIFAR-10 Batch 2:  Loss: 0.447047054768 Validation Accuracy:  0.754599809647
Epoch 40, CIFAR-10 Batch 3:  Loss: 0.424636155367 Validation Accuracy:  0.742199838161
Epoch 40, CIFAR-10 Batch 4:  Loss: 0.385671883821 Validation Accuracy:  0.763199806213
Epoch 40, CIFAR-10 Batch 5:  Loss: 0.403779149055 Validation Accuracy:  0.755599856377
Epoch 41, CIFAR-10 Batch 1:  Loss: 0.469239741564 Validation Accuracy:  0.759799838066
Epoch 41, CIFAR-10 Batch 2:  Loss: 0.446159988642 Validation Accuracy:  0.740599811077
Epoch 41, CIFAR-10 Batch 3:  Loss: 0.392940521240 Validation Accuracy:  0.757399857044
Epoch 41, CIFAR-10 Batch 4:  Loss: 0.379163652658 Validation Accuracy:  0.767999827862
Epoch 41, CIFAR-10 Batch 5:  Loss: 0.410879075527 Validation Accuracy:  0.752399861813
Epoch 42, CIFAR-10 Batch 1:  Loss: 0.444852173328 Validation Accuracy:  0.766999840736
Epoch 42, CIFAR-10 Batch 2:  Loss: 0.400913894176 Validation Accuracy:  0.753599882126
Epoch 42, CIFAR-10 Batch 3:  Loss: 0.399229317904 Validation Accuracy:  0.753199875355
Epoch 42, CIFAR-10 Batch 4:  Loss: 0.392537236214 Validation Accuracy:  0.761599898338
Epoch 42, CIFAR-10 Batch 5:  Loss: 0.382243871689 Validation Accuracy:  0.751199841499
Epoch 43, CIFAR-10 Batch 1:  Loss: 0.468116164207 Validation Accuracy:  0.761999785900
Epoch 43, CIFAR-10 Batch 2:  Loss: 0.384786009789 Validation Accuracy:  0.758199810982
Epoch 43, CIFAR-10 Batch 3:  Loss: 0.369557648897 Validation Accuracy:  0.760399758816
Epoch 43, CIFAR-10 Batch 4:  Loss: 0.381712257862 Validation Accuracy:  0.756999850273
Epoch 43, CIFAR-10 Batch 5:  Loss: 0.359421432018 Validation Accuracy:  0.766999900341
Epoch 44, CIFAR-10 Batch 1:  Loss: 0.445118486881 Validation Accuracy:  0.757399797440
Epoch 44, CIFAR-10 Batch 2:  Loss: 0.372407287359 Validation Accuracy:  0.766399919987
Epoch 44, CIFAR-10 Batch 3:  Loss: 0.373411327600 Validation Accuracy:  0.755799829960
Epoch 44, CIFAR-10 Batch 4:  Loss: 0.338380366564 Validation Accuracy:  0.767399907112
Epoch 44, CIFAR-10 Batch 5:  Loss: 0.375266849995 Validation Accuracy:  0.752999842167
Epoch 45, CIFAR-10 Batch 1:  Loss: 0.432476669550 Validation Accuracy:  0.755399823189
Epoch 45, CIFAR-10 Batch 2:  Loss: 0.397757887840 Validation Accuracy:  0.755199849606
Epoch 45, CIFAR-10 Batch 3:  Loss: 0.381783634424 Validation Accuracy:  0.745199859142
Epoch 45, CIFAR-10 Batch 4:  Loss: 0.347280114889 Validation Accuracy:  0.768799781799
Epoch 45, CIFAR-10 Batch 5:  Loss: 0.344473063946 Validation Accuracy:  0.760799825191
Epoch 46, CIFAR-10 Batch 1:  Loss: 0.435461610556 Validation Accuracy:  0.760799765587
Epoch 46, CIFAR-10 Batch 2:  Loss: 0.368339329958 Validation Accuracy:  0.760399818420
Epoch 46, CIFAR-10 Batch 3:  Loss: 0.363329708576 Validation Accuracy:  0.765999853611
Epoch 46, CIFAR-10 Batch 4:  Loss: 0.333902657032 Validation Accuracy:  0.764999806881
Epoch 46, CIFAR-10 Batch 5:  Loss: 0.321644365788 Validation Accuracy:  0.758999824524
Epoch 47, CIFAR-10 Batch 1:  Loss: 0.416299611330 Validation Accuracy:  0.768799901009
Epoch 47, CIFAR-10 Batch 2:  Loss: 0.362165987492 Validation Accuracy:  0.767399787903
Epoch 47, CIFAR-10 Batch 3:  Loss: 0.337598264217 Validation Accuracy:  0.761399805546
Epoch 47, CIFAR-10 Batch 4:  Loss: 0.335751473904 Validation Accuracy:  0.770999789238
Epoch 47, CIFAR-10 Batch 5:  Loss: 0.339490026236 Validation Accuracy:  0.761999845505
Epoch 48, CIFAR-10 Batch 1:  Loss: 0.385102748871 Validation Accuracy:  0.772599816322
Epoch 48, CIFAR-10 Batch 2:  Loss: 0.357932865620 Validation Accuracy:  0.755599856377
Epoch 48, CIFAR-10 Batch 3:  Loss: 0.335157632828 Validation Accuracy:  0.762399852276
Epoch 48, CIFAR-10 Batch 4:  Loss: 0.348944127560 Validation Accuracy:  0.766399860382
Epoch 48, CIFAR-10 Batch 5:  Loss: 0.341900050640 Validation Accuracy:  0.741799890995
Epoch 49, CIFAR-10 Batch 1:  Loss: 0.362788736820 Validation Accuracy:  0.774999797344
Epoch 49, CIFAR-10 Batch 2:  Loss: 0.329078942537 Validation Accuracy:  0.769199848175
Epoch 49, CIFAR-10 Batch 3:  Loss: 0.351224184036 Validation Accuracy:  0.753999888897
Epoch 49, CIFAR-10 Batch 4:  Loss: 0.327720403671 Validation Accuracy:  0.774799823761
Epoch 49, CIFAR-10 Batch 5:  Loss: 0.357055217028 Validation Accuracy:  0.743799865246
Epoch 50, CIFAR-10 Batch 1:  Loss: 0.367319613695 Validation Accuracy:  0.771599888802
Epoch 50, CIFAR-10 Batch 2:  Loss: 0.345762073994 Validation Accuracy:  0.760599792004
Epoch 50, CIFAR-10 Batch 3:  Loss: 0.324690788984 Validation Accuracy:  0.761399745941
Epoch 50, CIFAR-10 Batch 4:  Loss: 0.308735191822 Validation Accuracy:  0.769599795341
Epoch 50, CIFAR-10 Batch 5:  Loss: 0.307561695576 Validation Accuracy:  0.771999835968
Epoch 51, CIFAR-10 Batch 1:  Loss: 0.362416327000 Validation Accuracy:  0.777799844742
Epoch 51, CIFAR-10 Batch 2:  Loss: 0.331532895565 Validation Accuracy:  0.757799863815
Epoch 51, CIFAR-10 Batch 3:  Loss: 0.285755723715 Validation Accuracy:  0.767599821091
Epoch 51, CIFAR-10 Batch 4:  Loss: 0.296980798244 Validation Accuracy:  0.775399744511
Epoch 51, CIFAR-10 Batch 5:  Loss: 0.307321667671 Validation Accuracy:  0.764199852943
Epoch 52, CIFAR-10 Batch 1:  Loss: 0.373898893595 Validation Accuracy:  0.765599846840
Epoch 52, CIFAR-10 Batch 2:  Loss: 0.311174124479 Validation Accuracy:  0.765999794006
Epoch 52, CIFAR-10 Batch 3:  Loss: 0.317365586758 Validation Accuracy:  0.763399839401
Epoch 52, CIFAR-10 Batch 4:  Loss: 0.309851646423 Validation Accuracy:  0.769999861717
Epoch 52, CIFAR-10 Batch 5:  Loss: 0.303589165211 Validation Accuracy:  0.768799781799
Epoch 53, CIFAR-10 Batch 1:  Loss: 0.351073801517 Validation Accuracy:  0.774599790573
Epoch 53, CIFAR-10 Batch 2:  Loss: 0.319547861814 Validation Accuracy:  0.771599769592
Epoch 53, CIFAR-10 Batch 3:  Loss: 0.273535013199 Validation Accuracy:  0.764799833298
Epoch 53, CIFAR-10 Batch 4:  Loss: 0.272520840168 Validation Accuracy:  0.775999784470
Epoch 53, CIFAR-10 Batch 5:  Loss: 0.266046762466 Validation Accuracy:  0.773399829865
Epoch 54, CIFAR-10 Batch 1:  Loss: 0.358032822609 Validation Accuracy:  0.772799849510
Epoch 54, CIFAR-10 Batch 2:  Loss: 0.320535123348 Validation Accuracy:  0.763999819756
Epoch 54, CIFAR-10 Batch 3:  Loss: 0.268720626831 Validation Accuracy:  0.770399808884
Epoch 54, CIFAR-10 Batch 4:  Loss: 0.275225222111 Validation Accuracy:  0.775599837303
Epoch 54, CIFAR-10 Batch 5:  Loss: 0.286380738020 Validation Accuracy:  0.766799807549
Epoch 55, CIFAR-10 Batch 1:  Loss: 0.344194680452 Validation Accuracy:  0.776199817657
Epoch 55, CIFAR-10 Batch 2:  Loss: 0.291258096695 Validation Accuracy:  0.766599893570
Epoch 55, CIFAR-10 Batch 3:  Loss: 0.293975651264 Validation Accuracy:  0.771199822426
Epoch 55, CIFAR-10 Batch 4:  Loss: 0.313419103622 Validation Accuracy:  0.763799846172
Epoch 55, CIFAR-10 Batch 5:  Loss: 0.281380832195 Validation Accuracy:  0.764799773693
Epoch 56, CIFAR-10 Batch 1:  Loss: 0.342462867498 Validation Accuracy:  0.773799777031
Epoch 56, CIFAR-10 Batch 2:  Loss: 0.330174088478 Validation Accuracy:  0.743799746037
Epoch 56, CIFAR-10 Batch 3:  Loss: 0.278443753719 Validation Accuracy:  0.767399787903
Epoch 56, CIFAR-10 Batch 4:  Loss: 0.258202642202 Validation Accuracy:  0.772399783134
Epoch 56, CIFAR-10 Batch 5:  Loss: 0.258084297180 Validation Accuracy:  0.765199780464
Epoch 57, CIFAR-10 Batch 1:  Loss: 0.317852616310 Validation Accuracy:  0.777599871159
Epoch 57, CIFAR-10 Batch 2:  Loss: 0.249699503183 Validation Accuracy:  0.767199754715
Epoch 57, CIFAR-10 Batch 3:  Loss: 0.280204594135 Validation Accuracy:  0.763999819756
Epoch 57, CIFAR-10 Batch 4:  Loss: 0.256869643927 Validation Accuracy:  0.775999844074
Epoch 57, CIFAR-10 Batch 5:  Loss: 0.254302382469 Validation Accuracy:  0.778999805450
Epoch 58, CIFAR-10 Batch 1:  Loss: 0.315783321857 Validation Accuracy:  0.776799857616
Epoch 58, CIFAR-10 Batch 2:  Loss: 0.276038855314 Validation Accuracy:  0.762799859047
Epoch 58, CIFAR-10 Batch 3:  Loss: 0.223744094372 Validation Accuracy:  0.769599854946
Epoch 58, CIFAR-10 Batch 4:  Loss: 0.266511499882 Validation Accuracy:  0.761999845505
Epoch 58, CIFAR-10 Batch 5:  Loss: 0.260197788477 Validation Accuracy:  0.773799836636
Epoch 59, CIFAR-10 Batch 1:  Loss: 0.305157661438 Validation Accuracy:  0.778599739075
Epoch 59, CIFAR-10 Batch 2:  Loss: 0.250421613455 Validation Accuracy:  0.774399816990
Epoch 59, CIFAR-10 Batch 3:  Loss: 0.272659480572 Validation Accuracy:  0.759999811649
Epoch 59, CIFAR-10 Batch 4:  Loss: 0.257020592690 Validation Accuracy:  0.770199775696
Epoch 59, CIFAR-10 Batch 5:  Loss: 0.247717007995 Validation Accuracy:  0.777399778366
Epoch 60, CIFAR-10 Batch 1:  Loss: 0.288448184729 Validation Accuracy:  0.778399825096
Epoch 60, CIFAR-10 Batch 2:  Loss: 0.267497926950 Validation Accuracy:  0.767399787903
Epoch 60, CIFAR-10 Batch 3:  Loss: 0.240152627230 Validation Accuracy:  0.767599821091
Epoch 60, CIFAR-10 Batch 4:  Loss: 0.255174010992 Validation Accuracy:  0.768599808216
Epoch 60, CIFAR-10 Batch 5:  Loss: 0.235351234674 Validation Accuracy:  0.767999887466
Epoch 61, CIFAR-10 Batch 1:  Loss: 0.264370799065 Validation Accuracy:  0.783599793911
Epoch 61, CIFAR-10 Batch 2:  Loss: 0.248867243528 Validation Accuracy:  0.768399834633
Epoch 61, CIFAR-10 Batch 3:  Loss: 0.218419924378 Validation Accuracy:  0.776199758053
Epoch 61, CIFAR-10 Batch 4:  Loss: 0.245507836342 Validation Accuracy:  0.777999877930
Epoch 61, CIFAR-10 Batch 5:  Loss: 0.235020905733 Validation Accuracy:  0.774599850178
Epoch 62, CIFAR-10 Batch 1:  Loss: 0.268283009529 Validation Accuracy:  0.778399825096
Epoch 62, CIFAR-10 Batch 2:  Loss: 0.242505475879 Validation Accuracy:  0.763999879360
Epoch 62, CIFAR-10 Batch 3:  Loss: 0.210919141769 Validation Accuracy:  0.773999810219
Epoch 62, CIFAR-10 Batch 4:  Loss: 0.257869482040 Validation Accuracy:  0.767999768257
Epoch 62, CIFAR-10 Batch 5:  Loss: 0.230329155922 Validation Accuracy:  0.770999848843
Epoch 63, CIFAR-10 Batch 1:  Loss: 0.263437628746 Validation Accuracy:  0.782599806786
Epoch 63, CIFAR-10 Batch 2:  Loss: 0.227855831385 Validation Accuracy:  0.773799777031
Epoch 63, CIFAR-10 Batch 3:  Loss: 0.214747354388 Validation Accuracy:  0.769999802113
Epoch 63, CIFAR-10 Batch 4:  Loss: 0.225341096520 Validation Accuracy:  0.783799886703
Epoch 63, CIFAR-10 Batch 5:  Loss: 0.231722787023 Validation Accuracy:  0.776799917221
Epoch 64, CIFAR-10 Batch 1:  Loss: 0.261901617050 Validation Accuracy:  0.781199753284
Epoch 64, CIFAR-10 Batch 2:  Loss: 0.244417160749 Validation Accuracy:  0.763199806213
Epoch 64, CIFAR-10 Batch 3:  Loss: 0.214136406779 Validation Accuracy:  0.773199856281
Epoch 64, CIFAR-10 Batch 4:  Loss: 0.230735838413 Validation Accuracy:  0.769999802113
Epoch 64, CIFAR-10 Batch 5:  Loss: 0.236209660769 Validation Accuracy:  0.776999890804
Epoch 65, CIFAR-10 Batch 1:  Loss: 0.247868359089 Validation Accuracy:  0.782199800014
Epoch 65, CIFAR-10 Batch 2:  Loss: 0.223813235760 Validation Accuracy:  0.774199843407
Epoch 65, CIFAR-10 Batch 3:  Loss: 0.200924962759 Validation Accuracy:  0.773999810219
Epoch 65, CIFAR-10 Batch 4:  Loss: 0.222621455789 Validation Accuracy:  0.782199740410
Epoch 65, CIFAR-10 Batch 5:  Loss: 0.214740216732 Validation Accuracy:  0.777599811554
Epoch 66, CIFAR-10 Batch 1:  Loss: 0.259427011013 Validation Accuracy:  0.788399815559
Epoch 66, CIFAR-10 Batch 2:  Loss: 0.221851438284 Validation Accuracy:  0.760399818420
Epoch 66, CIFAR-10 Batch 3:  Loss: 0.198181778193 Validation Accuracy:  0.768199801445
Epoch 66, CIFAR-10 Batch 4:  Loss: 0.197496742010 Validation Accuracy:  0.777599871159
Epoch 66, CIFAR-10 Batch 5:  Loss: 0.212067365646 Validation Accuracy:  0.769999861717
Epoch 67, CIFAR-10 Batch 1:  Loss: 0.258730262518 Validation Accuracy:  0.782999753952
Epoch 67, CIFAR-10 Batch 2:  Loss: 0.217065915465 Validation Accuracy:  0.768399834633
Epoch 67, CIFAR-10 Batch 3:  Loss: 0.185792565346 Validation Accuracy:  0.776599824429
Epoch 67, CIFAR-10 Batch 4:  Loss: 0.229722440243 Validation Accuracy:  0.767999768257
Epoch 67, CIFAR-10 Batch 5:  Loss: 0.195297539234 Validation Accuracy:  0.785399854183
Epoch 68, CIFAR-10 Batch 1:  Loss: 0.261947154999 Validation Accuracy:  0.769199848175
Epoch 68, CIFAR-10 Batch 2:  Loss: 0.246180474758 Validation Accuracy:  0.771799862385
Epoch 68, CIFAR-10 Batch 3:  Loss: 0.183463588357 Validation Accuracy:  0.781399786472
Epoch 68, CIFAR-10 Batch 4:  Loss: 0.195584818721 Validation Accuracy:  0.786199867725
Epoch 68, CIFAR-10 Batch 5:  Loss: 0.192987531424 Validation Accuracy:  0.776599824429
Epoch 69, CIFAR-10 Batch 1:  Loss: 0.273984104395 Validation Accuracy:  0.773199915886
Epoch 69, CIFAR-10 Batch 2:  Loss: 0.238029792905 Validation Accuracy:  0.756999850273
Epoch 69, CIFAR-10 Batch 3:  Loss: 0.182932913303 Validation Accuracy:  0.785399854183
Epoch 69, CIFAR-10 Batch 4:  Loss: 0.210640817881 Validation Accuracy:  0.782399833202
Epoch 69, CIFAR-10 Batch 5:  Loss: 0.178199723363 Validation Accuracy:  0.781799852848
Epoch 70, CIFAR-10 Batch 1:  Loss: 0.268072545528 Validation Accuracy:  0.775399804115
Epoch 70, CIFAR-10 Batch 2:  Loss: 0.187957614660 Validation Accuracy:  0.782599747181
Epoch 70, CIFAR-10 Batch 3:  Loss: 0.175895452499 Validation Accuracy:  0.780799806118
Epoch 70, CIFAR-10 Batch 4:  Loss: 0.200519427657 Validation Accuracy:  0.779199779034
Epoch 70, CIFAR-10 Batch 5:  Loss: 0.236535608768 Validation Accuracy:  0.771199822426
Epoch 71, CIFAR-10 Batch 1:  Loss: 0.217852383852 Validation Accuracy:  0.783799886703
Epoch 71, CIFAR-10 Batch 2:  Loss: 0.219425261021 Validation Accuracy:  0.768999874592
Epoch 71, CIFAR-10 Batch 3:  Loss: 0.158156722784 Validation Accuracy:  0.777599811554
Epoch 71, CIFAR-10 Batch 4:  Loss: 0.168650433421 Validation Accuracy:  0.787199854851
Epoch 71, CIFAR-10 Batch 5:  Loss: 0.170187145472 Validation Accuracy:  0.776999831200
Epoch 72, CIFAR-10 Batch 1:  Loss: 0.261540353298 Validation Accuracy:  0.767999768257
Epoch 72, CIFAR-10 Batch 2:  Loss: 0.205297023058 Validation Accuracy:  0.778799831867
Epoch 72, CIFAR-10 Batch 3:  Loss: 0.160870447755 Validation Accuracy:  0.790599822998
Epoch 72, CIFAR-10 Batch 4:  Loss: 0.201314300299 Validation Accuracy:  0.779799818993
Epoch 72, CIFAR-10 Batch 5:  Loss: 0.167105779052 Validation Accuracy:  0.785999834538
Epoch 73, CIFAR-10 Batch 1:  Loss: 0.203701227903 Validation Accuracy:  0.785799801350
Epoch 73, CIFAR-10 Batch 2:  Loss: 0.176428213716 Validation Accuracy:  0.780599772930
Epoch 73, CIFAR-10 Batch 3:  Loss: 0.175534874201 Validation Accuracy:  0.774999856949
Epoch 73, CIFAR-10 Batch 4:  Loss: 0.175034105778 Validation Accuracy:  0.782399833202
Epoch 73, CIFAR-10 Batch 5:  Loss: 0.174346968532 Validation Accuracy:  0.777799844742
Epoch 74, CIFAR-10 Batch 1:  Loss: 0.217504501343 Validation Accuracy:  0.784199833870
Epoch 74, CIFAR-10 Batch 2:  Loss: 0.177429080009 Validation Accuracy:  0.776999831200
Epoch 74, CIFAR-10 Batch 3:  Loss: 0.163405895233 Validation Accuracy:  0.781799852848
Epoch 74, CIFAR-10 Batch 4:  Loss: 0.184363082051 Validation Accuracy:  0.778199791908
Epoch 74, CIFAR-10 Batch 5:  Loss: 0.160198897123 Validation Accuracy:  0.778199911118
Epoch 75, CIFAR-10 Batch 1:  Loss: 0.217691749334 Validation Accuracy:  0.774999856949
Epoch 75, CIFAR-10 Batch 2:  Loss: 0.167411893606 Validation Accuracy:  0.771599829197
Epoch 75, CIFAR-10 Batch 3:  Loss: 0.168043866754 Validation Accuracy:  0.776999831200
Epoch 75, CIFAR-10 Batch 4:  Loss: 0.173007398844 Validation Accuracy:  0.786599755287
Epoch 75, CIFAR-10 Batch 5:  Loss: 0.172882944345 Validation Accuracy:  0.777799785137
Epoch 76, CIFAR-10 Batch 1:  Loss: 0.232132703066 Validation Accuracy:  0.775999784470
Epoch 76, CIFAR-10 Batch 2:  Loss: 0.170034602284 Validation Accuracy:  0.778399825096
Epoch 76, CIFAR-10 Batch 3:  Loss: 0.149918988347 Validation Accuracy:  0.777999818325
Epoch 76, CIFAR-10 Batch 4:  Loss: 0.182816535234 Validation Accuracy:  0.783599853516
Epoch 76, CIFAR-10 Batch 5:  Loss: 0.189312338829 Validation Accuracy:  0.776599824429
Epoch 77, CIFAR-10 Batch 1:  Loss: 0.218984007835 Validation Accuracy:  0.780399799347
Epoch 77, CIFAR-10 Batch 2:  Loss: 0.169892743230 Validation Accuracy:  0.781799793243
Epoch 77, CIFAR-10 Batch 3:  Loss: 0.191955134273 Validation Accuracy:  0.774999856949
Epoch 77, CIFAR-10 Batch 4:  Loss: 0.213381677866 Validation Accuracy:  0.770999848843
Epoch 77, CIFAR-10 Batch 5:  Loss: 0.200715154409 Validation Accuracy:  0.768799781799
Epoch 78, CIFAR-10 Batch 1:  Loss: 0.212109178305 Validation Accuracy:  0.783199787140
Epoch 78, CIFAR-10 Batch 2:  Loss: 0.176410257816 Validation Accuracy:  0.778399825096
Epoch 78, CIFAR-10 Batch 3:  Loss: 0.154717251658 Validation Accuracy:  0.778399825096
Epoch 78, CIFAR-10 Batch 4:  Loss: 0.190119445324 Validation Accuracy:  0.778999805450
Epoch 78, CIFAR-10 Batch 5:  Loss: 0.163048356771 Validation Accuracy:  0.778999865055
Epoch 79, CIFAR-10 Batch 1:  Loss: 0.193769767880 Validation Accuracy:  0.785199880600
Epoch 79, CIFAR-10 Batch 2:  Loss: 0.166249796748 Validation Accuracy:  0.782599806786
Epoch 79, CIFAR-10 Batch 3:  Loss: 0.176243364811 Validation Accuracy:  0.763399839401
Epoch 79, CIFAR-10 Batch 4:  Loss: 0.174579694867 Validation Accuracy:  0.779599785805
Epoch 79, CIFAR-10 Batch 5:  Loss: 0.156932830811 Validation Accuracy:  0.781199812889
Epoch 80, CIFAR-10 Batch 1:  Loss: 0.200727492571 Validation Accuracy:  0.782199919224
Epoch 80, CIFAR-10 Batch 2:  Loss: 0.193489462137 Validation Accuracy:  0.779599785805
Epoch 80, CIFAR-10 Batch 3:  Loss: 0.154933661222 Validation Accuracy:  0.781999766827
Epoch 80, CIFAR-10 Batch 4:  Loss: 0.176646962762 Validation Accuracy:  0.771799802780
Epoch 80, CIFAR-10 Batch 5:  Loss: 0.158715948462 Validation Accuracy:  0.780199825764
Epoch 81, CIFAR-10 Batch 1:  Loss: 0.186297357082 Validation Accuracy:  0.780799806118
Epoch 81, CIFAR-10 Batch 2:  Loss: 0.163120836020 Validation Accuracy:  0.787399768829
Epoch 81, CIFAR-10 Batch 3:  Loss: 0.147252276540 Validation Accuracy:  0.779599785805
Epoch 81, CIFAR-10 Batch 4:  Loss: 0.171552225947 Validation Accuracy:  0.776199817657
Epoch 81, CIFAR-10 Batch 5:  Loss: 0.164633721113 Validation Accuracy:  0.778599858284
Epoch 82, CIFAR-10 Batch 1:  Loss: 0.181306198239 Validation Accuracy:  0.777399837971
Epoch 82, CIFAR-10 Batch 2:  Loss: 0.154910743237 Validation Accuracy:  0.788199841976
Epoch 82, CIFAR-10 Batch 3:  Loss: 0.151941791177 Validation Accuracy:  0.779599785805
Epoch 82, CIFAR-10 Batch 4:  Loss: 0.187173232436 Validation Accuracy:  0.767999768257
Epoch 82, CIFAR-10 Batch 5:  Loss: 0.145097225904 Validation Accuracy:  0.773599803448
Epoch 83, CIFAR-10 Batch 1:  Loss: 0.219836533070 Validation Accuracy:  0.782599866390
Epoch 83, CIFAR-10 Batch 2:  Loss: 0.162142172456 Validation Accuracy:  0.791999816895
Epoch 83, CIFAR-10 Batch 3:  Loss: 0.145299017429 Validation Accuracy:  0.789799809456
Epoch 83, CIFAR-10 Batch 4:  Loss: 0.145406275988 Validation Accuracy:  0.771599769592
Epoch 83, CIFAR-10 Batch 5:  Loss: 0.152363151312 Validation Accuracy:  0.767399847507
Epoch 84, CIFAR-10 Batch 1:  Loss: 0.181291505694 Validation Accuracy:  0.781399786472
Epoch 84, CIFAR-10 Batch 2:  Loss: 0.152197957039 Validation Accuracy:  0.786999762058
Epoch 84, CIFAR-10 Batch 3:  Loss: 0.129651576281 Validation Accuracy:  0.784799754620
Epoch 84, CIFAR-10 Batch 4:  Loss: 0.151433572173 Validation Accuracy:  0.771799862385
Epoch 84, CIFAR-10 Batch 5:  Loss: 0.130957037210 Validation Accuracy:  0.781599760056
Epoch 85, CIFAR-10 Batch 1:  Loss: 0.189393103123 Validation Accuracy:  0.779399812222
Epoch 85, CIFAR-10 Batch 2:  Loss: 0.143039196730 Validation Accuracy:  0.794599831104
Epoch 85, CIFAR-10 Batch 3:  Loss: 0.131480351090 Validation Accuracy:  0.784999847412
Epoch 85, CIFAR-10 Batch 4:  Loss: 0.149207353592 Validation Accuracy:  0.760599851608
Epoch 85, CIFAR-10 Batch 5:  Loss: 0.139317929745 Validation Accuracy:  0.788399815559
Epoch 86, CIFAR-10 Batch 1:  Loss: 0.190126359463 Validation Accuracy:  0.782199859619
Epoch 86, CIFAR-10 Batch 2:  Loss: 0.141981467605 Validation Accuracy:  0.795399844646
Epoch 86, CIFAR-10 Batch 3:  Loss: 0.126654073596 Validation Accuracy:  0.789199769497
Epoch 86, CIFAR-10 Batch 4:  Loss: 0.145551234484 Validation Accuracy:  0.771199822426
Epoch 86, CIFAR-10 Batch 5:  Loss: 0.123365938663 Validation Accuracy:  0.775199770927
Epoch 87, CIFAR-10 Batch 1:  Loss: 0.151751905680 Validation Accuracy:  0.780599892139
Epoch 87, CIFAR-10 Batch 2:  Loss: 0.124099321663 Validation Accuracy:  0.786799788475
Epoch 87, CIFAR-10 Batch 3:  Loss: 0.114840567112 Validation Accuracy:  0.788799762726
Epoch 87, CIFAR-10 Batch 4:  Loss: 0.166815146804 Validation Accuracy:  0.763999819756
Epoch 87, CIFAR-10 Batch 5:  Loss: 0.131059527397 Validation Accuracy:  0.774799823761
Epoch 88, CIFAR-10 Batch 1:  Loss: 0.153161883354 Validation Accuracy:  0.788399815559
Epoch 88, CIFAR-10 Batch 2:  Loss: 0.157030478120 Validation Accuracy:  0.790799796581
Epoch 88, CIFAR-10 Batch 3:  Loss: 0.108482256532 Validation Accuracy:  0.790599822998
Epoch 88, CIFAR-10 Batch 4:  Loss: 0.133207261562 Validation Accuracy:  0.778599858284
Epoch 88, CIFAR-10 Batch 5:  Loss: 0.149530097842 Validation Accuracy:  0.776999890804
Epoch 89, CIFAR-10 Batch 1:  Loss: 0.155587911606 Validation Accuracy:  0.775999844074
Epoch 89, CIFAR-10 Batch 2:  Loss: 0.153325736523 Validation Accuracy:  0.790199875832
Epoch 89, CIFAR-10 Batch 3:  Loss: 0.099169574678 Validation Accuracy:  0.797599792480
Epoch 89, CIFAR-10 Batch 4:  Loss: 0.134188935161 Validation Accuracy:  0.769399762154
Epoch 89, CIFAR-10 Batch 5:  Loss: 0.162934273481 Validation Accuracy:  0.767799735069
Epoch 90, CIFAR-10 Batch 1:  Loss: 0.164539039135 Validation Accuracy:  0.782199859619
Epoch 90, CIFAR-10 Batch 2:  Loss: 0.125108525157 Validation Accuracy:  0.793199837208
Epoch 90, CIFAR-10 Batch 3:  Loss: 0.117085248232 Validation Accuracy:  0.784799873829
Epoch 90, CIFAR-10 Batch 4:  Loss: 0.127464681864 Validation Accuracy:  0.770199835300
Epoch 90, CIFAR-10 Batch 5:  Loss: 0.133765533566 Validation Accuracy:  0.776399791241
Epoch 91, CIFAR-10 Batch 1:  Loss: 0.155990302563 Validation Accuracy:  0.785599887371
Epoch 91, CIFAR-10 Batch 2:  Loss: 0.123291097581 Validation Accuracy:  0.787999868393
Epoch 91, CIFAR-10 Batch 3:  Loss: 0.114447727799 Validation Accuracy:  0.788199782372
Epoch 91, CIFAR-10 Batch 4:  Loss: 0.120356038213 Validation Accuracy:  0.780399858952
Epoch 91, CIFAR-10 Batch 5:  Loss: 0.125856876373 Validation Accuracy:  0.771799802780
Epoch 92, CIFAR-10 Batch 1:  Loss: 0.152572631836 Validation Accuracy:  0.786799848080
Epoch 92, CIFAR-10 Batch 2:  Loss: 0.136854603887 Validation Accuracy:  0.789999842644
Epoch 92, CIFAR-10 Batch 3:  Loss: 0.117403939366 Validation Accuracy:  0.786599814892
Epoch 92, CIFAR-10 Batch 4:  Loss: 0.121025711298 Validation Accuracy:  0.780399799347
Epoch 92, CIFAR-10 Batch 5:  Loss: 0.116754665971 Validation Accuracy:  0.784199833870
Epoch 93, CIFAR-10 Batch 1:  Loss: 0.159309953451 Validation Accuracy:  0.788999855518
Epoch 93, CIFAR-10 Batch 2:  Loss: 0.114615358412 Validation Accuracy:  0.794199764729
Epoch 93, CIFAR-10 Batch 3:  Loss: 0.102545626462 Validation Accuracy:  0.789399862289
Epoch 93, CIFAR-10 Batch 4:  Loss: 0.122456312180 Validation Accuracy:  0.774799823761
Epoch 93, CIFAR-10 Batch 5:  Loss: 0.118498422205 Validation Accuracy:  0.781999826431
Epoch 94, CIFAR-10 Batch 1:  Loss: 0.163438454270 Validation Accuracy:  0.785599827766
Epoch 94, CIFAR-10 Batch 2:  Loss: 0.123165667057 Validation Accuracy:  0.792799830437
Epoch 94, CIFAR-10 Batch 3:  Loss: 0.094882883132 Validation Accuracy:  0.798599839211
Epoch 94, CIFAR-10 Batch 4:  Loss: 0.119427457452 Validation Accuracy:  0.782599806786
Epoch 94, CIFAR-10 Batch 5:  Loss: 0.130544751883 Validation Accuracy:  0.768199801445
Epoch 95, CIFAR-10 Batch 1:  Loss: 0.151971802115 Validation Accuracy:  0.791999757290
Epoch 95, CIFAR-10 Batch 2:  Loss: 0.116479769349 Validation Accuracy:  0.791199803352
Epoch 95, CIFAR-10 Batch 3:  Loss: 0.112711608410 Validation Accuracy:  0.792399883270
Epoch 95, CIFAR-10 Batch 4:  Loss: 0.104433387518 Validation Accuracy:  0.775799870491
Epoch 95, CIFAR-10 Batch 5:  Loss: 0.102633856237 Validation Accuracy:  0.788199782372
Epoch 96, CIFAR-10 Batch 1:  Loss: 0.157021850348 Validation Accuracy:  0.784999847412
Epoch 96, CIFAR-10 Batch 2:  Loss: 0.110262922943 Validation Accuracy:  0.799399793148
Epoch 96, CIFAR-10 Batch 3:  Loss: 0.095860198140 Validation Accuracy:  0.791999816895
Epoch 96, CIFAR-10 Batch 4:  Loss: 0.106657013297 Validation Accuracy:  0.780799806118
Epoch 96, CIFAR-10 Batch 5:  Loss: 0.112817183137 Validation Accuracy:  0.775399744511
Epoch 97, CIFAR-10 Batch 1:  Loss: 0.163261592388 Validation Accuracy:  0.776399791241
Epoch 97, CIFAR-10 Batch 2:  Loss: 0.103425942361 Validation Accuracy:  0.791999757290
Epoch 97, CIFAR-10 Batch 3:  Loss: 0.090342760086 Validation Accuracy:  0.797799825668
Epoch 97, CIFAR-10 Batch 4:  Loss: 0.100946322083 Validation Accuracy:  0.784999787807
Epoch 97, CIFAR-10 Batch 5:  Loss: 0.120601497591 Validation Accuracy:  0.788599848747
Epoch 98, CIFAR-10 Batch 1:  Loss: 0.140753775835 Validation Accuracy:  0.783799827099
Epoch 98, CIFAR-10 Batch 2:  Loss: 0.099869899452 Validation Accuracy:  0.789799809456
Epoch 98, CIFAR-10 Batch 3:  Loss: 0.090405419469 Validation Accuracy:  0.797999799252
Epoch 98, CIFAR-10 Batch 4:  Loss: 0.096689939499 Validation Accuracy:  0.784199833870
Epoch 98, CIFAR-10 Batch 5:  Loss: 0.133292764425 Validation Accuracy:  0.775199830532
Epoch 99, CIFAR-10 Batch 1:  Loss: 0.136652305722 Validation Accuracy:  0.789999842644
Epoch 99, CIFAR-10 Batch 2:  Loss: 0.121014818549 Validation Accuracy:  0.794199824333
Epoch 99, CIFAR-10 Batch 3:  Loss: 0.112211070955 Validation Accuracy:  0.769199788570
Epoch 99, CIFAR-10 Batch 4:  Loss: 0.093977369368 Validation Accuracy:  0.788999736309
Epoch 99, CIFAR-10 Batch 5:  Loss: 0.126274645329 Validation Accuracy:  0.784399867058
Epoch 100, CIFAR-10 Batch 1:  Loss: 0.138055473566 Validation Accuracy:  0.783599853516
Epoch 100, CIFAR-10 Batch 2:  Loss: 0.122476316988 Validation Accuracy:  0.786799788475
Epoch 100, CIFAR-10 Batch 3:  Loss: 0.088367089629 Validation Accuracy:  0.793199777603
Epoch 100, CIFAR-10 Batch 4:  Loss: 0.110124394298 Validation Accuracy:  0.782199859619
Epoch 100, CIFAR-10 Batch 5:  Loss: 0.119136333466 Validation Accuracy:  0.788199841976
Epoch 101, CIFAR-10 Batch 1:  Loss: 0.140166804194 Validation Accuracy:  0.789199829102
Epoch 101, CIFAR-10 Batch 2:  Loss: 0.095778286457 Validation Accuracy:  0.790399789810
Epoch 101, CIFAR-10 Batch 3:  Loss: 0.098131366074 Validation Accuracy:  0.788599789143
Epoch 101, CIFAR-10 Batch 4:  Loss: 0.110715992749 Validation Accuracy:  0.787599802017
Epoch 101, CIFAR-10 Batch 5:  Loss: 0.107971251011 Validation Accuracy:  0.784399807453
Epoch 102, CIFAR-10 Batch 1:  Loss: 0.122965633869 Validation Accuracy:  0.793399810791
Epoch 102, CIFAR-10 Batch 2:  Loss: 0.094599880278 Validation Accuracy:  0.785599768162
Epoch 102, CIFAR-10 Batch 3:  Loss: 0.072866737843 Validation Accuracy:  0.796199858189
Epoch 102, CIFAR-10 Batch 4:  Loss: 0.093410469592 Validation Accuracy:  0.789999842644
Epoch 102, CIFAR-10 Batch 5:  Loss: 0.108626507223 Validation Accuracy:  0.788799762726
Epoch 103, CIFAR-10 Batch 1:  Loss: 0.124599158764 Validation Accuracy:  0.792399764061
Epoch 103, CIFAR-10 Batch 2:  Loss: 0.103047750890 Validation Accuracy:  0.790999889374
Epoch 103, CIFAR-10 Batch 3:  Loss: 0.088128335774 Validation Accuracy:  0.801199793816
Epoch 103, CIFAR-10 Batch 4:  Loss: 0.105461314321 Validation Accuracy:  0.792199790478
Epoch 103, CIFAR-10 Batch 5:  Loss: 0.096086546779 Validation Accuracy:  0.791999816895
Epoch 104, CIFAR-10 Batch 1:  Loss: 0.125077754259 Validation Accuracy:  0.786599814892
Epoch 104, CIFAR-10 Batch 2:  Loss: 0.106114797294 Validation Accuracy:  0.792799830437
Epoch 104, CIFAR-10 Batch 3:  Loss: 0.079996898770 Validation Accuracy:  0.797799766064
Epoch 104, CIFAR-10 Batch 4:  Loss: 0.089760363102 Validation Accuracy:  0.782999813557
Epoch 104, CIFAR-10 Batch 5:  Loss: 0.116832681000 Validation Accuracy:  0.776799857616
Epoch 105, CIFAR-10 Batch 1:  Loss: 0.127891525626 Validation Accuracy:  0.788599908352
Epoch 105, CIFAR-10 Batch 2:  Loss: 0.090464040637 Validation Accuracy:  0.789799809456
Epoch 105, CIFAR-10 Batch 3:  Loss: 0.088420227170 Validation Accuracy:  0.794199883938
Epoch 105, CIFAR-10 Batch 4:  Loss: 0.090994261205 Validation Accuracy:  0.780199825764
Epoch 105, CIFAR-10 Batch 5:  Loss: 0.112215593457 Validation Accuracy:  0.788599848747
Epoch 106, CIFAR-10 Batch 1:  Loss: 0.124370776117 Validation Accuracy:  0.788799762726
Epoch 106, CIFAR-10 Batch 2:  Loss: 0.097315579653 Validation Accuracy:  0.797799825668
Epoch 106, CIFAR-10 Batch 3:  Loss: 0.072962015867 Validation Accuracy:  0.787799835205
Epoch 106, CIFAR-10 Batch 4:  Loss: 0.090929843485 Validation Accuracy:  0.790599763393
Epoch 106, CIFAR-10 Batch 5:  Loss: 0.101120926440 Validation Accuracy:  0.789599835873
Epoch 107, CIFAR-10 Batch 1:  Loss: 0.128511399031 Validation Accuracy:  0.788199841976
Epoch 107, CIFAR-10 Batch 2:  Loss: 0.100245267153 Validation Accuracy:  0.796599805355
Epoch 107, CIFAR-10 Batch 3:  Loss: 0.105315670371 Validation Accuracy:  0.782599866390
Epoch 107, CIFAR-10 Batch 4:  Loss: 0.083455398679 Validation Accuracy:  0.780399918556
Epoch 107, CIFAR-10 Batch 5:  Loss: 0.108408495784 Validation Accuracy:  0.789399802685
Epoch 108, CIFAR-10 Batch 1:  Loss: 0.112488180399 Validation Accuracy:  0.796399772167
Epoch 108, CIFAR-10 Batch 2:  Loss: 0.090489491820 Validation Accuracy:  0.798599720001
Epoch 108, CIFAR-10 Batch 3:  Loss: 0.072793669999 Validation Accuracy:  0.794799804688
Epoch 108, CIFAR-10 Batch 4:  Loss: 0.082434684038 Validation Accuracy:  0.787599742413
Epoch 108, CIFAR-10 Batch 5:  Loss: 0.127865865827 Validation Accuracy:  0.777799785137
Epoch 109, CIFAR-10 Batch 1:  Loss: 0.119999736547 Validation Accuracy:  0.790599822998
Epoch 109, CIFAR-10 Batch 2:  Loss: 0.106131926179 Validation Accuracy:  0.789999842644
Epoch 109, CIFAR-10 Batch 3:  Loss: 0.077356293797 Validation Accuracy:  0.798999845982
Epoch 109, CIFAR-10 Batch 4:  Loss: 0.080625072122 Validation Accuracy:  0.800199866295
Epoch 109, CIFAR-10 Batch 5:  Loss: 0.093015231192 Validation Accuracy:  0.788799822330
Epoch 110, CIFAR-10 Batch 1:  Loss: 0.103567212820 Validation Accuracy:  0.789999842644
Epoch 110, CIFAR-10 Batch 2:  Loss: 0.087229818106 Validation Accuracy:  0.792599856853
Epoch 110, CIFAR-10 Batch 3:  Loss: 0.082365043461 Validation Accuracy:  0.793799757957
Epoch 110, CIFAR-10 Batch 4:  Loss: 0.076323002577 Validation Accuracy:  0.784599840641
Epoch 110, CIFAR-10 Batch 5:  Loss: 0.088896512985 Validation Accuracy:  0.790399789810
Epoch 111, CIFAR-10 Batch 1:  Loss: 0.123642809689 Validation Accuracy:  0.790999829769
Epoch 111, CIFAR-10 Batch 2:  Loss: 0.086029767990 Validation Accuracy:  0.785199820995
Epoch 111, CIFAR-10 Batch 3:  Loss: 0.081354081631 Validation Accuracy:  0.794999837875
Epoch 111, CIFAR-10 Batch 4:  Loss: 0.095035627484 Validation Accuracy:  0.777599811554
Epoch 111, CIFAR-10 Batch 5:  Loss: 0.111384332180 Validation Accuracy:  0.776199817657
Epoch 112, CIFAR-10 Batch 1:  Loss: 0.108585439622 Validation Accuracy:  0.788999855518
Epoch 112, CIFAR-10 Batch 2:  Loss: 0.099510930479 Validation Accuracy:  0.792199790478
Epoch 112, CIFAR-10 Batch 3:  Loss: 0.086810201406 Validation Accuracy:  0.795999765396
Epoch 112, CIFAR-10 Batch 4:  Loss: 0.073484867811 Validation Accuracy:  0.791199803352
Epoch 112, CIFAR-10 Batch 5:  Loss: 0.082145988941 Validation Accuracy:  0.788199841976
Epoch 113, CIFAR-10 Batch 1:  Loss: 0.109537795186 Validation Accuracy:  0.791599810123
Epoch 113, CIFAR-10 Batch 2:  Loss: 0.087698362768 Validation Accuracy:  0.791599810123
Epoch 113, CIFAR-10 Batch 3:  Loss: 0.082665264606 Validation Accuracy:  0.789399862289
Epoch 113, CIFAR-10 Batch 4:  Loss: 0.087408423424 Validation Accuracy:  0.794999837875
Epoch 113, CIFAR-10 Batch 5:  Loss: 0.096932381392 Validation Accuracy:  0.785999834538
Epoch 114, CIFAR-10 Batch 1:  Loss: 0.107851915061 Validation Accuracy:  0.793399810791
Epoch 114, CIFAR-10 Batch 2:  Loss: 0.090136207640 Validation Accuracy:  0.778199791908
Epoch 114, CIFAR-10 Batch 3:  Loss: 0.066253691912 Validation Accuracy:  0.797799766064
Epoch 114, CIFAR-10 Batch 4:  Loss: 0.075864113867 Validation Accuracy:  0.794399857521
Epoch 114, CIFAR-10 Batch 5:  Loss: 0.111096635461 Validation Accuracy:  0.780599832535
Epoch 115, CIFAR-10 Batch 1:  Loss: 0.102037161589 Validation Accuracy:  0.793199837208
Epoch 115, CIFAR-10 Batch 2:  Loss: 0.069536611438 Validation Accuracy:  0.804999768734
Epoch 115, CIFAR-10 Batch 3:  Loss: 0.072309799492 Validation Accuracy:  0.790199875832
Epoch 115, CIFAR-10 Batch 4:  Loss: 0.075503721833 Validation Accuracy:  0.789999902248
Epoch 115, CIFAR-10 Batch 5:  Loss: 0.090073317289 Validation Accuracy:  0.783799886703
Epoch 116, CIFAR-10 Batch 1:  Loss: 0.116660609841 Validation Accuracy:  0.785599768162
Epoch 116, CIFAR-10 Batch 2:  Loss: 0.072029992938 Validation Accuracy:  0.792999863625
Epoch 116, CIFAR-10 Batch 3:  Loss: 0.076337240636 Validation Accuracy:  0.793599784374
Epoch 116, CIFAR-10 Batch 4:  Loss: 0.089210465550 Validation Accuracy:  0.777999818325
Epoch 116, CIFAR-10 Batch 5:  Loss: 0.091651052237 Validation Accuracy:  0.781199872494
Epoch 117, CIFAR-10 Batch 1:  Loss: 0.102171704173 Validation Accuracy:  0.787799835205
Epoch 117, CIFAR-10 Batch 2:  Loss: 0.070313759148 Validation Accuracy:  0.797999799252
Epoch 117, CIFAR-10 Batch 3:  Loss: 0.064549066126 Validation Accuracy:  0.798799872398
Epoch 117, CIFAR-10 Batch 4:  Loss: 0.065177179873 Validation Accuracy:  0.789999842644
Epoch 117, CIFAR-10 Batch 5:  Loss: 0.087513357401 Validation Accuracy:  0.791399836540
Epoch 118, CIFAR-10 Batch 1:  Loss: 0.104187607765 Validation Accuracy:  0.788199841976
Epoch 118, CIFAR-10 Batch 2:  Loss: 0.066011115909 Validation Accuracy:  0.785199820995
Epoch 118, CIFAR-10 Batch 3:  Loss: 0.062497887760 Validation Accuracy:  0.798599779606
Epoch 118, CIFAR-10 Batch 4:  Loss: 0.068183533847 Validation Accuracy:  0.791999816895
Epoch 118, CIFAR-10 Batch 5:  Loss: 0.082940623164 Validation Accuracy:  0.786199808121
Epoch 119, CIFAR-10 Batch 1:  Loss: 0.102268710732 Validation Accuracy:  0.793799757957
Epoch 119, CIFAR-10 Batch 2:  Loss: 0.081581667066 Validation Accuracy:  0.784599840641
Epoch 119, CIFAR-10 Batch 3:  Loss: 0.063801102340 Validation Accuracy:  0.795799851418
Epoch 119, CIFAR-10 Batch 4:  Loss: 0.071841739118 Validation Accuracy:  0.793199777603
Epoch 119, CIFAR-10 Batch 5:  Loss: 0.089389279485 Validation Accuracy:  0.786199867725
Epoch 120, CIFAR-10 Batch 1:  Loss: 0.098721556365 Validation Accuracy:  0.792199790478
Epoch 120, CIFAR-10 Batch 2:  Loss: 0.081800788641 Validation Accuracy:  0.794999837875
Epoch 120, CIFAR-10 Batch 3:  Loss: 0.051763482392 Validation Accuracy:  0.797999858856
Epoch 120, CIFAR-10 Batch 4:  Loss: 0.071526974440 Validation Accuracy:  0.791999816895
Epoch 120, CIFAR-10 Batch 5:  Loss: 0.073511950672 Validation Accuracy:  0.795199811459
Epoch 121, CIFAR-10 Batch 1:  Loss: 0.101423062384 Validation Accuracy:  0.794399797916
Epoch 121, CIFAR-10 Batch 2:  Loss: 0.091313287616 Validation Accuracy:  0.795799791813
Epoch 121, CIFAR-10 Batch 3:  Loss: 0.059690047055 Validation Accuracy:  0.797199785709
Epoch 121, CIFAR-10 Batch 4:  Loss: 0.077623449266 Validation Accuracy:  0.789999783039
Epoch 121, CIFAR-10 Batch 5:  Loss: 0.084751576185 Validation Accuracy:  0.792399823666
Epoch 122, CIFAR-10 Batch 1:  Loss: 0.087748833001 Validation Accuracy:  0.789799809456
Epoch 122, CIFAR-10 Batch 2:  Loss: 0.070287048817 Validation Accuracy:  0.789199829102
Epoch 122, CIFAR-10 Batch 3:  Loss: 0.059514649212 Validation Accuracy:  0.790199875832
Epoch 122, CIFAR-10 Batch 4:  Loss: 0.078862920403 Validation Accuracy:  0.789599776268
Epoch 122, CIFAR-10 Batch 5:  Loss: 0.075889170170 Validation Accuracy:  0.789199829102
Epoch 123, CIFAR-10 Batch 1:  Loss: 0.093459278345 Validation Accuracy:  0.790799736977
Epoch 123, CIFAR-10 Batch 2:  Loss: 0.066013984382 Validation Accuracy:  0.789999783039
Epoch 123, CIFAR-10 Batch 3:  Loss: 0.074380785227 Validation Accuracy:  0.795199751854
Epoch 123, CIFAR-10 Batch 4:  Loss: 0.063388966024 Validation Accuracy:  0.797199845314
Epoch 123, CIFAR-10 Batch 5:  Loss: 0.079443044960 Validation Accuracy:  0.791199862957
Epoch 124, CIFAR-10 Batch 1:  Loss: 0.104594737291 Validation Accuracy:  0.793799817562
Epoch 124, CIFAR-10 Batch 2:  Loss: 0.069325134158 Validation Accuracy:  0.794999778271
Epoch 124, CIFAR-10 Batch 3:  Loss: 0.076421983540 Validation Accuracy:  0.795199811459
Epoch 124, CIFAR-10 Batch 4:  Loss: 0.061329640448 Validation Accuracy:  0.801999807358
Epoch 124, CIFAR-10 Batch 5:  Loss: 0.077956378460 Validation Accuracy:  0.792599797249
Epoch 125, CIFAR-10 Batch 1:  Loss: 0.099486112595 Validation Accuracy:  0.791599869728
Epoch 125, CIFAR-10 Batch 2:  Loss: 0.075262516737 Validation Accuracy:  0.793399810791
Epoch 125, CIFAR-10 Batch 3:  Loss: 0.053085006773 Validation Accuracy:  0.791199803352
Epoch 125, CIFAR-10 Batch 4:  Loss: 0.070250272751 Validation Accuracy:  0.789199829102
Epoch 125, CIFAR-10 Batch 5:  Loss: 0.068519048393 Validation Accuracy:  0.788599729538
Epoch 126, CIFAR-10 Batch 1:  Loss: 0.094445437193 Validation Accuracy:  0.792799770832
Epoch 126, CIFAR-10 Batch 2:  Loss: 0.075857840478 Validation Accuracy:  0.785599768162
Epoch 126, CIFAR-10 Batch 3:  Loss: 0.059225812554 Validation Accuracy:  0.801199793816
Epoch 126, CIFAR-10 Batch 4:  Loss: 0.052606955171 Validation Accuracy:  0.798399865627
Epoch 126, CIFAR-10 Batch 5:  Loss: 0.086177885532 Validation Accuracy:  0.785399854183
Epoch 127, CIFAR-10 Batch 1:  Loss: 0.085455775261 Validation Accuracy:  0.804399847984
Epoch 127, CIFAR-10 Batch 2:  Loss: 0.071719996631 Validation Accuracy:  0.790599763393
Epoch 127, CIFAR-10 Batch 3:  Loss: 0.054505363107 Validation Accuracy:  0.793399870396
Epoch 127, CIFAR-10 Batch 4:  Loss: 0.057489678264 Validation Accuracy:  0.791199862957
Epoch 127, CIFAR-10 Batch 5:  Loss: 0.093158349395 Validation Accuracy:  0.774799823761
Epoch 128, CIFAR-10 Batch 1:  Loss: 0.094169728458 Validation Accuracy:  0.795999765396
Epoch 128, CIFAR-10 Batch 2:  Loss: 0.066569924355 Validation Accuracy:  0.796599745750
Epoch 128, CIFAR-10 Batch 3:  Loss: 0.050805103034 Validation Accuracy:  0.801399827003
Epoch 128, CIFAR-10 Batch 4:  Loss: 0.058887381107 Validation Accuracy:  0.787799835205
Epoch 128, CIFAR-10 Batch 5:  Loss: 0.084777131677 Validation Accuracy:  0.781799793243
Epoch 129, CIFAR-10 Batch 1:  Loss: 0.094615854323 Validation Accuracy:  0.793599843979
Epoch 129, CIFAR-10 Batch 2:  Loss: 0.057354878634 Validation Accuracy:  0.794799864292
Epoch 129, CIFAR-10 Batch 3:  Loss: 0.044547367841 Validation Accuracy:  0.794399797916
Epoch 129, CIFAR-10 Batch 4:  Loss: 0.077636525035 Validation Accuracy:  0.785399794579
Epoch 129, CIFAR-10 Batch 5:  Loss: 0.092844419181 Validation Accuracy:  0.781799852848
Epoch 130, CIFAR-10 Batch 1:  Loss: 0.088758178055 Validation Accuracy:  0.799199819565
Epoch 130, CIFAR-10 Batch 2:  Loss: 0.060393325984 Validation Accuracy:  0.796799778938
Epoch 130, CIFAR-10 Batch 3:  Loss: 0.049900010228 Validation Accuracy:  0.800799846649
Epoch 130, CIFAR-10 Batch 4:  Loss: 0.054531253874 Validation Accuracy:  0.792999744415
Epoch 130, CIFAR-10 Batch 5:  Loss: 0.050098642707 Validation Accuracy:  0.795999825001
Epoch 131, CIFAR-10 Batch 1:  Loss: 0.075154706836 Validation Accuracy:  0.796599864960
Epoch 131, CIFAR-10 Batch 2:  Loss: 0.050485964864 Validation Accuracy:  0.797799825668
Epoch 131, CIFAR-10 Batch 3:  Loss: 0.048979762942 Validation Accuracy:  0.796399712563
Epoch 131, CIFAR-10 Batch 4:  Loss: 0.063261523843 Validation Accuracy:  0.788399815559
Epoch 131, CIFAR-10 Batch 5:  Loss: 0.075765095651 Validation Accuracy:  0.790799796581
Epoch 132, CIFAR-10 Batch 1:  Loss: 0.107691206038 Validation Accuracy:  0.791599750519
Epoch 132, CIFAR-10 Batch 2:  Loss: 0.052626185119 Validation Accuracy:  0.793799757957
Epoch 132, CIFAR-10 Batch 3:  Loss: 0.053158111870 Validation Accuracy:  0.791399776936
Epoch 132, CIFAR-10 Batch 4:  Loss: 0.064970642328 Validation Accuracy:  0.788399815559
Epoch 132, CIFAR-10 Batch 5:  Loss: 0.065171614289 Validation Accuracy:  0.784599840641
Epoch 133, CIFAR-10 Batch 1:  Loss: 0.084272548556 Validation Accuracy:  0.793599784374
Epoch 133, CIFAR-10 Batch 2:  Loss: 0.063922606409 Validation Accuracy:  0.796199917793
Epoch 133, CIFAR-10 Batch 3:  Loss: 0.060899004340 Validation Accuracy:  0.794799745083
Epoch 133, CIFAR-10 Batch 4:  Loss: 0.051084689796 Validation Accuracy:  0.789799749851
Epoch 133, CIFAR-10 Batch 5:  Loss: 0.065627515316 Validation Accuracy:  0.792399764061
Epoch 134, CIFAR-10 Batch 1:  Loss: 0.102901637554 Validation Accuracy:  0.797399759293
Epoch 134, CIFAR-10 Batch 2:  Loss: 0.058984719217 Validation Accuracy:  0.798999786377
Epoch 134, CIFAR-10 Batch 3:  Loss: 0.051211774349 Validation Accuracy:  0.795399785042
Epoch 134, CIFAR-10 Batch 4:  Loss: 0.052430413663 Validation Accuracy:  0.790599822998
Epoch 134, CIFAR-10 Batch 5:  Loss: 0.064584888518 Validation Accuracy:  0.785799741745
Epoch 135, CIFAR-10 Batch 1:  Loss: 0.076259434223 Validation Accuracy:  0.798999845982
Epoch 135, CIFAR-10 Batch 2:  Loss: 0.068743571639 Validation Accuracy:  0.786799907684
Epoch 135, CIFAR-10 Batch 3:  Loss: 0.053594794124 Validation Accuracy:  0.790399849415
Epoch 135, CIFAR-10 Batch 4:  Loss: 0.053108118474 Validation Accuracy:  0.789199769497
Epoch 135, CIFAR-10 Batch 5:  Loss: 0.067796416581 Validation Accuracy:  0.782999813557
Epoch 136, CIFAR-10 Batch 1:  Loss: 0.105554856360 Validation Accuracy:  0.795799791813
Epoch 136, CIFAR-10 Batch 2:  Loss: 0.055089656264 Validation Accuracy:  0.798199772835
Epoch 136, CIFAR-10 Batch 3:  Loss: 0.044564507902 Validation Accuracy:  0.797799825668
Epoch 136, CIFAR-10 Batch 4:  Loss: 0.057356406003 Validation Accuracy:  0.794999837875
Epoch 136, CIFAR-10 Batch 5:  Loss: 0.059625554830 Validation Accuracy:  0.792199790478
Epoch 137, CIFAR-10 Batch 1:  Loss: 0.091898441315 Validation Accuracy:  0.790199875832
Epoch 137, CIFAR-10 Batch 2:  Loss: 0.060077767819 Validation Accuracy:  0.795799791813
Epoch 137, CIFAR-10 Batch 3:  Loss: 0.048145845532 Validation Accuracy:  0.803599774837
Epoch 137, CIFAR-10 Batch 4:  Loss: 0.053717192262 Validation Accuracy:  0.790199756622
Epoch 137, CIFAR-10 Batch 5:  Loss: 0.058141767979 Validation Accuracy:  0.787199795246
Epoch 138, CIFAR-10 Batch 1:  Loss: 0.071351557970 Validation Accuracy:  0.797999858856
Epoch 138, CIFAR-10 Batch 2:  Loss: 0.058279842138 Validation Accuracy:  0.789199829102
Epoch 138, CIFAR-10 Batch 3:  Loss: 0.052094452083 Validation Accuracy:  0.788199782372
Epoch 138, CIFAR-10 Batch 4:  Loss: 0.060129828751 Validation Accuracy:  0.785199820995
Epoch 138, CIFAR-10 Batch 5:  Loss: 0.052529163659 Validation Accuracy:  0.798199772835
Epoch 139, CIFAR-10 Batch 1:  Loss: 0.060819491744 Validation Accuracy:  0.801199793816
Epoch 139, CIFAR-10 Batch 2:  Loss: 0.055884752423 Validation Accuracy:  0.794199824333
Epoch 139, CIFAR-10 Batch 3:  Loss: 0.057128988206 Validation Accuracy:  0.796599745750
Epoch 139, CIFAR-10 Batch 4:  Loss: 0.046796537936 Validation Accuracy:  0.794999837875
Epoch 139, CIFAR-10 Batch 5:  Loss: 0.053542416543 Validation Accuracy:  0.791799783707
Epoch 140, CIFAR-10 Batch 1:  Loss: 0.077829875052 Validation Accuracy:  0.798999786377
Epoch 140, CIFAR-10 Batch 2:  Loss: 0.056087061763 Validation Accuracy:  0.797399759293
Epoch 140, CIFAR-10 Batch 3:  Loss: 0.040213488042 Validation Accuracy:  0.795399785042
Epoch 140, CIFAR-10 Batch 4:  Loss: 0.048247165978 Validation Accuracy:  0.793799817562
Epoch 140, CIFAR-10 Batch 5:  Loss: 0.064733669162 Validation Accuracy:  0.780599772930
Epoch 141, CIFAR-10 Batch 1:  Loss: 0.080635137856 Validation Accuracy:  0.795999825001
Epoch 141, CIFAR-10 Batch 2:  Loss: 0.065808072686 Validation Accuracy:  0.784999847412
Epoch 141, CIFAR-10 Batch 3:  Loss: 0.054816842079 Validation Accuracy:  0.795599758625
Epoch 141, CIFAR-10 Batch 4:  Loss: 0.054302886128 Validation Accuracy:  0.789999783039
Epoch 141, CIFAR-10 Batch 5:  Loss: 0.059605427086 Validation Accuracy:  0.791799783707
Epoch 142, CIFAR-10 Batch 1:  Loss: 0.070381812751 Validation Accuracy:  0.796599805355
Epoch 142, CIFAR-10 Batch 2:  Loss: 0.061043299735 Validation Accuracy:  0.789599835873
Epoch 142, CIFAR-10 Batch 3:  Loss: 0.045518174767 Validation Accuracy:  0.797399818897
Epoch 142, CIFAR-10 Batch 4:  Loss: 0.052015811205 Validation Accuracy:  0.788999855518
Epoch 142, CIFAR-10 Batch 5:  Loss: 0.055909067392 Validation Accuracy:  0.789799869061
Epoch 143, CIFAR-10 Batch 1:  Loss: 0.092622309923 Validation Accuracy:  0.795199811459
Epoch 143, CIFAR-10 Batch 2:  Loss: 0.057967867702 Validation Accuracy:  0.790399789810
Epoch 143, CIFAR-10 Batch 3:  Loss: 0.043744057417 Validation Accuracy:  0.798199772835
Epoch 143, CIFAR-10 Batch 4:  Loss: 0.048500698060 Validation Accuracy:  0.802999794483
Epoch 143, CIFAR-10 Batch 5:  Loss: 0.063232026994 Validation Accuracy:  0.787199795246
Epoch 144, CIFAR-10 Batch 1:  Loss: 0.083653621376 Validation Accuracy:  0.794799864292
Epoch 144, CIFAR-10 Batch 2:  Loss: 0.087717235088 Validation Accuracy:  0.785399794579
Epoch 144, CIFAR-10 Batch 3:  Loss: 0.057716280222 Validation Accuracy:  0.791399836540
Epoch 144, CIFAR-10 Batch 4:  Loss: 0.049566559494 Validation Accuracy:  0.794999837875
Epoch 144, CIFAR-10 Batch 5:  Loss: 0.058422442526 Validation Accuracy:  0.790399849415
Epoch 145, CIFAR-10 Batch 1:  Loss: 0.077878423035 Validation Accuracy:  0.786999881268
Epoch 145, CIFAR-10 Batch 2:  Loss: 0.060889370739 Validation Accuracy:  0.790599763393
Epoch 145, CIFAR-10 Batch 3:  Loss: 0.040707346052 Validation Accuracy:  0.798399746418
Epoch 145, CIFAR-10 Batch 4:  Loss: 0.054656572640 Validation Accuracy:  0.801199793816
Epoch 145, CIFAR-10 Batch 5:  Loss: 0.071618348360 Validation Accuracy:  0.779199779034
Epoch 146, CIFAR-10 Batch 1:  Loss: 0.072394497693 Validation Accuracy:  0.791999757290
Epoch 146, CIFAR-10 Batch 2:  Loss: 0.061363451183 Validation Accuracy:  0.794199824333
Epoch 146, CIFAR-10 Batch 3:  Loss: 0.070998154581 Validation Accuracy:  0.773799836636
Epoch 146, CIFAR-10 Batch 4:  Loss: 0.051344141364 Validation Accuracy:  0.784999847412
Epoch 146, CIFAR-10 Batch 5:  Loss: 0.055386755615 Validation Accuracy:  0.793799877167
Epoch 147, CIFAR-10 Batch 1:  Loss: 0.090166971087 Validation Accuracy:  0.787199795246
Epoch 147, CIFAR-10 Batch 2:  Loss: 0.052984938025 Validation Accuracy:  0.794399797916
Epoch 147, CIFAR-10 Batch 3:  Loss: 0.040731161833 Validation Accuracy:  0.789999783039
Epoch 147, CIFAR-10 Batch 4:  Loss: 0.052784353495 Validation Accuracy:  0.792799830437
Epoch 147, CIFAR-10 Batch 5:  Loss: 0.049727767706 Validation Accuracy:  0.798399806023
Epoch 148, CIFAR-10 Batch 1:  Loss: 0.066581055522 Validation Accuracy:  0.794999837875
Epoch 148, CIFAR-10 Batch 2:  Loss: 0.051277317107 Validation Accuracy:  0.787399828434
Epoch 148, CIFAR-10 Batch 3:  Loss: 0.055078782141 Validation Accuracy:  0.788399815559
Epoch 148, CIFAR-10 Batch 4:  Loss: 0.053307253867 Validation Accuracy:  0.792799830437
Epoch 148, CIFAR-10 Batch 5:  Loss: 0.055063262582 Validation Accuracy:  0.782199859619
Epoch 149, CIFAR-10 Batch 1:  Loss: 0.067533150315 Validation Accuracy:  0.792799830437
Epoch 149, CIFAR-10 Batch 2:  Loss: 0.047513153404 Validation Accuracy:  0.787199854851
Epoch 149, CIFAR-10 Batch 3:  Loss: 0.052310347557 Validation Accuracy:  0.787199795246
Epoch 149, CIFAR-10 Batch 4:  Loss: 0.047447197139 Validation Accuracy:  0.795599818230
Epoch 149, CIFAR-10 Batch 5:  Loss: 0.048922229558 Validation Accuracy:  0.791399776936
Epoch 150, CIFAR-10 Batch 1:  Loss: 0.064670644701 Validation Accuracy:  0.797999799252
Epoch 150, CIFAR-10 Batch 2:  Loss: 0.048357892781 Validation Accuracy:  0.793999791145
Epoch 150, CIFAR-10 Batch 3:  Loss: 0.038494098932 Validation Accuracy:  0.795599758625
Epoch 150, CIFAR-10 Batch 4:  Loss: 0.058504715562 Validation Accuracy:  0.786999821663
Epoch 150, CIFAR-10 Batch 5:  Loss: 0.046565413475 Validation Accuracy:  0.785999834538
Epoch 151, CIFAR-10 Batch 1:  Loss: 0.085869379342 Validation Accuracy:  0.785999834538
Epoch 151, CIFAR-10 Batch 2:  Loss: 0.054358113557 Validation Accuracy:  0.795199751854
Epoch 151, CIFAR-10 Batch 3:  Loss: 0.043398242444 Validation Accuracy:  0.792599797249
Epoch 151, CIFAR-10 Batch 4:  Loss: 0.043627250940 Validation Accuracy:  0.794399857521
Epoch 151, CIFAR-10 Batch 5:  Loss: 0.051187548786 Validation Accuracy:  0.791199803352
Epoch 152, CIFAR-10 Batch 1:  Loss: 0.069967254996 Validation Accuracy:  0.789799869061
Epoch 152, CIFAR-10 Batch 2:  Loss: 0.052251279354 Validation Accuracy:  0.790399849415
Epoch 152, CIFAR-10 Batch 3:  Loss: 0.048884745687 Validation Accuracy:  0.798999786377
Epoch 152, CIFAR-10 Batch 4:  Loss: 0.046331729740 Validation Accuracy:  0.790999829769
Epoch 152, CIFAR-10 Batch 5:  Loss: 0.051782794297 Validation Accuracy:  0.791599810123
Epoch 153, CIFAR-10 Batch 1:  Loss: 0.086913570762 Validation Accuracy:  0.786399781704
Epoch 153, CIFAR-10 Batch 2:  Loss: 0.066767215729 Validation Accuracy:  0.792799830437
Epoch 153, CIFAR-10 Batch 3:  Loss: 0.043668493629 Validation Accuracy:  0.791599869728
Epoch 153, CIFAR-10 Batch 4:  Loss: 0.049859449267 Validation Accuracy:  0.795399785042
Epoch 153, CIFAR-10 Batch 5:  Loss: 0.039900474250 Validation Accuracy:  0.789599895477
Epoch 154, CIFAR-10 Batch 1:  Loss: 0.074962064624 Validation Accuracy:  0.794399857521
Epoch 154, CIFAR-10 Batch 2:  Loss: 0.062207214534 Validation Accuracy:  0.786399841309
Epoch 154, CIFAR-10 Batch 3:  Loss: 0.063211850822 Validation Accuracy:  0.786999940872
Epoch 154, CIFAR-10 Batch 4:  Loss: 0.041763249785 Validation Accuracy:  0.790599882603
Epoch 154, CIFAR-10 Batch 5:  Loss: 0.044520542026 Validation Accuracy:  0.792999804020
Epoch 155, CIFAR-10 Batch 1:  Loss: 0.069976374507 Validation Accuracy:  0.797399818897
Epoch 155, CIFAR-10 Batch 2:  Loss: 0.053767893463 Validation Accuracy:  0.786799788475
Epoch 155, CIFAR-10 Batch 3:  Loss: 0.051784582436 Validation Accuracy:  0.785199880600
Epoch 155, CIFAR-10 Batch 4:  Loss: 0.042938977480 Validation Accuracy:  0.790399789810
Epoch 155, CIFAR-10 Batch 5:  Loss: 0.053636871278 Validation Accuracy:  0.786399841309
Epoch 156, CIFAR-10 Batch 1:  Loss: 0.054627314210 Validation Accuracy:  0.801999866962
Epoch 156, CIFAR-10 Batch 2:  Loss: 0.048121694475 Validation Accuracy:  0.791799783707
Epoch 156, CIFAR-10 Batch 3:  Loss: 0.040084894747 Validation Accuracy:  0.791999757290
Epoch 156, CIFAR-10 Batch 4:  Loss: 0.034331843257 Validation Accuracy:  0.799999833107
Epoch 156, CIFAR-10 Batch 5:  Loss: 0.043856777251 Validation Accuracy:  0.791799783707
Epoch 157, CIFAR-10 Batch 1:  Loss: 0.053162820637 Validation Accuracy:  0.795999825001
Epoch 157, CIFAR-10 Batch 2:  Loss: 0.055350888520 Validation Accuracy:  0.792199850082
Epoch 157, CIFAR-10 Batch 3:  Loss: 0.037475626916 Validation Accuracy:  0.798399806023
Epoch 157, CIFAR-10 Batch 4:  Loss: 0.049360223114 Validation Accuracy:  0.789599835873
Epoch 157, CIFAR-10 Batch 5:  Loss: 0.044565450400 Validation Accuracy:  0.797799885273
Epoch 158, CIFAR-10 Batch 1:  Loss: 0.061140850186 Validation Accuracy:  0.802799820900
Epoch 158, CIFAR-10 Batch 2:  Loss: 0.051971711218 Validation Accuracy:  0.788599789143
Epoch 158, CIFAR-10 Batch 3:  Loss: 0.050624184310 Validation Accuracy:  0.795199811459
Epoch 158, CIFAR-10 Batch 4:  Loss: 0.050562508404 Validation Accuracy:  0.790199816227
Epoch 158, CIFAR-10 Batch 5:  Loss: 0.043909549713 Validation Accuracy:  0.796399831772
Epoch 159, CIFAR-10 Batch 1:  Loss: 0.062983505428 Validation Accuracy:  0.793399751186
Epoch 159, CIFAR-10 Batch 2:  Loss: 0.039368577302 Validation Accuracy:  0.799999833107
Epoch 159, CIFAR-10 Batch 3:  Loss: 0.053023252636 Validation Accuracy:  0.788599789143
Epoch 159, CIFAR-10 Batch 4:  Loss: 0.046167016029 Validation Accuracy:  0.803799867630
Epoch 159, CIFAR-10 Batch 5:  Loss: 0.046917982399 Validation Accuracy:  0.796599864960
Epoch 160, CIFAR-10 Batch 1:  Loss: 0.066383101046 Validation Accuracy:  0.796599805355
Epoch 160, CIFAR-10 Batch 2:  Loss: 0.057458959520 Validation Accuracy:  0.797399818897
Epoch 160, CIFAR-10 Batch 3:  Loss: 0.040241181850 Validation Accuracy:  0.804199814796
Epoch 160, CIFAR-10 Batch 4:  Loss: 0.042209632695 Validation Accuracy:  0.796999871731
Epoch 160, CIFAR-10 Batch 5:  Loss: 0.044011637568 Validation Accuracy:  0.795199871063
Epoch 161, CIFAR-10 Batch 1:  Loss: 0.058438323438 Validation Accuracy:  0.799999833107
Epoch 161, CIFAR-10 Batch 2:  Loss: 0.037404105067 Validation Accuracy:  0.798199892044
Epoch 161, CIFAR-10 Batch 3:  Loss: 0.048808529973 Validation Accuracy:  0.789199829102
Epoch 161, CIFAR-10 Batch 4:  Loss: 0.040080051869 Validation Accuracy:  0.795399844646
Epoch 161, CIFAR-10 Batch 5:  Loss: 0.043090093881 Validation Accuracy:  0.799999833107
Epoch 162, CIFAR-10 Batch 1:  Loss: 0.060022644699 Validation Accuracy:  0.794399857521
Epoch 162, CIFAR-10 Batch 2:  Loss: 0.069730192423 Validation Accuracy:  0.789799809456
Epoch 162, CIFAR-10 Batch 3:  Loss: 0.038430590183 Validation Accuracy:  0.794399857521
Epoch 162, CIFAR-10 Batch 4:  Loss: 0.044774752110 Validation Accuracy:  0.798999786377
Epoch 162, CIFAR-10 Batch 5:  Loss: 0.068218320608 Validation Accuracy:  0.776799798012
Epoch 163, CIFAR-10 Batch 1:  Loss: 0.069501556456 Validation Accuracy:  0.798399865627
Epoch 163, CIFAR-10 Batch 2:  Loss: 0.051098912954 Validation Accuracy:  0.790199816227
Epoch 163, CIFAR-10 Batch 3:  Loss: 0.035507958382 Validation Accuracy:  0.795599818230
Epoch 163, CIFAR-10 Batch 4:  Loss: 0.049000963569 Validation Accuracy:  0.797199845314
Epoch 163, CIFAR-10 Batch 5:  Loss: 0.048160880804 Validation Accuracy:  0.792999863625
Epoch 164, CIFAR-10 Batch 1:  Loss: 0.059265404940 Validation Accuracy:  0.797999858856
Epoch 164, CIFAR-10 Batch 2:  Loss: 0.047304201871 Validation Accuracy:  0.798199832439
Epoch 164, CIFAR-10 Batch 3:  Loss: 0.041871685535 Validation Accuracy:  0.789999783039
Epoch 164, CIFAR-10 Batch 4:  Loss: 0.043298617005 Validation Accuracy:  0.794599890709
Epoch 164, CIFAR-10 Batch 5:  Loss: 0.040339350700 Validation Accuracy:  0.786599814892
Epoch 165, CIFAR-10 Batch 1:  Loss: 0.051258079708 Validation Accuracy:  0.797599852085
Epoch 165, CIFAR-10 Batch 2:  Loss: 0.038194701076 Validation Accuracy:  0.796999812126
Epoch 165, CIFAR-10 Batch 3:  Loss: 0.043418504298 Validation Accuracy:  0.788599848747
Epoch 165, CIFAR-10 Batch 4:  Loss: 0.044775150716 Validation Accuracy:  0.790799856186
Epoch 165, CIFAR-10 Batch 5:  Loss: 0.048982478678 Validation Accuracy:  0.791799783707
Epoch 166, CIFAR-10 Batch 1:  Loss: 0.043891347945 Validation Accuracy:  0.802599787712
Epoch 166, CIFAR-10 Batch 2:  Loss: 0.048037998378 Validation Accuracy:  0.801999807358
Epoch 166, CIFAR-10 Batch 3:  Loss: 0.037603009492 Validation Accuracy:  0.791799783707
Epoch 166, CIFAR-10 Batch 4:  Loss: 0.039642766118 Validation Accuracy:  0.789799809456
Epoch 166, CIFAR-10 Batch 5:  Loss: 0.042355686426 Validation Accuracy:  0.799399852753
Epoch 167, CIFAR-10 Batch 1:  Loss: 0.048038929701 Validation Accuracy:  0.801599800587
Epoch 167, CIFAR-10 Batch 2:  Loss: 0.056397862732 Validation Accuracy:  0.802199780941
Epoch 167, CIFAR-10 Batch 3:  Loss: 0.043357726187 Validation Accuracy:  0.790199816227
Epoch 167, CIFAR-10 Batch 4:  Loss: 0.040429309011 Validation Accuracy:  0.799399793148
Epoch 167, CIFAR-10 Batch 5:  Loss: 0.044699758291 Validation Accuracy:  0.789399862289
Epoch 168, CIFAR-10 Batch 1:  Loss: 0.062161851674 Validation Accuracy:  0.780799806118
Epoch 168, CIFAR-10 Batch 2:  Loss: 0.055087469518 Validation Accuracy:  0.792799830437
Epoch 168, CIFAR-10 Batch 3:  Loss: 0.045294176787 Validation Accuracy:  0.795199811459
Epoch 168, CIFAR-10 Batch 4:  Loss: 0.035907424986 Validation Accuracy:  0.794999778271
Epoch 168, CIFAR-10 Batch 5:  Loss: 0.042422302067 Validation Accuracy:  0.794799804688
Epoch 169, CIFAR-10 Batch 1:  Loss: 0.058711059391 Validation Accuracy:  0.793799757957
Epoch 169, CIFAR-10 Batch 2:  Loss: 0.039301667362 Validation Accuracy:  0.794999837875
Epoch 169, CIFAR-10 Batch 3:  Loss: 0.044416129589 Validation Accuracy:  0.798999786377
Epoch 169, CIFAR-10 Batch 4:  Loss: 0.049145750701 Validation Accuracy:  0.791799843311
Epoch 169, CIFAR-10 Batch 5:  Loss: 0.042327906936 Validation Accuracy:  0.800399780273
Epoch 170, CIFAR-10 Batch 1:  Loss: 0.052375417203 Validation Accuracy:  0.804599821568
Epoch 170, CIFAR-10 Batch 2:  Loss: 0.044934716076 Validation Accuracy:  0.796799838543
Epoch 170, CIFAR-10 Batch 3:  Loss: 0.034937255085 Validation Accuracy:  0.793999791145
Epoch 170, CIFAR-10 Batch 4:  Loss: 0.044768944383 Validation Accuracy:  0.794399797916
Epoch 170, CIFAR-10 Batch 5:  Loss: 0.039391167462 Validation Accuracy:  0.796399831772
Epoch 171, CIFAR-10 Batch 1:  Loss: 0.057974029332 Validation Accuracy:  0.795599758625
Epoch 171, CIFAR-10 Batch 2:  Loss: 0.042152650654 Validation Accuracy:  0.801799774170
Epoch 171, CIFAR-10 Batch 3:  Loss: 0.043691251427 Validation Accuracy:  0.799999833107
Epoch 171, CIFAR-10 Batch 4:  Loss: 0.039739325643 Validation Accuracy:  0.797199845314
Epoch 171, CIFAR-10 Batch 5:  Loss: 0.027663430199 Validation Accuracy:  0.810199797153
Epoch 172, CIFAR-10 Batch 1:  Loss: 0.049626544118 Validation Accuracy:  0.800199806690
Epoch 172, CIFAR-10 Batch 2:  Loss: 0.038130123168 Validation Accuracy:  0.800199866295
Epoch 172, CIFAR-10 Batch 3:  Loss: 0.033213876188 Validation Accuracy:  0.801199853420
Epoch 172, CIFAR-10 Batch 4:  Loss: 0.042054846883 Validation Accuracy:  0.790799856186
Epoch 172, CIFAR-10 Batch 5:  Loss: 0.039253521711 Validation Accuracy:  0.795599877834
Epoch 173, CIFAR-10 Batch 1:  Loss: 0.066894784570 Validation Accuracy:  0.790799856186
Epoch 173, CIFAR-10 Batch 2:  Loss: 0.051628492773 Validation Accuracy:  0.796199798584
Epoch 173, CIFAR-10 Batch 3:  Loss: 0.030672945082 Validation Accuracy:  0.798799812794
Epoch 173, CIFAR-10 Batch 4:  Loss: 0.041313752532 Validation Accuracy:  0.802199840546
Epoch 173, CIFAR-10 Batch 5:  Loss: 0.036047853529 Validation Accuracy:  0.795399844646
Epoch 174, CIFAR-10 Batch 1:  Loss: 0.055252678692 Validation Accuracy:  0.803199768066
Epoch 174, CIFAR-10 Batch 2:  Loss: 0.033881612122 Validation Accuracy:  0.806199789047
Epoch 174, CIFAR-10 Batch 3:  Loss: 0.030367534608 Validation Accuracy:  0.800199806690
Epoch 174, CIFAR-10 Batch 4:  Loss: 0.036594025791 Validation Accuracy:  0.792999804020
Epoch 174, CIFAR-10 Batch 5:  Loss: 0.033473037183 Validation Accuracy:  0.793799817562
Epoch 175, CIFAR-10 Batch 1:  Loss: 0.051504775882 Validation Accuracy:  0.802799761295
Epoch 175, CIFAR-10 Batch 2:  Loss: 0.034635327756 Validation Accuracy:  0.803399860859
Epoch 175, CIFAR-10 Batch 3:  Loss: 0.029328241944 Validation Accuracy:  0.808599829674
Epoch 175, CIFAR-10 Batch 4:  Loss: 0.043339081109 Validation Accuracy:  0.800199806690
Epoch 175, CIFAR-10 Batch 5:  Loss: 0.041921079159 Validation Accuracy:  0.794999778271
Epoch 176, CIFAR-10 Batch 1:  Loss: 0.050309602171 Validation Accuracy:  0.803999722004
Epoch 176, CIFAR-10 Batch 2:  Loss: 0.030934397131 Validation Accuracy:  0.805199801922
Epoch 176, CIFAR-10 Batch 3:  Loss: 0.033031482249 Validation Accuracy:  0.799999833107
Epoch 176, CIFAR-10 Batch 4:  Loss: 0.042538736016 Validation Accuracy:  0.786999821663
Epoch 176, CIFAR-10 Batch 5:  Loss: 0.047042760998 Validation Accuracy:  0.783399820328
Epoch 177, CIFAR-10 Batch 1:  Loss: 0.045161362737 Validation Accuracy:  0.800199806690
Epoch 177, CIFAR-10 Batch 2:  Loss: 0.051641914994 Validation Accuracy:  0.799999773502
Epoch 177, CIFAR-10 Batch 3:  Loss: 0.041832521558 Validation Accuracy:  0.788199782372
Epoch 177, CIFAR-10 Batch 4:  Loss: 0.045652519912 Validation Accuracy:  0.793599843979
Epoch 177, CIFAR-10 Batch 5:  Loss: 0.036770034581 Validation Accuracy:  0.792999804020
Epoch 178, CIFAR-10 Batch 1:  Loss: 0.050118338317 Validation Accuracy:  0.797799825668
Epoch 178, CIFAR-10 Batch 2:  Loss: 0.042259540409 Validation Accuracy:  0.790799796581
Epoch 178, CIFAR-10 Batch 3:  Loss: 0.029699988663 Validation Accuracy:  0.797999799252
Epoch 178, CIFAR-10 Batch 4:  Loss: 0.040134377778 Validation Accuracy:  0.794199824333
Epoch 178, CIFAR-10 Batch 5:  Loss: 0.040134072304 Validation Accuracy:  0.788999855518
Epoch 179, CIFAR-10 Batch 1:  Loss: 0.044671222568 Validation Accuracy:  0.799999833107
Epoch 179, CIFAR-10 Batch 2:  Loss: 0.042622163892 Validation Accuracy:  0.803399860859
Epoch 179, CIFAR-10 Batch 3:  Loss: 0.030074736103 Validation Accuracy:  0.791799783707
Epoch 179, CIFAR-10 Batch 4:  Loss: 0.040306866169 Validation Accuracy:  0.804199814796
Epoch 179, CIFAR-10 Batch 5:  Loss: 0.037352532148 Validation Accuracy:  0.791199862957
Epoch 180, CIFAR-10 Batch 1:  Loss: 0.054712895304 Validation Accuracy:  0.798799812794
Epoch 180, CIFAR-10 Batch 2:  Loss: 0.039349623024 Validation Accuracy:  0.793799817562
Epoch 180, CIFAR-10 Batch 3:  Loss: 0.039333250374 Validation Accuracy:  0.789999783039
Epoch 180, CIFAR-10 Batch 4:  Loss: 0.040991827846 Validation Accuracy:  0.786999821663
Epoch 180, CIFAR-10 Batch 5:  Loss: 0.035416740924 Validation Accuracy:  0.796799778938
Epoch 181, CIFAR-10 Batch 1:  Loss: 0.054572030902 Validation Accuracy:  0.794999778271
Epoch 181, CIFAR-10 Batch 2:  Loss: 0.042547468096 Validation Accuracy:  0.791199803352
Epoch 181, CIFAR-10 Batch 3:  Loss: 0.031025400385 Validation Accuracy:  0.801599740982
Epoch 181, CIFAR-10 Batch 4:  Loss: 0.036893691868 Validation Accuracy:  0.801999807358
Epoch 181, CIFAR-10 Batch 5:  Loss: 0.036709409207 Validation Accuracy:  0.790799856186
Epoch 182, CIFAR-10 Batch 1:  Loss: 0.048246812075 Validation Accuracy:  0.800399839878
Epoch 182, CIFAR-10 Batch 2:  Loss: 0.035401567817 Validation Accuracy:  0.802799820900
Epoch 182, CIFAR-10 Batch 3:  Loss: 0.041720826179 Validation Accuracy:  0.801599860191
Epoch 182, CIFAR-10 Batch 4:  Loss: 0.040692448616 Validation Accuracy:  0.796199798584
Epoch 182, CIFAR-10 Batch 5:  Loss: 0.037288557738 Validation Accuracy:  0.793999850750
Epoch 183, CIFAR-10 Batch 1:  Loss: 0.040408574045 Validation Accuracy:  0.806599855423
Epoch 183, CIFAR-10 Batch 2:  Loss: 0.029058031738 Validation Accuracy:  0.801199793816
Epoch 183, CIFAR-10 Batch 3:  Loss: 0.032097544521 Validation Accuracy:  0.802999794483
Epoch 183, CIFAR-10 Batch 4:  Loss: 0.050540540367 Validation Accuracy:  0.785399854183
Epoch 183, CIFAR-10 Batch 5:  Loss: 0.043128039688 Validation Accuracy:  0.796999812126
Epoch 184, CIFAR-10 Batch 1:  Loss: 0.059435110539 Validation Accuracy:  0.800599813461
Epoch 184, CIFAR-10 Batch 2:  Loss: 0.041205845773 Validation Accuracy:  0.796799838543
Epoch 184, CIFAR-10 Batch 3:  Loss: 0.033891409636 Validation Accuracy:  0.806399822235
Epoch 184, CIFAR-10 Batch 4:  Loss: 0.037581104785 Validation Accuracy:  0.803599774837
Epoch 184, CIFAR-10 Batch 5:  Loss: 0.037682004273 Validation Accuracy:  0.797799825668
Epoch 185, CIFAR-10 Batch 1:  Loss: 0.057960189879 Validation Accuracy:  0.793199777603
Epoch 185, CIFAR-10 Batch 2:  Loss: 0.030490901321 Validation Accuracy:  0.797599792480
Epoch 185, CIFAR-10 Batch 3:  Loss: 0.032197397202 Validation Accuracy:  0.793199777603
Epoch 185, CIFAR-10 Batch 4:  Loss: 0.040181949735 Validation Accuracy:  0.801799833775
Epoch 185, CIFAR-10 Batch 5:  Loss: 0.031632095575 Validation Accuracy:  0.804799795151
Epoch 186, CIFAR-10 Batch 1:  Loss: 0.052907615900 Validation Accuracy:  0.804999768734
Epoch 186, CIFAR-10 Batch 2:  Loss: 0.036725088954 Validation Accuracy:  0.794999837875
Epoch 186, CIFAR-10 Batch 3:  Loss: 0.033474963158 Validation Accuracy:  0.806399822235
Epoch 186, CIFAR-10 Batch 4:  Loss: 0.037930704653 Validation Accuracy:  0.796999812126
Epoch 186, CIFAR-10 Batch 5:  Loss: 0.034863810986 Validation Accuracy:  0.798999845982
Epoch 187, CIFAR-10 Batch 1:  Loss: 0.052473802119 Validation Accuracy:  0.804799854755
Epoch 187, CIFAR-10 Batch 2:  Loss: 0.036719504744 Validation Accuracy:  0.797599852085
Epoch 187, CIFAR-10 Batch 3:  Loss: 0.024611769244 Validation Accuracy:  0.798199772835
Epoch 187, CIFAR-10 Batch 4:  Loss: 0.039889499545 Validation Accuracy:  0.799199819565
Epoch 187, CIFAR-10 Batch 5:  Loss: 0.047099351883 Validation Accuracy:  0.799199819565
Epoch 188, CIFAR-10 Batch 1:  Loss: 0.048319172114 Validation Accuracy:  0.802599847317
Epoch 188, CIFAR-10 Batch 2:  Loss: 0.045321717858 Validation Accuracy:  0.799399793148
Epoch 188, CIFAR-10 Batch 3:  Loss: 0.035323008895 Validation Accuracy:  0.800599753857
Epoch 188, CIFAR-10 Batch 4:  Loss: 0.039901103824 Validation Accuracy:  0.799399793148
Epoch 188, CIFAR-10 Batch 5:  Loss: 0.031599014997 Validation Accuracy:  0.799799859524
Epoch 189, CIFAR-10 Batch 1:  Loss: 0.049372881651 Validation Accuracy:  0.803999781609
Epoch 189, CIFAR-10 Batch 2:  Loss: 0.035065580159 Validation Accuracy:  0.805199801922
Epoch 189, CIFAR-10 Batch 3:  Loss: 0.032574761659 Validation Accuracy:  0.794999778271
Epoch 189, CIFAR-10 Batch 4:  Loss: 0.039471670985 Validation Accuracy:  0.803399741650
Epoch 189, CIFAR-10 Batch 5:  Loss: 0.037755247205 Validation Accuracy:  0.789599895477
Epoch 190, CIFAR-10 Batch 1:  Loss: 0.044599778950 Validation Accuracy:  0.804399847984
Epoch 190, CIFAR-10 Batch 2:  Loss: 0.040791470557 Validation Accuracy:  0.795599818230
Epoch 190, CIFAR-10 Batch 3:  Loss: 0.036557305604 Validation Accuracy:  0.794999837875
Epoch 190, CIFAR-10 Batch 4:  Loss: 0.044122904539 Validation Accuracy:  0.803999781609
Epoch 190, CIFAR-10 Batch 5:  Loss: 0.029852746055 Validation Accuracy:  0.798599779606
Epoch 191, CIFAR-10 Batch 1:  Loss: 0.044741377234 Validation Accuracy:  0.797999739647
Epoch 191, CIFAR-10 Batch 2:  Loss: 0.036868810654 Validation Accuracy:  0.796799778938
Epoch 191, CIFAR-10 Batch 3:  Loss: 0.019381793216 Validation Accuracy:  0.801799774170
Epoch 191, CIFAR-10 Batch 4:  Loss: 0.036802377552 Validation Accuracy:  0.800799787045
Epoch 191, CIFAR-10 Batch 5:  Loss: 0.029804114252 Validation Accuracy:  0.795199811459
Epoch 192, CIFAR-10 Batch 1:  Loss: 0.051647771150 Validation Accuracy:  0.803399801254
Epoch 192, CIFAR-10 Batch 2:  Loss: 0.041431881487 Validation Accuracy:  0.799399793148
Epoch 192, CIFAR-10 Batch 3:  Loss: 0.031681630760 Validation Accuracy:  0.798799753189
Epoch 192, CIFAR-10 Batch 4:  Loss: 0.040355537087 Validation Accuracy:  0.803799748421
Epoch 192, CIFAR-10 Batch 5:  Loss: 0.035733968019 Validation Accuracy:  0.798999786377
Epoch 193, CIFAR-10 Batch 1:  Loss: 0.061235174537 Validation Accuracy:  0.790799796581
Epoch 193, CIFAR-10 Batch 2:  Loss: 0.041608382016 Validation Accuracy:  0.795999765396
Epoch 193, CIFAR-10 Batch 3:  Loss: 0.031010497361 Validation Accuracy:  0.804199814796
Epoch 193, CIFAR-10 Batch 4:  Loss: 0.041067998856 Validation Accuracy:  0.791599810123
Epoch 193, CIFAR-10 Batch 5:  Loss: 0.035052686930 Validation Accuracy:  0.801399827003
Epoch 194, CIFAR-10 Batch 1:  Loss: 0.053269013762 Validation Accuracy:  0.799399793148
Epoch 194, CIFAR-10 Batch 2:  Loss: 0.038987100124 Validation Accuracy:  0.802599787712
Epoch 194, CIFAR-10 Batch 3:  Loss: 0.030706392601 Validation Accuracy:  0.804599761963
Epoch 194, CIFAR-10 Batch 4:  Loss: 0.034130088985 Validation Accuracy:  0.806799829006
Epoch 194, CIFAR-10 Batch 5:  Loss: 0.029171936214 Validation Accuracy:  0.809399783611
Epoch 195, CIFAR-10 Batch 1:  Loss: 0.044521693140 Validation Accuracy:  0.800199806690
Epoch 195, CIFAR-10 Batch 2:  Loss: 0.037810996175 Validation Accuracy:  0.796599864960
Epoch 195, CIFAR-10 Batch 3:  Loss: 0.029456306249 Validation Accuracy:  0.800999760628
Epoch 195, CIFAR-10 Batch 4:  Loss: 0.049284167588 Validation Accuracy:  0.798999786377
Epoch 195, CIFAR-10 Batch 5:  Loss: 0.029528496787 Validation Accuracy:  0.796799838543
Epoch 196, CIFAR-10 Batch 1:  Loss: 0.051525279880 Validation Accuracy:  0.787799894810
Epoch 196, CIFAR-10 Batch 2:  Loss: 0.043432742357 Validation Accuracy:  0.796599864960
Epoch 196, CIFAR-10 Batch 3:  Loss: 0.022753134370 Validation Accuracy:  0.804999828339
Epoch 196, CIFAR-10 Batch 4:  Loss: 0.032439455390 Validation Accuracy:  0.799199819565
Epoch 196, CIFAR-10 Batch 5:  Loss: 0.032026220113 Validation Accuracy:  0.804199755192
Epoch 197, CIFAR-10 Batch 1:  Loss: 0.043518777937 Validation Accuracy:  0.804599821568
Epoch 197, CIFAR-10 Batch 2:  Loss: 0.038016244769 Validation Accuracy:  0.796799838543
Epoch 197, CIFAR-10 Batch 3:  Loss: 0.026239195839 Validation Accuracy:  0.797799825668
Epoch 197, CIFAR-10 Batch 4:  Loss: 0.031855482608 Validation Accuracy:  0.803399801254
Epoch 197, CIFAR-10 Batch 5:  Loss: 0.039906308055 Validation Accuracy:  0.793799877167
Epoch 198, CIFAR-10 Batch 1:  Loss: 0.040127124637 Validation Accuracy:  0.805199801922
Epoch 198, CIFAR-10 Batch 2:  Loss: 0.026148188859 Validation Accuracy:  0.803399801254
Epoch 198, CIFAR-10 Batch 3:  Loss: 0.031678937376 Validation Accuracy:  0.802599847317
Epoch 198, CIFAR-10 Batch 4:  Loss: 0.035261917859 Validation Accuracy:  0.806399822235
Epoch 198, CIFAR-10 Batch 5:  Loss: 0.033777333796 Validation Accuracy:  0.803999841213
Epoch 199, CIFAR-10 Batch 1:  Loss: 0.039749197662 Validation Accuracy:  0.805199861526
Epoch 199, CIFAR-10 Batch 2:  Loss: 0.028901858255 Validation Accuracy:  0.802199780941
Epoch 199, CIFAR-10 Batch 3:  Loss: 0.028019156307 Validation Accuracy:  0.803599834442
Epoch 199, CIFAR-10 Batch 4:  Loss: 0.032696623355 Validation Accuracy:  0.802599847317
Epoch 199, CIFAR-10 Batch 5:  Loss: 0.035146750510 Validation Accuracy:  0.798799812794
Epoch 200, CIFAR-10 Batch 1:  Loss: 0.049179513007 Validation Accuracy:  0.806199789047
Epoch 200, CIFAR-10 Batch 2:  Loss: 0.036230109632 Validation Accuracy:  0.799799859524
Epoch 200, CIFAR-10 Batch 3:  Loss: 0.022133305669 Validation Accuracy:  0.808599829674
Epoch 200, CIFAR-10 Batch 4:  Loss: 0.031957477331 Validation Accuracy:  0.810799777508
Epoch 200, CIFAR-10 Batch 5:  Loss: 0.028386402875 Validation Accuracy:  0.801399827003

Checkpoint

The model has been saved to disk.
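
For reference, a checkpoint like this is typically written with TensorFlow's tf.train.Saver. Below is a minimal sketch, assuming sess is the live training session and reusing the same save path that the test cell below restores from; it is an illustration, not the notebook's actual training cell.

import tensorflow as tf

# Sketch only (assumption): write a checkpoint the way the test cell below expects to restore it.
# `sess` is assumed to be the live training session from the training cell.
save_model_path = './image_classification'
saver = tf.train.Saver()
save_path = saver.save(sess, save_model_path)
print('Model saved to {}'.format(save_path))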

Test Model

Test your model against the test dataset. This will be your final accuracy. Your accuracy should be greater than 50%. If it isn't, keep tweaking the model architecture and parameters.


In [1]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
%matplotlib inline
%config InlineBackend.figure_format = 'retina'

import tensorflow as tf
import pickle
import helper
import random

# Set batch size if not already set
try:
    if batch_size:
        pass
except NameError:
    batch_size = 64

save_model_path = './image_classification'
n_samples = 4
top_n_predictions = 3

def test_model():
    """
    Test the saved model against the test dataset
    """

    test_features, test_labels = pickle.load(open('preprocess_test.p', mode='rb'))
    loaded_graph = tf.Graph()

    with tf.Session(graph=loaded_graph) as sess:
        # Load model
        loader = tf.train.import_meta_graph(save_model_path + '.meta')
        loader.restore(sess, save_model_path)

        # Get Tensors from loaded model
        loaded_x = loaded_graph.get_tensor_by_name('x:0')
        loaded_y = loaded_graph.get_tensor_by_name('y:0')
        loaded_keep_prob = loaded_graph.get_tensor_by_name('keep_prob:0')
        loaded_logits = loaded_graph.get_tensor_by_name('logits:0')
        loaded_acc = loaded_graph.get_tensor_by_name('accuracy:0')
        
        # Get accuracy in batches for memory limitations
        test_batch_acc_total = 0
        test_batch_count = 0
        
        for test_feature_batch, test_label_batch in helper.batch_features_labels(test_features, test_labels, batch_size):
            test_batch_acc_total += sess.run(
                loaded_acc,
                feed_dict={loaded_x: test_feature_batch, loaded_y: test_label_batch, loaded_keep_prob: 1.0})
            test_batch_count += 1

        print('Testing Accuracy: {}\n'.format(test_batch_acc_total/test_batch_count))

        # Print Random Samples
        random_test_features, random_test_labels = tuple(zip(*random.sample(list(zip(test_features, test_labels)), n_samples)))
        random_test_predictions = sess.run(
            tf.nn.top_k(tf.nn.softmax(loaded_logits), top_n_predictions),
            feed_dict={loaded_x: random_test_features, loaded_y: random_test_labels, loaded_keep_prob: 1.0})
        helper.display_image_predictions(random_test_features, random_test_labels, random_test_predictions)


test_model()


Testing Accuracy: 0.7881170382165605
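The test loop above walks the test set in fixed-size chunks via helper.batch_features_labels so the whole array never has to be fed to the network at once. The real implementation lives in the project's helper.py; the sketch below only illustrates how such a generator might look (the name and signature match the call above, but this is an assumption, not the actual helper code).

def batch_features_labels(features, labels, batch_size):
    # Illustrative sketch: yield (features, labels) slices of at most batch_size samples.
    for start in range(0, len(features), batch_size):
        end = start + batch_size
        yield features[start:end], labels[start:end]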

Why 50-80% Accuracy?

You might be wondering why you can't push the accuracy any higher. First things first, 50% isn't bad for a simple CNN; pure guessing would only get you 10% accuracy. However, you might notice people are getting scores well above 80%. That's because we haven't taught you all there is to know about neural networks yet. We still need to cover a few more techniques; one common example, data augmentation, is sketched below.
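
As an illustration of the kind of technique not covered here, data augmentation trains the network on randomly perturbed copies of the images so it sees more variety than the raw training samples alone. The sketch below is a hypothetical NumPy example; augment_batch and flip_prob are illustrative names, not part of this notebook's code.

import numpy as np

def augment_batch(features, labels, flip_prob=0.5):
    # Hypothetical sketch: randomly mirror some images left-to-right.
    # `features` is assumed to be shaped (batch, height, width, channels).
    flipped = features.copy()
    mask = np.random.rand(len(features)) < flip_prob
    flipped[mask] = flipped[mask][:, :, ::-1, :]  # reverse the width axis
    return flipped, labels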

Submitting This Project

When submitting this project, make sure to run all the cells before saving the notebook. Save the notebook file as "dlnd_image_classification.ipynb" and save it as an HTML file under "File" -> "Download as". Include the "helper.py" and "problem_unittests.py" files in your submission.

