Image Classification

In this project, you'll classify images from the CIFAR-10 dataset. The dataset consists of airplanes, dogs, cats, and other objects. You'll preprocess the images, then train a convolutional neural network on all the samples. The images need to be normalized and the labels need to be one-hot encoded. You'll apply what you've learned to build convolutional, max pooling, dropout, and fully connected layers. At the end, you'll see your neural network's predictions on sample images.

Get the Data

Run the following cell to download the CIFAR-10 dataset for Python.


In [1]:
"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
from urllib.request import urlretrieve
from os.path import isfile, isdir
from tqdm import tqdm
import problem_unittests as tests
import tarfile

cifar10_dataset_folder_path = 'cifar-10-batches-py'

# Use Floyd's cifar-10 dataset if present
floyd_cifar10_location = '/input/cifar-10/python.tar.gz'
if isfile(floyd_cifar10_location):
    tar_gz_path = floyd_cifar10_location
else:
    tar_gz_path = 'cifar-10-python.tar.gz'

class DLProgress(tqdm):
    last_block = 0

    def hook(self, block_num=1, block_size=1, total_size=None):
        self.total = total_size
        self.update((block_num - self.last_block) * block_size)
        self.last_block = block_num

if not isfile(tar_gz_path):
    with DLProgress(unit='B', unit_scale=True, miniters=1, desc='CIFAR-10 Dataset') as pbar:
        urlretrieve(
            'https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz',
            tar_gz_path,
            pbar.hook)

if not isdir(cifar10_dataset_folder_path):
    with tarfile.open(tar_gz_path) as tar:
        tar.extractall()


tests.test_folder_path(cifar10_dataset_folder_path)


All files found!

Explore the Data

The dataset is broken into batches to prevent your machine from running out of memory. The CIFAR-10 dataset consists of 5 batches, named data_batch_1 through data_batch_5. Each batch contains labels and images belonging to one of the following classes:

  • airplane
  • automobile
  • bird
  • cat
  • deer
  • dog
  • frog
  • horse
  • ship
  • truck

Understanding a dataset is part of making predictions on the data. Play around with the code cell below by changing the batch_id and sample_id. The batch_id is the id for a batch (1-5). The sample_id is the id for an image and label pair in the batch.

Ask yourself "What are all possible labels?", "What is the range of values for the image data?", "Are the labels in order or random?". Answers to questions like these will help you preprocess the data and end up with better predictions.
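To answer these questions yourself, you can load a batch directly rather than going through the helper. Per the CIFAR-10 site, each batch file is a pickled dict whose `data` entry is a 10000x3072 uint8 array, with each row holding the 1024 red, then green, then blue pixel values of a 32x32 image. A minimal sketch (the `load_cifar10_batch` name is ours; the provided helper module presumably does something similar internally):

```python
import pickle
import numpy as np

def rows_to_images(data):
    """Convert (N, 3072) uint8 rows (R, G, B planes) to (N, 32, 32, 3) images."""
    return data.reshape((-1, 3, 32, 32)).transpose(0, 2, 3, 1)

def load_cifar10_batch(folder, batch_id):
    # Each batch file is a pickled dict; under Python 3 the byte-string
    # keys decode cleanly with the latin1 encoding
    with open('{}/data_batch_{}'.format(folder, batch_id), mode='rb') as f:
        batch = pickle.load(f, encoding='latin1')
    return rows_to_images(batch['data']), batch['labels']
```

From here, `features.min()`, `features.max()`, and a label histogram answer the range and ordering questions directly.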


In [2]:
%matplotlib inline
%config InlineBackend.figure_format = 'retina'

import helper
import numpy as np

# Explore the dataset
batch_id = 1
sample_id = 5
helper.display_stats(cifar10_dataset_folder_path, batch_id, sample_id)


Stats of batch 1:
Samples: 10000
Label Counts: {0: 1005, 1: 974, 2: 1032, 3: 1016, 4: 999, 5: 937, 6: 1030, 7: 1001, 8: 1025, 9: 981}
First 20 Labels: [6, 9, 9, 4, 1, 1, 2, 7, 8, 3, 4, 7, 7, 2, 9, 9, 9, 3, 2, 6]

Example of Image 5:
Image - Min Value: 0 Max Value: 252
Image - Shape: (32, 32, 3)
Label - Label Id: 1 Name: automobile

Implement Preprocess Functions

Normalize

In the cell below, implement the normalize function to take in image data, x, and return it as a normalized Numpy array. The values should be in the range of 0 to 1, inclusive. The return object should be the same shape as x.


In [3]:
def normalize(x):
    """
    Normalize a list of sample image data to the range of 0 to 1
    : x: List of image data.  The image shape is (32, 32, 3)
    : return: Numpy array of normalized data
    """
    # Pixel values are 8-bit, so min-max scaling reduces to dividing by 255
    a = 0
    b = 255
    return (x - a) / (b - a)


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_normalize(normalize)


Tests Passed

One-hot encode

Just like the previous code cell, you'll be implementing a function for preprocessing. This time, you'll implement the one_hot_encode function. The input, x, is a list of labels. Implement the function to return the list of labels as a one-hot encoded Numpy array. The possible values for labels are 0 to 9. The one-hot encoding function should return the same encoding for each value between each call to one_hot_encode. Make sure to save the map of encodings outside the function.

Hint: Don't reinvent the wheel.


In [4]:
def one_hot_encode(x):
    """
    One hot encode a list of sample labels. Return a one-hot encoded vector for each label.
    : x: List of sample Labels
    : return: Numpy array of one-hot encoded labels
    """
    # Build an all-zero (batch, 10) array, then set the column for each label
    arr = np.zeros((len(x), 10))
    for i, label in enumerate(x):
        arr[i][label] = 1
    return arr


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_one_hot_encode(one_hot_encode)


Tests Passed
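Taking the hint literally: NumPy's identity matrix doubles as a one-hot lookup table, so the encoding loop can be replaced by a single fancy-indexing operation. A sketch (the `one_hot_encode_vectorized` name is ours), assuming labels are integers in 0-9:

```python
import numpy as np

def one_hot_encode_vectorized(x, n_classes=10):
    # Row i of the identity matrix is exactly the one-hot vector for label i,
    # so indexing with the label list encodes the whole batch at once
    return np.eye(n_classes)[x]
```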

Randomize Data

As you saw from exploring the data above, the order of the samples is randomized. It doesn't hurt to randomize it again, but you don't need to for this dataset.
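If you did want to reshuffle, the key point is to permute features and labels with the same index array so each image stays paired with its label. A minimal sketch (the function name is ours):

```python
import numpy as np

def shuffle_in_unison(features, labels):
    # One permutation applied to both arrays keeps image/label pairs aligned
    idx = np.random.permutation(len(features))
    return features[idx], labels[idx]
```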

Preprocess all the data and save it

Running the code cell below will preprocess all the CIFAR-10 data and save it to file. The code below also uses 10% of the training data for validation.


In [5]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
# Preprocess Training, Validation, and Testing Data
helper.preprocess_and_save_data(cifar10_dataset_folder_path, normalize, one_hot_encode)

Check Point

This is your first checkpoint. If you ever decide to come back to this notebook or have to restart the notebook, you can start from here. The preprocessed data has been saved to disk.


In [6]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import pickle
import problem_unittests as tests
import helper

# Load the Preprocessed Validation data
valid_features, valid_labels = pickle.load(open('preprocess_validation.p', mode='rb'))

Build the network

For the neural network, you'll build each layer into a function. Most of the code you've seen has been outside of functions. To test your code more thoroughly, we require that you put each layer in a function. This allows us to give you better feedback and test for simple mistakes using our unittests before you submit your project.

Note: If you're finding it hard to dedicate enough time for this course each week, we've provided a small shortcut to this part of the project. In the next couple of problems, you'll have the option to use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages to build each layer, except the layers you build in the "Convolutional and Max Pooling Layer" section. TF Layers is similar to Keras's and TFLearn's abstraction of layers, so it's easy to pick up.

However, if you would like to get the most out of this course, try to solve all the problems without using anything from the TF Layers packages. You can still use classes from other packages that happen to have the same name as ones you find in TF Layers! For example, instead of using the TF Layers version of the conv2d class, tf.layers.conv2d, you would want to use the TF Neural Network version of conv2d, tf.nn.conv2d.

Let's begin!

Input

The neural network needs to read the image data, one-hot encoded labels, and dropout keep probability. Implement the following functions:

  • Implement neural_net_image_input
    • Return a TF Placeholder
    • Set the shape using image_shape with batch size set to None.
    • Name the TensorFlow placeholder "x" using the TensorFlow name parameter in the TF Placeholder.
  • Implement neural_net_label_input
    • Return a TF Placeholder
    • Set the shape using n_classes with batch size set to None.
    • Name the TensorFlow placeholder "y" using the TensorFlow name parameter in the TF Placeholder.
  • Implement neural_net_keep_prob_input
    • Return a TF Placeholder for dropout keep probability.
    • Name the TensorFlow placeholder "keep_prob" using the TensorFlow name parameter in the TF Placeholder.

These names will be used at the end of the project to load your saved model.

Note: None for shapes in TensorFlow allow for a dynamic size.


In [7]:
import tensorflow as tf

def neural_net_image_input(image_shape):
    """
    Return a Tensor for a batch of image input
    : image_shape: Shape of the images
    : return: Tensor for image input.
    """
    # Prepend None so the batch dimension stays dynamic
    shape = [None] + list(image_shape)
    x = tf.placeholder(dtype=tf.float32, shape=shape, name='x')
    print(x)
    return x


def neural_net_label_input(n_classes):
    """
    Return a Tensor for a batch of label input
    : n_classes: Number of classes
    : return: Tensor for label input.
    """
    y = tf.placeholder(dtype=tf.float32, shape=[None, n_classes], name='y')
    print(y)
    return y


def neural_net_keep_prob_input():
    """
    Return a Tensor for keep probability
    : return: Tensor for keep probability.
    """
    z = tf.placeholder(dtype=tf.float32, name='keep_prob')
    return z


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tf.reset_default_graph()
tests.test_nn_image_inputs(neural_net_image_input)
tests.test_nn_label_inputs(neural_net_label_input)
tests.test_nn_keep_prob_inputs(neural_net_keep_prob_input)


Tensor("x:0", shape=(?, 32, 32, 3), dtype=float32)
Image Input Tests Passed.
Tensor("y:0", shape=(?, 10), dtype=float32)
Label Input Tests Passed.
Keep Prob Tests Passed.

Convolution and Max Pooling Layer

Convolution layers have a lot of success with images. For this code cell, you should implement the function conv2d_maxpool to apply convolution then max pooling:

  • Create the weight and bias using conv_ksize, conv_num_outputs and the shape of x_tensor.
  • Apply a convolution to x_tensor using weight and conv_strides.
    • We recommend you use same padding, but you're welcome to use any padding.
  • Add bias
  • Add a nonlinear activation to the convolution.
  • Apply Max Pooling using pool_ksize and pool_strides.
    • We recommend you use same padding, but you're welcome to use any padding.

Note: You can't use TensorFlow Layers or TensorFlow Layers (contrib) for this layer, but you can still use TensorFlow's Neural Network package. You may still use the shortcut option for all the other layers.
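One useful fact when sizing this layer: with 'SAME' padding, TensorFlow's spatial output size depends only on the input size and stride, not on the kernel size: out = ceil(in / stride). A quick sanity-check helper (names are ours) for the 32x32 CIFAR-10 input:

```python
from math import ceil

def same_padding_output(size, stride):
    # TensorFlow's 'SAME' padding: output = ceil(input / stride),
    # independent of the kernel size
    return ceil(size / stride)

# 32x32 image -> conv stride 2 -> 16x16 -> pool stride 2 -> 8x8
after_conv = same_padding_output(32, 2)
after_pool = same_padding_output(after_conv, 2)
```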


In [8]:
def conv2d_maxpool(x_tensor, conv_num_outputs, conv_ksize, conv_strides, pool_ksize, pool_strides):
    """
    Apply convolution then max pooling to x_tensor
    :param x_tensor: TensorFlow Tensor
    :param conv_num_outputs: Number of outputs for the convolutional layer
    :param conv_ksize: kernel size 2-D Tuple for the convolutional layer
    :param conv_strides: Stride 2-D Tuple for convolution
    :param pool_ksize: kernel size 2-D Tuple for pool
    :param pool_strides: Stride 2-D Tuple for pool
    : return: A tensor that represents convolution and max pooling of x_tensor
    """
    print(x_tensor)
    print(conv_ksize)

    # Filter shape is (height, width, input_depth, output_depth)
    input_depth = x_tensor.get_shape().as_list()[-1]
    shape = [conv_ksize[0], conv_ksize[1], input_depth, conv_num_outputs]
    print(shape)
    weights = tf.Variable(tf.truncated_normal(shape, 0, 0.1))
    bias = tf.Variable(tf.zeros(conv_num_outputs))

    # Convolution, bias, and nonlinearity
    conv_layer = tf.nn.conv2d(
        x_tensor, weights,
        strides=[1, conv_strides[0], conv_strides[1], 1],
        padding='SAME')
    conv_layer = tf.nn.bias_add(conv_layer, bias)
    conv_layer = tf.nn.relu(conv_layer)

    # Max pooling
    conv_layer = tf.nn.max_pool(
        conv_layer,
        ksize=[1, pool_ksize[0], pool_ksize[1], 1],
        strides=[1, pool_strides[0], pool_strides[1], 1],
        padding='SAME')
    return conv_layer


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_con_pool(conv2d_maxpool)


Tensor("Placeholder:0", shape=(?, 32, 32, 5), dtype=float32)
(2, 2)
[2, 2, 5, 10]
Tests Passed

Flatten Layer

Implement the flatten function to change the dimension of x_tensor from a 4-D tensor to a 2-D tensor. The output should be the shape (Batch Size, Flattened Image Size). Shortcut option: you can use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages for this layer. For more of a challenge, only use other TensorFlow packages.


In [9]:
def flatten(x_tensor):
    """
    Flatten x_tensor to (Batch Size, Flattened Image Size)
    : x_tensor: A tensor of size (Batch Size, ...), where ... are the image dimensions.
    : return: A tensor of size (Batch Size, Flattened Image Size).
    """
    # -1 keeps the batch dimension dynamic; the rest collapses into one axis
    dimensions = x_tensor.get_shape().as_list()
    return tf.reshape(x_tensor, [-1, int(np.prod(dimensions[1:]))])


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_flatten(flatten)


Tests Passed

Fully-Connected Layer

Implement the fully_conn function to apply a fully connected layer to x_tensor with the shape (Batch Size, num_outputs). Shortcut option: you can use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages for this layer. For more of a challenge, only use other TensorFlow packages.


In [10]:
def fully_conn(x_tensor, num_outputs):
    """
    Apply a fully connected layer to x_tensor using weight and bias
    : x_tensor: A 2-D tensor where the first dimension is batch size.
    : num_outputs: The number of outputs that the new tensor should have.
    : return: A 2-D tensor where the second dimension is num_outputs.
    """
    print("Full_conn Variables")
    # Weight shape is (input_size, num_outputs)
    input_size = x_tensor.get_shape().as_list()[-1]
    weight = tf.Variable(tf.truncated_normal([input_size, num_outputs], 0, 0.1))
    bias = tf.Variable(tf.zeros(num_outputs))
    return tf.nn.relu(tf.add(tf.matmul(x_tensor, weight), bias))


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_fully_conn(fully_conn)


Full_conn Variables
Tests Passed

Output Layer

Implement the output function to apply a fully connected layer to x_tensor with the shape (Batch Size, num_outputs). Shortcut option: you can use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages for this layer. For more of a challenge, only use other TensorFlow packages.

Note: Activation, softmax, or cross entropy should not be applied to this.
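The reason to return raw logits is that the loss used later, tf.nn.softmax_cross_entropy_with_logits, applies the softmax internally in a numerically stable way. The standard trick is to subtract the row maximum before exponentiating; a NumPy sketch (names are ours) of what that combined operation computes:

```python
import numpy as np

def stable_softmax_cross_entropy(logits, one_hot_labels):
    # Subtracting the per-row max before exp avoids overflow without
    # changing the softmax result (softmax is shift-invariant)
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -(one_hot_labels * log_probs).sum(axis=1)
```

Applying an extra softmax before this loss would silently squash the logits and hurt training, which is why the note above forbids it.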


In [11]:
def output(x_tensor, num_outputs):
    """
    Apply an output layer to x_tensor using weight and bias
    : x_tensor: A 2-D tensor where the first dimension is batch size.
    : num_outputs: The number of outputs that the new tensor should have.
    : return: A 2-D tensor where the second dimension is num_outputs.
    """
    # Same as fully_conn, but with no activation: raw logits are returned
    input_size = x_tensor.get_shape().as_list()[-1]
    weight = tf.Variable(tf.truncated_normal([input_size, num_outputs], 0, 0.01))
    bias = tf.Variable(tf.zeros(num_outputs))
    return tf.add(tf.matmul(x_tensor, weight), bias)


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_output(output)


Tests Passed

Create Convolutional Model

Implement the function conv_net to create a convolutional neural network model. The function takes in a batch of images, x, and outputs logits. Use the layers you created above to create this model:

  • Apply 1, 2, or 3 Convolution and Max Pool layers
  • Apply a Flatten Layer
  • Apply 1, 2, or 3 Fully Connected Layers
  • Apply an Output Layer
  • Return the output
  • Apply TensorFlow's Dropout to one or more layers in the model using keep_prob.

In [12]:
def conv_net(x, keep_prob):
    """
    Create a convolutional neural network model
    : x: Placeholder tensor that holds image data.
    : keep_prob: Placeholder tensor that holds dropout keep probability.
    : return: Tensor that represents logits
    """
    # Convolution and Max Pool layer: 18 filters, 4x4 kernel, 2x2 conv strides,
    # 8x8 pool kernel, 2x2 pool strides, followed by dropout
    conv_depth = 18
    conNet = conv2d_maxpool(x, conv_num_outputs=conv_depth, conv_ksize=(4, 4), conv_strides=(2, 2), pool_ksize=(8, 8), pool_strides=(2, 2))
    conNet = tf.nn.dropout(conNet, keep_prob)

    # Flatten to (batch, features)
    conNet = flatten(conNet)

    # Fully connected layer with dropout
    conNet = fully_conn(conNet, 384)
    conNet = tf.nn.dropout(conNet, keep_prob)

    # Output layer: one logit per class
    conNet = output(conNet, 10)
    return conNet


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""

##############################
## Build the Neural Network ##
##############################

# Remove previous weights, bias, inputs, etc..
tf.reset_default_graph()

# Inputs
x = neural_net_image_input((32, 32, 3))
y = neural_net_label_input(10)
keep_prob = neural_net_keep_prob_input()

# Model
logits = conv_net(x, keep_prob)

# Name logits Tensor, so that it can be loaded from disk after training
logits = tf.identity(logits, name='logits')

# Loss and Optimizer
cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=y))
optimizer = tf.train.AdamOptimizer().minimize(cost)

# Accuracy
correct_pred = tf.equal(tf.argmax(logits, 1), tf.argmax(y, 1))
accuracy = tf.reduce_mean(tf.cast(correct_pred, tf.float32), name='accuracy')

#print(accuracy)

tests.test_conv_net(conv_net)


Tensor("x:0", shape=(?, 32, 32, 3), dtype=float32)
Tensor("y:0", shape=(?, 10), dtype=float32)
Tensor("x:0", shape=(?, 32, 32, 3), dtype=float32)
(4, 4)
[4, 4, 3, 18]
Full_conn Variables
Tensor("Placeholder:0", shape=(?, 32, 32, 3), dtype=float32)
(4, 4)
[4, 4, 3, 18]
Full_conn Variables
Neural Network Built!

Train the Neural Network

Single Optimization

Implement the function train_neural_network to do a single optimization. The optimization should use optimizer to optimize in session with a feed_dict of the following:

  • x for image input
  • y for labels
  • keep_prob for keep probability for dropout

This function will be called for each batch, so tf.global_variables_initializer() has already been called.

Note: Nothing needs to be returned. This function is only optimizing the neural network.


In [13]:
def train_neural_network(session, optimizer, keep_probability, feature_batch, label_batch):
    """
    Optimize the session on a batch of images and labels
    : session: Current TensorFlow session
    : optimizer: TensorFlow optimizer function
    : keep_probability: keep probability
    : feature_batch: Batch of Numpy image data
    : label_batch: Batch of Numpy label data
    """
    # A single optimization step; keep_prob enables dropout during training
    session.run(optimizer, feed_dict={x: feature_batch, y: label_batch, keep_prob: keep_probability})


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_train_nn(train_neural_network)


Tests Passed

Show Stats

Implement the function print_stats to print loss and validation accuracy. Use the global variables valid_features and valid_labels to calculate validation accuracy. Use a keep probability of 1.0 to calculate the loss and validation accuracy.


In [14]:
def print_stats(session, feature_batch, label_batch, cost, accuracy):
    """
    Print information about loss and validation accuracy
    : session: Current TensorFlow session
    : feature_batch: Batch of Numpy image data
    : label_batch: Batch of Numpy label data
    : cost: TensorFlow cost function
    : accuracy: TensorFlow accuracy function
    """
    # Loss on the training batch, accuracy on the held-out validation set;
    # keep_prob is 1.0 so dropout is disabled during evaluation
    loss = session.run(cost, feed_dict={x: feature_batch, y: label_batch, keep_prob: 1.0})
    valid_acc = session.run(accuracy, feed_dict={
                x: valid_features,
                y: valid_labels,
                keep_prob: 1.0})
    print('Loss: {:>10.4f} Validation Accuracy: {:.6f}'.format(
                loss,
                valid_acc))

Hyperparameters

Tune the following parameters:

  • Set epochs to the number of iterations until the network stops learning or starts overfitting
  • Set batch_size to the highest number that your machine has memory for. Most people use common powers of two:
    • 64
    • 128
    • 256
    • ...
  • Set keep_probability to the probability of keeping a node using dropout
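One way to pick epochs is to record validation accuracy each epoch and stop once it hasn't improved for a while ("patience"). A minimal early-stopping sketch over such a history (the function name and patience value are ours):

```python
def epochs_before_plateau(val_accuracies, patience=5):
    # Return the 1-based epoch at which training would stop: the first
    # epoch that is `patience` epochs past the best accuracy so far
    best, best_epoch = float('-inf'), 0
    for epoch, acc in enumerate(val_accuracies, start=1):
        if acc > best:
            best, best_epoch = acc, epoch
        elif epoch - best_epoch >= patience:
            return epoch
    return len(val_accuracies)
```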

In [15]:
# TODO: Tune Parameters
epochs = 128
batch_size = 256
keep_probability = 0.5

Train on a Single CIFAR-10 Batch

Instead of training the neural network on all the CIFAR-10 batches of data, let's use a single batch. This should save time while you iterate on the model to get better accuracy. Once the final validation accuracy is 50% or greater, run the model on all the data in the next section.


In [16]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
print('Checking the Training on a Single Batch...')
with tf.Session() as sess:
    # Initializing the variables
    sess.run(tf.global_variables_initializer())
    
    # Training cycle
    for epoch in range(epochs):
        batch_i = 1
        for batch_features, batch_labels in helper.load_preprocess_training_batch(batch_i, batch_size):
            train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)
        print('Epoch {:>2}, CIFAR-10 Batch {}:  '.format(epoch + 1, batch_i), end='')
        print_stats(sess, batch_features, batch_labels, cost, accuracy)


Checking the Training on a Single Batch...
Epoch  1, CIFAR-10 Batch 1:  Loss:     2.1080 Validation Accuracy: 0.252800
Epoch  2, CIFAR-10 Batch 1:  Loss:     1.9354 Validation Accuracy: 0.348400
Epoch  3, CIFAR-10 Batch 1:  Loss:     1.8102 Validation Accuracy: 0.405800
Epoch  4, CIFAR-10 Batch 1:  Loss:     1.6975 Validation Accuracy: 0.425800
Epoch  5, CIFAR-10 Batch 1:  Loss:     1.5902 Validation Accuracy: 0.439200
Epoch  6, CIFAR-10 Batch 1:  Loss:     1.5137 Validation Accuracy: 0.458200
Epoch  7, CIFAR-10 Batch 1:  Loss:     1.4278 Validation Accuracy: 0.469200
Epoch  8, CIFAR-10 Batch 1:  Loss:     1.3748 Validation Accuracy: 0.472600
Epoch  9, CIFAR-10 Batch 1:  Loss:     1.3333 Validation Accuracy: 0.471600
Epoch 10, CIFAR-10 Batch 1:  Loss:     1.2662 Validation Accuracy: 0.484200
Epoch 11, CIFAR-10 Batch 1:  Loss:     1.2060 Validation Accuracy: 0.487200
Epoch 12, CIFAR-10 Batch 1:  Loss:     1.1792 Validation Accuracy: 0.488200
Epoch 13, CIFAR-10 Batch 1:  Loss:     1.1296 Validation Accuracy: 0.499000
Epoch 14, CIFAR-10 Batch 1:  Loss:     1.0964 Validation Accuracy: 0.498800
Epoch 15, CIFAR-10 Batch 1:  Loss:     1.0889 Validation Accuracy: 0.500600
Epoch 16, CIFAR-10 Batch 1:  Loss:     1.0572 Validation Accuracy: 0.510000
Epoch 17, CIFAR-10 Batch 1:  Loss:     1.0158 Validation Accuracy: 0.513400
Epoch 18, CIFAR-10 Batch 1:  Loss:     0.9964 Validation Accuracy: 0.516800
Epoch 19, CIFAR-10 Batch 1:  Loss:     1.0025 Validation Accuracy: 0.516000
Epoch 20, CIFAR-10 Batch 1:  Loss:     0.9440 Validation Accuracy: 0.521200
Epoch 21, CIFAR-10 Batch 1:  Loss:     0.9151 Validation Accuracy: 0.525400
Epoch 22, CIFAR-10 Batch 1:  Loss:     0.9301 Validation Accuracy: 0.533600
Epoch 23, CIFAR-10 Batch 1:  Loss:     0.9154 Validation Accuracy: 0.535600
Epoch 24, CIFAR-10 Batch 1:  Loss:     0.8779 Validation Accuracy: 0.537800
Epoch 25, CIFAR-10 Batch 1:  Loss:     0.8808 Validation Accuracy: 0.538600
Epoch 26, CIFAR-10 Batch 1:  Loss:     0.8577 Validation Accuracy: 0.546400
Epoch 27, CIFAR-10 Batch 1:  Loss:     0.8471 Validation Accuracy: 0.545400
Epoch 28, CIFAR-10 Batch 1:  Loss:     0.8409 Validation Accuracy: 0.541400
Epoch 29, CIFAR-10 Batch 1:  Loss:     0.8113 Validation Accuracy: 0.547400
Epoch 30, CIFAR-10 Batch 1:  Loss:     0.7835 Validation Accuracy: 0.551000
Epoch 31, CIFAR-10 Batch 1:  Loss:     0.7936 Validation Accuracy: 0.547800
Epoch 32, CIFAR-10 Batch 1:  Loss:     0.7527 Validation Accuracy: 0.558400
Epoch 33, CIFAR-10 Batch 1:  Loss:     0.7161 Validation Accuracy: 0.554400
Epoch 34, CIFAR-10 Batch 1:  Loss:     0.7443 Validation Accuracy: 0.567600
Epoch 35, CIFAR-10 Batch 1:  Loss:     0.7053 Validation Accuracy: 0.571000
Epoch 36, CIFAR-10 Batch 1:  Loss:     0.7023 Validation Accuracy: 0.566800
Epoch 37, CIFAR-10 Batch 1:  Loss:     0.6783 Validation Accuracy: 0.567000
Epoch 38, CIFAR-10 Batch 1:  Loss:     0.6515 Validation Accuracy: 0.564200
Epoch 39, CIFAR-10 Batch 1:  Loss:     0.6593 Validation Accuracy: 0.567800
Epoch 40, CIFAR-10 Batch 1:  Loss:     0.6617 Validation Accuracy: 0.567200
Epoch 41, CIFAR-10 Batch 1:  Loss:     0.6289 Validation Accuracy: 0.570600
Epoch 42, CIFAR-10 Batch 1:  Loss:     0.6274 Validation Accuracy: 0.572800
Epoch 43, CIFAR-10 Batch 1:  Loss:     0.5931 Validation Accuracy: 0.571400
Epoch 44, CIFAR-10 Batch 1:  Loss:     0.5893 Validation Accuracy: 0.573400
Epoch 45, CIFAR-10 Batch 1:  Loss:     0.5721 Validation Accuracy: 0.576200
Epoch 46, CIFAR-10 Batch 1:  Loss:     0.6080 Validation Accuracy: 0.575200
Epoch 47, CIFAR-10 Batch 1:  Loss:     0.5667 Validation Accuracy: 0.576800
Epoch 48, CIFAR-10 Batch 1:  Loss:     0.5660 Validation Accuracy: 0.577600
Epoch 49, CIFAR-10 Batch 1:  Loss:     0.5394 Validation Accuracy: 0.579400
Epoch 50, CIFAR-10 Batch 1:  Loss:     0.5290 Validation Accuracy: 0.576400
Epoch 51, CIFAR-10 Batch 1:  Loss:     0.5050 Validation Accuracy: 0.582200
Epoch 52, CIFAR-10 Batch 1:  Loss:     0.5079 Validation Accuracy: 0.576800
Epoch 53, CIFAR-10 Batch 1:  Loss:     0.4803 Validation Accuracy: 0.584200
Epoch 54, CIFAR-10 Batch 1:  Loss:     0.4857 Validation Accuracy: 0.573400
Epoch 55, CIFAR-10 Batch 1:  Loss:     0.4724 Validation Accuracy: 0.578800
Epoch 56, CIFAR-10 Batch 1:  Loss:     0.4729 Validation Accuracy: 0.573600
Epoch 57, CIFAR-10 Batch 1:  Loss:     0.4641 Validation Accuracy: 0.582400
Epoch 58, CIFAR-10 Batch 1:  Loss:     0.4455 Validation Accuracy: 0.586200
Epoch 59, CIFAR-10 Batch 1:  Loss:     0.4484 Validation Accuracy: 0.586600
Epoch 60, CIFAR-10 Batch 1:  Loss:     0.4274 Validation Accuracy: 0.582200
Epoch 61, CIFAR-10 Batch 1:  Loss:     0.4271 Validation Accuracy: 0.578000
Epoch 62, CIFAR-10 Batch 1:  Loss:     0.3964 Validation Accuracy: 0.581600
Epoch 63, CIFAR-10 Batch 1:  Loss:     0.4112 Validation Accuracy: 0.580400
Epoch 64, CIFAR-10 Batch 1:  Loss:     0.4071 Validation Accuracy: 0.579600
Epoch 65, CIFAR-10 Batch 1:  Loss:     0.3833 Validation Accuracy: 0.585800
Epoch 66, CIFAR-10 Batch 1:  Loss:     0.3769 Validation Accuracy: 0.587000
Epoch 67, CIFAR-10 Batch 1:  Loss:     0.3673 Validation Accuracy: 0.580600
Epoch 68, CIFAR-10 Batch 1:  Loss:     0.3576 Validation Accuracy: 0.589600
Epoch 69, CIFAR-10 Batch 1:  Loss:     0.3787 Validation Accuracy: 0.584000
Epoch 70, CIFAR-10 Batch 1:  Loss:     0.3591 Validation Accuracy: 0.581200
Epoch 71, CIFAR-10 Batch 1:  Loss:     0.3268 Validation Accuracy: 0.584400
Epoch 72, CIFAR-10 Batch 1:  Loss:     0.3395 Validation Accuracy: 0.581200
Epoch 73, CIFAR-10 Batch 1:  Loss:     0.3324 Validation Accuracy: 0.586200
Epoch 74, CIFAR-10 Batch 1:  Loss:     0.3249 Validation Accuracy: 0.579200
Epoch 75, CIFAR-10 Batch 1:  Loss:     0.3050 Validation Accuracy: 0.592200
Epoch 76, CIFAR-10 Batch 1:  Loss:     0.2871 Validation Accuracy: 0.591400
Epoch 77, CIFAR-10 Batch 1:  Loss:     0.2845 Validation Accuracy: 0.584000
Epoch 78, CIFAR-10 Batch 1:  Loss:     0.2605 Validation Accuracy: 0.594600
Epoch 79, CIFAR-10 Batch 1:  Loss:     0.2700 Validation Accuracy: 0.589400
Epoch 80, CIFAR-10 Batch 1:  Loss:     0.2610 Validation Accuracy: 0.592800
Epoch 81, CIFAR-10 Batch 1:  Loss:     0.2574 Validation Accuracy: 0.590200
Epoch 82, CIFAR-10 Batch 1:  Loss:     0.2598 Validation Accuracy: 0.584800
Epoch 83, CIFAR-10 Batch 1:  Loss:     0.2557 Validation Accuracy: 0.587600
Epoch 84, CIFAR-10 Batch 1:  Loss:     0.2460 Validation Accuracy: 0.590400
Epoch 85, CIFAR-10 Batch 1:  Loss:     0.2344 Validation Accuracy: 0.588800
Epoch 86, CIFAR-10 Batch 1:  Loss:     0.2235 Validation Accuracy: 0.591600
Epoch 87, CIFAR-10 Batch 1:  Loss:     0.2293 Validation Accuracy: 0.587400
Epoch 88, CIFAR-10 Batch 1:  Loss:     0.2312 Validation Accuracy: 0.585200
Epoch 89, CIFAR-10 Batch 1:  Loss:     0.2085 Validation Accuracy: 0.590400
Epoch 90, CIFAR-10 Batch 1:  Loss:     0.2022 Validation Accuracy: 0.593800
Epoch 91, CIFAR-10 Batch 1:  Loss:     0.1923 Validation Accuracy: 0.589800
Epoch 92, CIFAR-10 Batch 1:  Loss:     0.1945 Validation Accuracy: 0.587400
Epoch 93, CIFAR-10 Batch 1:  Loss:     0.1932 Validation Accuracy: 0.587800
Epoch 94, CIFAR-10 Batch 1:  Loss:     0.1888 Validation Accuracy: 0.592000
Epoch 95, CIFAR-10 Batch 1:  Loss:     0.1839 Validation Accuracy: 0.589000
Epoch 96, CIFAR-10 Batch 1:  Loss:     0.1738 Validation Accuracy: 0.596400
Epoch 97, CIFAR-10 Batch 1:  Loss:     0.1725 Validation Accuracy: 0.585800
Epoch 98, CIFAR-10 Batch 1:  Loss:     0.1776 Validation Accuracy: 0.583600
Epoch 99, CIFAR-10 Batch 1:  Loss:     0.1600 Validation Accuracy: 0.592600
Epoch 100, CIFAR-10 Batch 1:  Loss:     0.1687 Validation Accuracy: 0.590200
Epoch 101, CIFAR-10 Batch 1:  Loss:     0.1535 Validation Accuracy: 0.593000
Epoch 102, CIFAR-10 Batch 1:  Loss:     0.1538 Validation Accuracy: 0.593000
Epoch 103, CIFAR-10 Batch 1:  Loss:     0.1581 Validation Accuracy: 0.578600
Epoch 104, CIFAR-10 Batch 1:  Loss:     0.1728 Validation Accuracy: 0.575800
Epoch 105, CIFAR-10 Batch 1:  Loss:     0.1479 Validation Accuracy: 0.589800
Epoch 106, CIFAR-10 Batch 1:  Loss:     0.1433 Validation Accuracy: 0.580600
Epoch 107, CIFAR-10 Batch 1:  Loss:     0.1559 Validation Accuracy: 0.590200
Epoch 108, CIFAR-10 Batch 1:  Loss:     0.1395 Validation Accuracy: 0.598000
Epoch 109, CIFAR-10 Batch 1:  Loss:     0.1268 Validation Accuracy: 0.589400
Epoch 110, CIFAR-10 Batch 1:  Loss:     0.1355 Validation Accuracy: 0.588800
Epoch 111, CIFAR-10 Batch 1:  Loss:     0.1291 Validation Accuracy: 0.584200
Epoch 112, CIFAR-10 Batch 1:  Loss:     0.1380 Validation Accuracy: 0.589800
Epoch 113, CIFAR-10 Batch 1:  Loss:     0.1388 Validation Accuracy: 0.596600
Epoch 114, CIFAR-10 Batch 1:  Loss:     0.1135 Validation Accuracy: 0.595000
Epoch 115, CIFAR-10 Batch 1:  Loss:     0.1119 Validation Accuracy: 0.591800
Epoch 116, CIFAR-10 Batch 1:  Loss:     0.1229 Validation Accuracy: 0.588400
Epoch 117, CIFAR-10 Batch 1:  Loss:     0.1213 Validation Accuracy: 0.582600
Epoch 118, CIFAR-10 Batch 1:  Loss:     0.1096 Validation Accuracy: 0.589200
Epoch 119, CIFAR-10 Batch 1:  Loss:     0.1161 Validation Accuracy: 0.587400
Epoch 120, CIFAR-10 Batch 1:  Loss:     0.1156 Validation Accuracy: 0.595600
Epoch 121, CIFAR-10 Batch 1:  Loss:     0.1098 Validation Accuracy: 0.594000
Epoch 122, CIFAR-10 Batch 1:  Loss:     0.1019 Validation Accuracy: 0.588000
Epoch 123, CIFAR-10 Batch 1:  Loss:     0.0920 Validation Accuracy: 0.586200
Epoch 124, CIFAR-10 Batch 1:  Loss:     0.1082 Validation Accuracy: 0.583200
Epoch 125, CIFAR-10 Batch 1:  Loss:     0.0920 Validation Accuracy: 0.600200
Epoch 126, CIFAR-10 Batch 1:  Loss:     0.0948 Validation Accuracy: 0.592600
Epoch 127, CIFAR-10 Batch 1:  Loss:     0.0909 Validation Accuracy: 0.588600
Epoch 128, CIFAR-10 Batch 1:  Loss:     0.0805 Validation Accuracy: 0.593600

Fully Train the Model

Now that you've gotten good accuracy with a single CIFAR-10 batch, try it with all five batches.


In [17]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
save_model_path = './image_classification'

print('Training...')
with tf.Session() as sess:
    # Initializing the variables
    sess.run(tf.global_variables_initializer())
    
    # Training cycle
    for epoch in range(epochs):
        # Loop over all batches
        n_batches = 5
        for batch_i in range(1, n_batches + 1):
            for batch_features, batch_labels in helper.load_preprocess_training_batch(batch_i, batch_size):
                train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)
            print('Epoch {:>2}, CIFAR-10 Batch {}:  '.format(epoch + 1, batch_i), end='')
            print_stats(sess, batch_features, batch_labels, cost, accuracy)
            
    # Save Model
    saver = tf.train.Saver()
    save_path = saver.save(sess, save_model_path)


Training...
Epoch  1, CIFAR-10 Batch 1:  Loss:     2.1251 Validation Accuracy: 0.260000
Epoch  1, CIFAR-10 Batch 2:  Loss:     1.8865 Validation Accuracy: 0.325800
Epoch  1, CIFAR-10 Batch 3:  Loss:     1.6439 Validation Accuracy: 0.365400
Epoch  1, CIFAR-10 Batch 4:  Loss:     1.6670 Validation Accuracy: 0.411600
Epoch  1, CIFAR-10 Batch 5:  Loss:     1.6809 Validation Accuracy: 0.432600
Epoch  2, CIFAR-10 Batch 1:  Loss:     1.6583 Validation Accuracy: 0.443000
Epoch  2, CIFAR-10 Batch 2:  Loss:     1.5293 Validation Accuracy: 0.441400
Epoch  2, CIFAR-10 Batch 3:  Loss:     1.3920 Validation Accuracy: 0.454400
Epoch  2, CIFAR-10 Batch 4:  Loss:     1.6262 Validation Accuracy: 0.473600
Epoch  2, CIFAR-10 Batch 5:  Loss:     1.5336 Validation Accuracy: 0.482600
Epoch  3, CIFAR-10 Batch 1:  Loss:     1.4758 Validation Accuracy: 0.484800
Epoch  3, CIFAR-10 Batch 2:  Loss:     1.4105 Validation Accuracy: 0.478600
Epoch  3, CIFAR-10 Batch 3:  Loss:     1.2605 Validation Accuracy: 0.492400
Epoch  3, CIFAR-10 Batch 4:  Loss:     1.5656 Validation Accuracy: 0.503200
Epoch  3, CIFAR-10 Batch 5:  Loss:     1.4440 Validation Accuracy: 0.508600
Epoch  4, CIFAR-10 Batch 1:  Loss:     1.3597 Validation Accuracy: 0.504000
Epoch  4, CIFAR-10 Batch 2:  Loss:     1.3687 Validation Accuracy: 0.503600
Epoch  4, CIFAR-10 Batch 3:  Loss:     1.1687 Validation Accuracy: 0.509200
Epoch  4, CIFAR-10 Batch 4:  Loss:     1.5099 Validation Accuracy: 0.524000
Epoch  4, CIFAR-10 Batch 5:  Loss:     1.3790 Validation Accuracy: 0.518200
Epoch  5, CIFAR-10 Batch 1:  Loss:     1.3447 Validation Accuracy: 0.525600
Epoch  5, CIFAR-10 Batch 2:  Loss:     1.2770 Validation Accuracy: 0.516600
Epoch  5, CIFAR-10 Batch 3:  Loss:     1.1021 Validation Accuracy: 0.528000
Epoch  5, CIFAR-10 Batch 4:  Loss:     1.4187 Validation Accuracy: 0.543800
Epoch  5, CIFAR-10 Batch 5:  Loss:     1.2777 Validation Accuracy: 0.538200
Epoch  6, CIFAR-10 Batch 1:  Loss:     1.2721 Validation Accuracy: 0.541200
Epoch  6, CIFAR-10 Batch 2:  Loss:     1.2428 Validation Accuracy: 0.542400
Epoch  6, CIFAR-10 Batch 3:  Loss:     1.0593 Validation Accuracy: 0.539600
Epoch  6, CIFAR-10 Batch 4:  Loss:     1.3648 Validation Accuracy: 0.556000
Epoch  6, CIFAR-10 Batch 5:  Loss:     1.2445 Validation Accuracy: 0.543800
Epoch  7, CIFAR-10 Batch 1:  Loss:     1.2238 Validation Accuracy: 0.552600
Epoch  7, CIFAR-10 Batch 2:  Loss:     1.2357 Validation Accuracy: 0.547000
Epoch  7, CIFAR-10 Batch 3:  Loss:     1.0269 Validation Accuracy: 0.546600
Epoch  7, CIFAR-10 Batch 4:  Loss:     1.3180 Validation Accuracy: 0.557400
Epoch  7, CIFAR-10 Batch 5:  Loss:     1.1989 Validation Accuracy: 0.556400
Epoch  8, CIFAR-10 Batch 1:  Loss:     1.2079 Validation Accuracy: 0.555000
Epoch  8, CIFAR-10 Batch 2:  Loss:     1.2210 Validation Accuracy: 0.566600
Epoch  8, CIFAR-10 Batch 3:  Loss:     0.9959 Validation Accuracy: 0.559800
Epoch  8, CIFAR-10 Batch 4:  Loss:     1.2717 Validation Accuracy: 0.561400
Epoch  8, CIFAR-10 Batch 5:  Loss:     1.1600 Validation Accuracy: 0.559000
Epoch  9, CIFAR-10 Batch 1:  Loss:     1.1618 Validation Accuracy: 0.564200
Epoch  9, CIFAR-10 Batch 2:  Loss:     1.1579 Validation Accuracy: 0.562000
Epoch  9, CIFAR-10 Batch 3:  Loss:     0.9575 Validation Accuracy: 0.568600
Epoch  9, CIFAR-10 Batch 4:  Loss:     1.2266 Validation Accuracy: 0.573600
Epoch  9, CIFAR-10 Batch 5:  Loss:     1.1235 Validation Accuracy: 0.569800
Epoch 10, CIFAR-10 Batch 1:  Loss:     1.1471 Validation Accuracy: 0.562800
Epoch 10, CIFAR-10 Batch 2:  Loss:     1.1587 Validation Accuracy: 0.579400
Epoch 10, CIFAR-10 Batch 3:  Loss:     0.9252 Validation Accuracy: 0.565800
Epoch 10, CIFAR-10 Batch 4:  Loss:     1.2012 Validation Accuracy: 0.578600
Epoch 10, CIFAR-10 Batch 5:  Loss:     1.0763 Validation Accuracy: 0.571000
Epoch 11, CIFAR-10 Batch 1:  Loss:     1.0883 Validation Accuracy: 0.573200
Epoch 11, CIFAR-10 Batch 2:  Loss:     1.1056 Validation Accuracy: 0.574800
Epoch 11, CIFAR-10 Batch 3:  Loss:     0.9225 Validation Accuracy: 0.573800
Epoch 11, CIFAR-10 Batch 4:  Loss:     1.1471 Validation Accuracy: 0.579800
Epoch 11, CIFAR-10 Batch 5:  Loss:     1.0266 Validation Accuracy: 0.577200
Epoch 12, CIFAR-10 Batch 1:  Loss:     1.0329 Validation Accuracy: 0.580000
Epoch 12, CIFAR-10 Batch 2:  Loss:     1.0526 Validation Accuracy: 0.581400
Epoch 12, CIFAR-10 Batch 3:  Loss:     0.9061 Validation Accuracy: 0.578600
Epoch 12, CIFAR-10 Batch 4:  Loss:     1.1015 Validation Accuracy: 0.587600
Epoch 12, CIFAR-10 Batch 5:  Loss:     1.0110 Validation Accuracy: 0.581000
Epoch 13, CIFAR-10 Batch 1:  Loss:     1.0412 Validation Accuracy: 0.583000
Epoch 13, CIFAR-10 Batch 2:  Loss:     1.0547 Validation Accuracy: 0.583400
Epoch 13, CIFAR-10 Batch 3:  Loss:     0.8655 Validation Accuracy: 0.589800
Epoch 13, CIFAR-10 Batch 4:  Loss:     1.0458 Validation Accuracy: 0.585600
Epoch 13, CIFAR-10 Batch 5:  Loss:     0.9892 Validation Accuracy: 0.589200
Epoch 14, CIFAR-10 Batch 1:  Loss:     1.0288 Validation Accuracy: 0.586000
Epoch 14, CIFAR-10 Batch 2:  Loss:     1.0028 Validation Accuracy: 0.588400
Epoch 14, CIFAR-10 Batch 3:  Loss:     0.8726 Validation Accuracy: 0.587200
Epoch 14, CIFAR-10 Batch 4:  Loss:     1.0268 Validation Accuracy: 0.592800
Epoch 14, CIFAR-10 Batch 5:  Loss:     0.9810 Validation Accuracy: 0.592800
Epoch 15, CIFAR-10 Batch 1:  Loss:     0.9887 Validation Accuracy: 0.591600
Epoch 15, CIFAR-10 Batch 2:  Loss:     0.9618 Validation Accuracy: 0.588400
Epoch 15, CIFAR-10 Batch 3:  Loss:     0.8660 Validation Accuracy: 0.582400
Epoch 15, CIFAR-10 Batch 4:  Loss:     0.9819 Validation Accuracy: 0.596400
Epoch 15, CIFAR-10 Batch 5:  Loss:     0.9435 Validation Accuracy: 0.590600
Epoch 16, CIFAR-10 Batch 1:  Loss:     0.9863 Validation Accuracy: 0.602000
Epoch 16, CIFAR-10 Batch 2:  Loss:     0.9143 Validation Accuracy: 0.596400
Epoch 16, CIFAR-10 Batch 3:  Loss:     0.8218 Validation Accuracy: 0.596000
Epoch 16, CIFAR-10 Batch 4:  Loss:     0.9473 Validation Accuracy: 0.601800
Epoch 16, CIFAR-10 Batch 5:  Loss:     0.9074 Validation Accuracy: 0.597200
Epoch 17, CIFAR-10 Batch 1:  Loss:     0.9479 Validation Accuracy: 0.600600
Epoch 17, CIFAR-10 Batch 2:  Loss:     0.9233 Validation Accuracy: 0.600200
Epoch 17, CIFAR-10 Batch 3:  Loss:     0.8401 Validation Accuracy: 0.589400
Epoch 17, CIFAR-10 Batch 4:  Loss:     0.9121 Validation Accuracy: 0.608400
Epoch 17, CIFAR-10 Batch 5:  Loss:     0.8795 Validation Accuracy: 0.604000
Epoch 18, CIFAR-10 Batch 1:  Loss:     0.9287 Validation Accuracy: 0.610600
Epoch 18, CIFAR-10 Batch 2:  Loss:     0.8647 Validation Accuracy: 0.601200
Epoch 18, CIFAR-10 Batch 3:  Loss:     0.7938 Validation Accuracy: 0.611200
Epoch 18, CIFAR-10 Batch 4:  Loss:     0.8840 Validation Accuracy: 0.606800
Epoch 18, CIFAR-10 Batch 5:  Loss:     0.8744 Validation Accuracy: 0.598600
Epoch 19, CIFAR-10 Batch 1:  Loss:     0.9426 Validation Accuracy: 0.611800
Epoch 19, CIFAR-10 Batch 2:  Loss:     0.8378 Validation Accuracy: 0.607000
Epoch 19, CIFAR-10 Batch 3:  Loss:     0.7886 Validation Accuracy: 0.600000
Epoch 19, CIFAR-10 Batch 4:  Loss:     0.8730 Validation Accuracy: 0.610600
Epoch 19, CIFAR-10 Batch 5:  Loss:     0.8519 Validation Accuracy: 0.600600
Epoch 20, CIFAR-10 Batch 1:  Loss:     0.9100 Validation Accuracy: 0.608800
Epoch 20, CIFAR-10 Batch 2:  Loss:     0.8435 Validation Accuracy: 0.606600
Epoch 20, CIFAR-10 Batch 3:  Loss:     0.7775 Validation Accuracy: 0.610400
Epoch 20, CIFAR-10 Batch 4:  Loss:     0.8455 Validation Accuracy: 0.610400
Epoch 20, CIFAR-10 Batch 5:  Loss:     0.8282 Validation Accuracy: 0.607800
Epoch 21, CIFAR-10 Batch 1:  Loss:     0.8896 Validation Accuracy: 0.612200
Epoch 21, CIFAR-10 Batch 2:  Loss:     0.8419 Validation Accuracy: 0.607200
Epoch 21, CIFAR-10 Batch 3:  Loss:     0.7171 Validation Accuracy: 0.611600
Epoch 21, CIFAR-10 Batch 4:  Loss:     0.8090 Validation Accuracy: 0.616800
Epoch 21, CIFAR-10 Batch 5:  Loss:     0.8100 Validation Accuracy: 0.619600
Epoch 22, CIFAR-10 Batch 1:  Loss:     0.9141 Validation Accuracy: 0.611000
Epoch 22, CIFAR-10 Batch 2:  Loss:     0.8174 Validation Accuracy: 0.612000
Epoch 22, CIFAR-10 Batch 3:  Loss:     0.7147 Validation Accuracy: 0.607200
Epoch 22, CIFAR-10 Batch 4:  Loss:     0.7696 Validation Accuracy: 0.611600
Epoch 22, CIFAR-10 Batch 5:  Loss:     0.7886 Validation Accuracy: 0.609200
Epoch 23, CIFAR-10 Batch 1:  Loss:     0.8743 Validation Accuracy: 0.620400
Epoch 23, CIFAR-10 Batch 2:  Loss:     0.7851 Validation Accuracy: 0.616200
Epoch 23, CIFAR-10 Batch 3:  Loss:     0.7034 Validation Accuracy: 0.606000
Epoch 23, CIFAR-10 Batch 4:  Loss:     0.7830 Validation Accuracy: 0.622600
Epoch 23, CIFAR-10 Batch 5:  Loss:     0.7493 Validation Accuracy: 0.610800
Epoch 24, CIFAR-10 Batch 1:  Loss:     0.8696 Validation Accuracy: 0.625000
Epoch 24, CIFAR-10 Batch 2:  Loss:     0.7612 Validation Accuracy: 0.618000
Epoch 24, CIFAR-10 Batch 3:  Loss:     0.7069 Validation Accuracy: 0.619400
Epoch 24, CIFAR-10 Batch 4:  Loss:     0.7554 Validation Accuracy: 0.616000
Epoch 24, CIFAR-10 Batch 5:  Loss:     0.7852 Validation Accuracy: 0.620600
Epoch 25, CIFAR-10 Batch 1:  Loss:     0.8470 Validation Accuracy: 0.623000
Epoch 25, CIFAR-10 Batch 2:  Loss:     0.7628 Validation Accuracy: 0.617000
Epoch 25, CIFAR-10 Batch 3:  Loss:     0.6406 Validation Accuracy: 0.619400
Epoch 25, CIFAR-10 Batch 4:  Loss:     0.7220 Validation Accuracy: 0.615200
Epoch 25, CIFAR-10 Batch 5:  Loss:     0.7403 Validation Accuracy: 0.615400
Epoch 26, CIFAR-10 Batch 1:  Loss:     0.8324 Validation Accuracy: 0.626200
Epoch 26, CIFAR-10 Batch 2:  Loss:     0.7324 Validation Accuracy: 0.620400
Epoch 26, CIFAR-10 Batch 3:  Loss:     0.6675 Validation Accuracy: 0.617600
Epoch 26, CIFAR-10 Batch 4:  Loss:     0.7454 Validation Accuracy: 0.630600
Epoch 26, CIFAR-10 Batch 5:  Loss:     0.7301 Validation Accuracy: 0.622800
Epoch 27, CIFAR-10 Batch 1:  Loss:     0.8091 Validation Accuracy: 0.627200
Epoch 27, CIFAR-10 Batch 2:  Loss:     0.7343 Validation Accuracy: 0.624600
Epoch 27, CIFAR-10 Batch 3:  Loss:     0.6451 Validation Accuracy: 0.621000
Epoch 27, CIFAR-10 Batch 4:  Loss:     0.7134 Validation Accuracy: 0.626000
Epoch 27, CIFAR-10 Batch 5:  Loss:     0.7182 Validation Accuracy: 0.615000
Epoch 28, CIFAR-10 Batch 1:  Loss:     0.8018 Validation Accuracy: 0.620800
Epoch 28, CIFAR-10 Batch 2:  Loss:     0.7065 Validation Accuracy: 0.623200
Epoch 28, CIFAR-10 Batch 3:  Loss:     0.6448 Validation Accuracy: 0.628800
Epoch 28, CIFAR-10 Batch 4:  Loss:     0.7068 Validation Accuracy: 0.622000
Epoch 28, CIFAR-10 Batch 5:  Loss:     0.6949 Validation Accuracy: 0.616200
Epoch 29, CIFAR-10 Batch 1:  Loss:     0.7801 Validation Accuracy: 0.621600
Epoch 29, CIFAR-10 Batch 2:  Loss:     0.7140 Validation Accuracy: 0.625000
Epoch 29, CIFAR-10 Batch 3:  Loss:     0.6275 Validation Accuracy: 0.618600
Epoch 29, CIFAR-10 Batch 4:  Loss:     0.6989 Validation Accuracy: 0.627800
Epoch 29, CIFAR-10 Batch 5:  Loss:     0.6743 Validation Accuracy: 0.624000
Epoch 30, CIFAR-10 Batch 1:  Loss:     0.7563 Validation Accuracy: 0.634800
Epoch 30, CIFAR-10 Batch 2:  Loss:     0.6975 Validation Accuracy: 0.629000
Epoch 30, CIFAR-10 Batch 3:  Loss:     0.6334 Validation Accuracy: 0.627400
Epoch 30, CIFAR-10 Batch 4:  Loss:     0.7097 Validation Accuracy: 0.634600
Epoch 30, CIFAR-10 Batch 5:  Loss:     0.6690 Validation Accuracy: 0.631400
Epoch 31, CIFAR-10 Batch 1:  Loss:     0.7426 Validation Accuracy: 0.633800
Epoch 31, CIFAR-10 Batch 2:  Loss:     0.7109 Validation Accuracy: 0.631200
Epoch 31, CIFAR-10 Batch 3:  Loss:     0.6247 Validation Accuracy: 0.631000
Epoch 31, CIFAR-10 Batch 4:  Loss:     0.6407 Validation Accuracy: 0.635800
Epoch 31, CIFAR-10 Batch 5:  Loss:     0.6513 Validation Accuracy: 0.630400
Epoch 32, CIFAR-10 Batch 1:  Loss:     0.7515 Validation Accuracy: 0.630400
Epoch 32, CIFAR-10 Batch 2:  Loss:     0.6591 Validation Accuracy: 0.628000
Epoch 32, CIFAR-10 Batch 3:  Loss:     0.6325 Validation Accuracy: 0.623800
Epoch 32, CIFAR-10 Batch 4:  Loss:     0.6298 Validation Accuracy: 0.628200
Epoch 32, CIFAR-10 Batch 5:  Loss:     0.6525 Validation Accuracy: 0.634800
Epoch 33, CIFAR-10 Batch 1:  Loss:     0.7484 Validation Accuracy: 0.628800
Epoch 33, CIFAR-10 Batch 2:  Loss:     0.6757 Validation Accuracy: 0.630800
Epoch 33, CIFAR-10 Batch 3:  Loss:     0.6370 Validation Accuracy: 0.627200
Epoch 33, CIFAR-10 Batch 4:  Loss:     0.6069 Validation Accuracy: 0.631600
Epoch 33, CIFAR-10 Batch 5:  Loss:     0.6185 Validation Accuracy: 0.634400
Epoch 34, CIFAR-10 Batch 1:  Loss:     0.7342 Validation Accuracy: 0.640200
Epoch 34, CIFAR-10 Batch 2:  Loss:     0.6139 Validation Accuracy: 0.625800
Epoch 34, CIFAR-10 Batch 3:  Loss:     0.5830 Validation Accuracy: 0.635200
Epoch 34, CIFAR-10 Batch 4:  Loss:     0.6210 Validation Accuracy: 0.641200
Epoch 34, CIFAR-10 Batch 5:  Loss:     0.6186 Validation Accuracy: 0.627800
Epoch 35, CIFAR-10 Batch 1:  Loss:     0.7303 Validation Accuracy: 0.637600
Epoch 35, CIFAR-10 Batch 2:  Loss:     0.6006 Validation Accuracy: 0.631800
Epoch 35, CIFAR-10 Batch 3:  Loss:     0.5665 Validation Accuracy: 0.633400
Epoch 35, CIFAR-10 Batch 4:  Loss:     0.5878 Validation Accuracy: 0.637800
Epoch 35, CIFAR-10 Batch 5:  Loss:     0.6261 Validation Accuracy: 0.639800
Epoch 36, CIFAR-10 Batch 1:  Loss:     0.7397 Validation Accuracy: 0.631200
Epoch 36, CIFAR-10 Batch 2:  Loss:     0.6283 Validation Accuracy: 0.638000
Epoch 36, CIFAR-10 Batch 3:  Loss:     0.5933 Validation Accuracy: 0.626400
Epoch 36, CIFAR-10 Batch 4:  Loss:     0.5988 Validation Accuracy: 0.642600
Epoch 36, CIFAR-10 Batch 5:  Loss:     0.6412 Validation Accuracy: 0.631400
Epoch 37, CIFAR-10 Batch 1:  Loss:     0.7117 Validation Accuracy: 0.644200
Epoch 37, CIFAR-10 Batch 2:  Loss:     0.6052 Validation Accuracy: 0.633000
Epoch 37, CIFAR-10 Batch 3:  Loss:     0.6080 Validation Accuracy: 0.626000
Epoch 37, CIFAR-10 Batch 4:  Loss:     0.5761 Validation Accuracy: 0.641400
Epoch 37, CIFAR-10 Batch 5:  Loss:     0.6021 Validation Accuracy: 0.626800
Epoch 38, CIFAR-10 Batch 1:  Loss:     0.7082 Validation Accuracy: 0.641000
Epoch 38, CIFAR-10 Batch 2:  Loss:     0.5724 Validation Accuracy: 0.638200
Epoch 38, CIFAR-10 Batch 3:  Loss:     0.5391 Validation Accuracy: 0.639600
Epoch 38, CIFAR-10 Batch 4:  Loss:     0.5377 Validation Accuracy: 0.637800
Epoch 38, CIFAR-10 Batch 5:  Loss:     0.5884 Validation Accuracy: 0.637200
Epoch 39, CIFAR-10 Batch 1:  Loss:     0.6934 Validation Accuracy: 0.641400
Epoch 39, CIFAR-10 Batch 2:  Loss:     0.5815 Validation Accuracy: 0.637800
Epoch 39, CIFAR-10 Batch 3:  Loss:     0.5479 Validation Accuracy: 0.635600
Epoch 39, CIFAR-10 Batch 4:  Loss:     0.5343 Validation Accuracy: 0.641200
Epoch 39, CIFAR-10 Batch 5:  Loss:     0.5836 Validation Accuracy: 0.638000
Epoch 40, CIFAR-10 Batch 1:  Loss:     0.6722 Validation Accuracy: 0.636800
Epoch 40, CIFAR-10 Batch 2:  Loss:     0.5894 Validation Accuracy: 0.640600
Epoch 40, CIFAR-10 Batch 3:  Loss:     0.5298 Validation Accuracy: 0.633800
Epoch 40, CIFAR-10 Batch 4:  Loss:     0.5489 Validation Accuracy: 0.642000
Epoch 40, CIFAR-10 Batch 5:  Loss:     0.5584 Validation Accuracy: 0.634800
Epoch 41, CIFAR-10 Batch 1:  Loss:     0.6546 Validation Accuracy: 0.638800
Epoch 41, CIFAR-10 Batch 2:  Loss:     0.5567 Validation Accuracy: 0.645600
Epoch 41, CIFAR-10 Batch 3:  Loss:     0.5091 Validation Accuracy: 0.639000
Epoch 41, CIFAR-10 Batch 4:  Loss:     0.5376 Validation Accuracy: 0.643400
Epoch 41, CIFAR-10 Batch 5:  Loss:     0.5281 Validation Accuracy: 0.643600
Epoch 42, CIFAR-10 Batch 1:  Loss:     0.6584 Validation Accuracy: 0.640000
Epoch 42, CIFAR-10 Batch 2:  Loss:     0.5320 Validation Accuracy: 0.642000
Epoch 42, CIFAR-10 Batch 3:  Loss:     0.4977 Validation Accuracy: 0.639000
Epoch 42, CIFAR-10 Batch 4:  Loss:     0.5237 Validation Accuracy: 0.649000
Epoch 42, CIFAR-10 Batch 5:  Loss:     0.5382 Validation Accuracy: 0.637400
Epoch 43, CIFAR-10 Batch 1:  Loss:     0.6524 Validation Accuracy: 0.644200
Epoch 43, CIFAR-10 Batch 2:  Loss:     0.5174 Validation Accuracy: 0.642400
Epoch 43, CIFAR-10 Batch 3:  Loss:     0.4964 Validation Accuracy: 0.635000
Epoch 43, CIFAR-10 Batch 4:  Loss:     0.5337 Validation Accuracy: 0.641200
Epoch 43, CIFAR-10 Batch 5:  Loss:     0.4994 Validation Accuracy: 0.645000
Epoch 44, CIFAR-10 Batch 1:  Loss:     0.6323 Validation Accuracy: 0.644600
Epoch 44, CIFAR-10 Batch 2:  Loss:     0.5440 Validation Accuracy: 0.637200
Epoch 44, CIFAR-10 Batch 3:  Loss:     0.4852 Validation Accuracy: 0.638800
Epoch 44, CIFAR-10 Batch 4:  Loss:     0.5203 Validation Accuracy: 0.644200
Epoch 44, CIFAR-10 Batch 5:  Loss:     0.5262 Validation Accuracy: 0.639400
Epoch 45, CIFAR-10 Batch 1:  Loss:     0.6575 Validation Accuracy: 0.644000
Epoch 45, CIFAR-10 Batch 2:  Loss:     0.5377 Validation Accuracy: 0.643800
Epoch 45, CIFAR-10 Batch 3:  Loss:     0.4723 Validation Accuracy: 0.636600
Epoch 45, CIFAR-10 Batch 4:  Loss:     0.5169 Validation Accuracy: 0.643600
Epoch 45, CIFAR-10 Batch 5:  Loss:     0.4983 Validation Accuracy: 0.641600
Epoch 46, CIFAR-10 Batch 1:  Loss:     0.6373 Validation Accuracy: 0.645400
Epoch 46, CIFAR-10 Batch 2:  Loss:     0.5172 Validation Accuracy: 0.642400
Epoch 46, CIFAR-10 Batch 3:  Loss:     0.5020 Validation Accuracy: 0.634800
Epoch 46, CIFAR-10 Batch 4:  Loss:     0.5145 Validation Accuracy: 0.645400
Epoch 46, CIFAR-10 Batch 5:  Loss:     0.4852 Validation Accuracy: 0.646800
Epoch 47, CIFAR-10 Batch 1:  Loss:     0.6285 Validation Accuracy: 0.646000
Epoch 47, CIFAR-10 Batch 2:  Loss:     0.4900 Validation Accuracy: 0.648200
Epoch 47, CIFAR-10 Batch 3:  Loss:     0.4710 Validation Accuracy: 0.639800
Epoch 47, CIFAR-10 Batch 4:  Loss:     0.4824 Validation Accuracy: 0.650400
Epoch 47, CIFAR-10 Batch 5:  Loss:     0.4814 Validation Accuracy: 0.645600
Epoch 48, CIFAR-10 Batch 1:  Loss:     0.6225 Validation Accuracy: 0.644000
Epoch 48, CIFAR-10 Batch 2:  Loss:     0.5492 Validation Accuracy: 0.645200
Epoch 48, CIFAR-10 Batch 3:  Loss:     0.4890 Validation Accuracy: 0.641200
Epoch 48, CIFAR-10 Batch 4:  Loss:     0.4872 Validation Accuracy: 0.651200
Epoch 48, CIFAR-10 Batch 5:  Loss:     0.4782 Validation Accuracy: 0.646800
Epoch 49, CIFAR-10 Batch 1:  Loss:     0.5837 Validation Accuracy: 0.643200
Epoch 49, CIFAR-10 Batch 2:  Loss:     0.4839 Validation Accuracy: 0.639600
Epoch 49, CIFAR-10 Batch 3:  Loss:     0.4658 Validation Accuracy: 0.648400
Epoch 49, CIFAR-10 Batch 4:  Loss:     0.4995 Validation Accuracy: 0.650000
Epoch 49, CIFAR-10 Batch 5:  Loss:     0.4612 Validation Accuracy: 0.648200
Epoch 50, CIFAR-10 Batch 1:  Loss:     0.5956 Validation Accuracy: 0.653600
Epoch 50, CIFAR-10 Batch 2:  Loss:     0.4881 Validation Accuracy: 0.643200
Epoch 50, CIFAR-10 Batch 3:  Loss:     0.4515 Validation Accuracy: 0.645800
Epoch 50, CIFAR-10 Batch 4:  Loss:     0.4609 Validation Accuracy: 0.651000
Epoch 50, CIFAR-10 Batch 5:  Loss:     0.4584 Validation Accuracy: 0.651200
Epoch 51, CIFAR-10 Batch 1:  Loss:     0.5925 Validation Accuracy: 0.653200
Epoch 51, CIFAR-10 Batch 2:  Loss:     0.4439 Validation Accuracy: 0.643600
Epoch 51, CIFAR-10 Batch 3:  Loss:     0.4506 Validation Accuracy: 0.644800
Epoch 51, CIFAR-10 Batch 4:  Loss:     0.4872 Validation Accuracy: 0.654800
Epoch 51, CIFAR-10 Batch 5:  Loss:     0.4670 Validation Accuracy: 0.647800
Epoch 52, CIFAR-10 Batch 1:  Loss:     0.5724 Validation Accuracy: 0.649200
Epoch 52, CIFAR-10 Batch 2:  Loss:     0.4517 Validation Accuracy: 0.640200
Epoch 52, CIFAR-10 Batch 3:  Loss:     0.4683 Validation Accuracy: 0.639000
Epoch 52, CIFAR-10 Batch 4:  Loss:     0.4718 Validation Accuracy: 0.651600
Epoch 52, CIFAR-10 Batch 5:  Loss:     0.4751 Validation Accuracy: 0.652600
Epoch 53, CIFAR-10 Batch 1:  Loss:     0.5728 Validation Accuracy: 0.652400
Epoch 53, CIFAR-10 Batch 2:  Loss:     0.4734 Validation Accuracy: 0.644400
Epoch 53, CIFAR-10 Batch 3:  Loss:     0.4563 Validation Accuracy: 0.642200
Epoch 53, CIFAR-10 Batch 4:  Loss:     0.4397 Validation Accuracy: 0.651200
Epoch 53, CIFAR-10 Batch 5:  Loss:     0.4862 Validation Accuracy: 0.644800
Epoch 54, CIFAR-10 Batch 1:  Loss:     0.5771 Validation Accuracy: 0.650000
Epoch 54, CIFAR-10 Batch 2:  Loss:     0.4605 Validation Accuracy: 0.643200
Epoch 54, CIFAR-10 Batch 3:  Loss:     0.4432 Validation Accuracy: 0.640400
Epoch 54, CIFAR-10 Batch 4:  Loss:     0.4420 Validation Accuracy: 0.655000
Epoch 54, CIFAR-10 Batch 5:  Loss:     0.4245 Validation Accuracy: 0.656000
Epoch 55, CIFAR-10 Batch 1:  Loss:     0.5591 Validation Accuracy: 0.654600
Epoch 55, CIFAR-10 Batch 2:  Loss:     0.4523 Validation Accuracy: 0.646400
Epoch 55, CIFAR-10 Batch 3:  Loss:     0.4346 Validation Accuracy: 0.644600
Epoch 55, CIFAR-10 Batch 4:  Loss:     0.4273 Validation Accuracy: 0.652800
Epoch 55, CIFAR-10 Batch 5:  Loss:     0.4523 Validation Accuracy: 0.649200
Epoch 56, CIFAR-10 Batch 1:  Loss:     0.5757 Validation Accuracy: 0.648200
Epoch 56, CIFAR-10 Batch 2:  Loss:     0.4333 Validation Accuracy: 0.644800
Epoch 56, CIFAR-10 Batch 3:  Loss:     0.4360 Validation Accuracy: 0.649200
Epoch 56, CIFAR-10 Batch 4:  Loss:     0.4324 Validation Accuracy: 0.650200
Epoch 56, CIFAR-10 Batch 5:  Loss:     0.4469 Validation Accuracy: 0.649400
Epoch 57, CIFAR-10 Batch 1:  Loss:     0.5766 Validation Accuracy: 0.645400
Epoch 57, CIFAR-10 Batch 2:  Loss:     0.4449 Validation Accuracy: 0.648000
Epoch 57, CIFAR-10 Batch 3:  Loss:     0.4439 Validation Accuracy: 0.646600
Epoch 57, CIFAR-10 Batch 4:  Loss:     0.4109 Validation Accuracy: 0.655000
Epoch 57, CIFAR-10 Batch 5:  Loss:     0.4422 Validation Accuracy: 0.647600
Epoch 58, CIFAR-10 Batch 1:  Loss:     0.5493 Validation Accuracy: 0.659000
Epoch 58, CIFAR-10 Batch 2:  Loss:     0.4468 Validation Accuracy: 0.652000
Epoch 58, CIFAR-10 Batch 3:  Loss:     0.4403 Validation Accuracy: 0.649800
Epoch 58, CIFAR-10 Batch 4:  Loss:     0.4169 Validation Accuracy: 0.658400
Epoch 58, CIFAR-10 Batch 5:  Loss:     0.4285 Validation Accuracy: 0.656400
Epoch 59, CIFAR-10 Batch 1:  Loss:     0.5400 Validation Accuracy: 0.651200
Epoch 59, CIFAR-10 Batch 2:  Loss:     0.4500 Validation Accuracy: 0.647800
Epoch 59, CIFAR-10 Batch 3:  Loss:     0.4202 Validation Accuracy: 0.644800
Epoch 59, CIFAR-10 Batch 4:  Loss:     0.4188 Validation Accuracy: 0.650800
Epoch 59, CIFAR-10 Batch 5:  Loss:     0.3942 Validation Accuracy: 0.649600
Epoch 60, CIFAR-10 Batch 1:  Loss:     0.5379 Validation Accuracy: 0.657800
Epoch 60, CIFAR-10 Batch 2:  Loss:     0.4325 Validation Accuracy: 0.645000
Epoch 60, CIFAR-10 Batch 3:  Loss:     0.4193 Validation Accuracy: 0.649800
Epoch 60, CIFAR-10 Batch 4:  Loss:     0.3931 Validation Accuracy: 0.651000
Epoch 60, CIFAR-10 Batch 5:  Loss:     0.4111 Validation Accuracy: 0.655800
Epoch 61, CIFAR-10 Batch 1:  Loss:     0.5222 Validation Accuracy: 0.657200
Epoch 61, CIFAR-10 Batch 2:  Loss:     0.4381 Validation Accuracy: 0.649400
Epoch 61, CIFAR-10 Batch 3:  Loss:     0.4368 Validation Accuracy: 0.651000
Epoch 61, CIFAR-10 Batch 4:  Loss:     0.3982 Validation Accuracy: 0.654600
Epoch 61, CIFAR-10 Batch 5:  Loss:     0.4079 Validation Accuracy: 0.649800
Epoch 62, CIFAR-10 Batch 1:  Loss:     0.5331 Validation Accuracy: 0.655000
Epoch 62, CIFAR-10 Batch 2:  Loss:     0.4518 Validation Accuracy: 0.648800
Epoch 62, CIFAR-10 Batch 3:  Loss:     0.4223 Validation Accuracy: 0.648000
Epoch 62, CIFAR-10 Batch 4:  Loss:     0.3964 Validation Accuracy: 0.657800
Epoch 62, CIFAR-10 Batch 5:  Loss:     0.4037 Validation Accuracy: 0.650600
Epoch 63, CIFAR-10 Batch 1:  Loss:     0.5108 Validation Accuracy: 0.657000
Epoch 63, CIFAR-10 Batch 2:  Loss:     0.4499 Validation Accuracy: 0.652600
Epoch 63, CIFAR-10 Batch 3:  Loss:     0.3676 Validation Accuracy: 0.651400
Epoch 63, CIFAR-10 Batch 4:  Loss:     0.3977 Validation Accuracy: 0.654800
Epoch 63, CIFAR-10 Batch 5:  Loss:     0.3714 Validation Accuracy: 0.655000
Epoch 64, CIFAR-10 Batch 1:  Loss:     0.5234 Validation Accuracy: 0.657000
Epoch 64, CIFAR-10 Batch 2:  Loss:     0.4206 Validation Accuracy: 0.656400
Epoch 64, CIFAR-10 Batch 3:  Loss:     0.3856 Validation Accuracy: 0.655400
Epoch 64, CIFAR-10 Batch 4:  Loss:     0.3733 Validation Accuracy: 0.655600
Epoch 64, CIFAR-10 Batch 5:  Loss:     0.3695 Validation Accuracy: 0.655600
Epoch 65, CIFAR-10 Batch 1:  Loss:     0.5083 Validation Accuracy: 0.652600
Epoch 65, CIFAR-10 Batch 2:  Loss:     0.3917 Validation Accuracy: 0.652600
Epoch 65, CIFAR-10 Batch 3:  Loss:     0.3888 Validation Accuracy: 0.650000
Epoch 65, CIFAR-10 Batch 4:  Loss:     0.3913 Validation Accuracy: 0.654400
Epoch 65, CIFAR-10 Batch 5:  Loss:     0.3856 Validation Accuracy: 0.655400
Epoch 66, CIFAR-10 Batch 1:  Loss:     0.5089 Validation Accuracy: 0.657800
Epoch 66, CIFAR-10 Batch 2:  Loss:     0.3990 Validation Accuracy: 0.652800
Epoch 66, CIFAR-10 Batch 3:  Loss:     0.3752 Validation Accuracy: 0.654000
Epoch 66, CIFAR-10 Batch 4:  Loss:     0.3742 Validation Accuracy: 0.662000
Epoch 66, CIFAR-10 Batch 5:  Loss:     0.3824 Validation Accuracy: 0.657200
Epoch 67, CIFAR-10 Batch 1:  Loss:     0.4884 Validation Accuracy: 0.655200
Epoch 67, CIFAR-10 Batch 2:  Loss:     0.3925 Validation Accuracy: 0.655200
Epoch 67, CIFAR-10 Batch 3:  Loss:     0.3708 Validation Accuracy: 0.657400
Epoch 67, CIFAR-10 Batch 4:  Loss:     0.3733 Validation Accuracy: 0.653000
Epoch 67, CIFAR-10 Batch 5:  Loss:     0.3751 Validation Accuracy: 0.649800
Epoch 68, CIFAR-10 Batch 1:  Loss:     0.5118 Validation Accuracy: 0.662200
Epoch 68, CIFAR-10 Batch 2:  Loss:     0.4003 Validation Accuracy: 0.650600
Epoch 68, CIFAR-10 Batch 3:  Loss:     0.3745 Validation Accuracy: 0.647600
Epoch 68, CIFAR-10 Batch 4:  Loss:     0.3723 Validation Accuracy: 0.656400
Epoch 68, CIFAR-10 Batch 5:  Loss:     0.3643 Validation Accuracy: 0.654200
Epoch 69, CIFAR-10 Batch 1:  Loss:     0.4904 Validation Accuracy: 0.656800
Epoch 69, CIFAR-10 Batch 2:  Loss:     0.4018 Validation Accuracy: 0.650800
Epoch 69, CIFAR-10 Batch 3:  Loss:     0.3680 Validation Accuracy: 0.657400
Epoch 69, CIFAR-10 Batch 4:  Loss:     0.3702 Validation Accuracy: 0.651200
Epoch 69, CIFAR-10 Batch 5:  Loss:     0.3647 Validation Accuracy: 0.655000
Epoch 70, CIFAR-10 Batch 1:  Loss:     0.5057 Validation Accuracy: 0.657800
Epoch 70, CIFAR-10 Batch 2:  Loss:     0.3871 Validation Accuracy: 0.650600
Epoch 70, CIFAR-10 Batch 3:  Loss:     0.3482 Validation Accuracy: 0.652600
Epoch 70, CIFAR-10 Batch 4:  Loss:     0.3451 Validation Accuracy: 0.655600
Epoch 70, CIFAR-10 Batch 5:  Loss:     0.3528 Validation Accuracy: 0.658000
Epoch 71, CIFAR-10 Batch 1:  Loss:     0.4763 Validation Accuracy: 0.656800
Epoch 71, CIFAR-10 Batch 2:  Loss:     0.4015 Validation Accuracy: 0.647200
Epoch 71, CIFAR-10 Batch 3:  Loss:     0.3492 Validation Accuracy: 0.648600
Epoch 71, CIFAR-10 Batch 4:  Loss:     0.3592 Validation Accuracy: 0.658400
Epoch 71, CIFAR-10 Batch 5:  Loss:     0.3803 Validation Accuracy: 0.659200
Epoch 72, CIFAR-10 Batch 1:  Loss:     0.4515 Validation Accuracy: 0.656000
Epoch 72, CIFAR-10 Batch 2:  Loss:     0.3934 Validation Accuracy: 0.650000
Epoch 72, CIFAR-10 Batch 3:  Loss:     0.3622 Validation Accuracy: 0.655600
Epoch 72, CIFAR-10 Batch 4:  Loss:     0.3816 Validation Accuracy: 0.657600
Epoch 72, CIFAR-10 Batch 5:  Loss:     0.3586 Validation Accuracy: 0.655600
Epoch 73, CIFAR-10 Batch 1:  Loss:     0.4599 Validation Accuracy: 0.657200
Epoch 73, CIFAR-10 Batch 2:  Loss:     0.3816 Validation Accuracy: 0.655400
Epoch 73, CIFAR-10 Batch 3:  Loss:     0.3621 Validation Accuracy: 0.653800
Epoch 73, CIFAR-10 Batch 4:  Loss:     0.3449 Validation Accuracy: 0.650000
Epoch 73, CIFAR-10 Batch 5:  Loss:     0.3682 Validation Accuracy: 0.651800
Epoch 74, CIFAR-10 Batch 1:  Loss:     0.4574 Validation Accuracy: 0.658600
Epoch 74, CIFAR-10 Batch 2:  Loss:     0.3560 Validation Accuracy: 0.653600
Epoch 74, CIFAR-10 Batch 3:  Loss:     0.3424 Validation Accuracy: 0.658800
Epoch 74, CIFAR-10 Batch 4:  Loss:     0.3646 Validation Accuracy: 0.658800
Epoch 74, CIFAR-10 Batch 5:  Loss:     0.3645 Validation Accuracy: 0.657200
Epoch 75, CIFAR-10 Batch 1:  Loss:     0.4742 Validation Accuracy: 0.663400
Epoch 75, CIFAR-10 Batch 2:  Loss:     0.3629 Validation Accuracy: 0.654600
Epoch 75, CIFAR-10 Batch 3:  Loss:     0.3660 Validation Accuracy: 0.655600
Epoch 75, CIFAR-10 Batch 4:  Loss:     0.3261 Validation Accuracy: 0.655400
Epoch 75, CIFAR-10 Batch 5:  Loss:     0.3945 Validation Accuracy: 0.654600
Epoch 76, CIFAR-10 Batch 1:  Loss:     0.4314 Validation Accuracy: 0.661000
Epoch 76, CIFAR-10 Batch 2:  Loss:     0.3361 Validation Accuracy: 0.654600
Epoch 76, CIFAR-10 Batch 3:  Loss:     0.3294 Validation Accuracy: 0.660000
Epoch 76, CIFAR-10 Batch 4:  Loss:     0.3303 Validation Accuracy: 0.658000
Epoch 76, CIFAR-10 Batch 5:  Loss:     0.3719 Validation Accuracy: 0.657400
Epoch 77, CIFAR-10 Batch 1:  Loss:     0.4726 Validation Accuracy: 0.654400
Epoch 77, CIFAR-10 Batch 2:  Loss:     0.3661 Validation Accuracy: 0.656600
Epoch 77, CIFAR-10 Batch 3:  Loss:     0.3597 Validation Accuracy: 0.656400
Epoch 77, CIFAR-10 Batch 4:  Loss:     0.3283 Validation Accuracy: 0.652000
Epoch 77, CIFAR-10 Batch 5:  Loss:     0.3866 Validation Accuracy: 0.657000
Epoch 78, CIFAR-10 Batch 1:  Loss:     0.4654 Validation Accuracy: 0.661600
Epoch 78, CIFAR-10 Batch 2:  Loss:     0.3516 Validation Accuracy: 0.662600
Epoch 78, CIFAR-10 Batch 3:  Loss:     0.3473 Validation Accuracy: 0.659200
Epoch 78, CIFAR-10 Batch 4:  Loss:     0.3280 Validation Accuracy: 0.659600
Epoch 78, CIFAR-10 Batch 5:  Loss:     0.3775 Validation Accuracy: 0.661600
Epoch 79, CIFAR-10 Batch 1:  Loss:     0.4575 Validation Accuracy: 0.656800
Epoch 79, CIFAR-10 Batch 2:  Loss:     0.3691 Validation Accuracy: 0.659800
Epoch 79, CIFAR-10 Batch 3:  Loss:     0.3474 Validation Accuracy: 0.655600
Epoch 79, CIFAR-10 Batch 4:  Loss:     0.3433 Validation Accuracy: 0.665000
Epoch 79, CIFAR-10 Batch 5:  Loss:     0.3525 Validation Accuracy: 0.654400
Epoch 80, CIFAR-10 Batch 1:  Loss:     0.4598 Validation Accuracy: 0.657600
Epoch 80, CIFAR-10 Batch 2:  Loss:     0.3599 Validation Accuracy: 0.654000
Epoch 80, CIFAR-10 Batch 3:  Loss:     0.3435 Validation Accuracy: 0.652000
Epoch 80, CIFAR-10 Batch 4:  Loss:     0.3151 Validation Accuracy: 0.658800
Epoch 80, CIFAR-10 Batch 5:  Loss:     0.3523 Validation Accuracy: 0.649400
Epoch 81, CIFAR-10 Batch 1:  Loss:     0.4371 Validation Accuracy: 0.660600
Epoch 81, CIFAR-10 Batch 2:  Loss:     0.3415 Validation Accuracy: 0.655200
Epoch 81, CIFAR-10 Batch 3:  Loss:     0.3478 Validation Accuracy: 0.648800
Epoch 81, CIFAR-10 Batch 4:  Loss:     0.3293 Validation Accuracy: 0.655600
Epoch 81, CIFAR-10 Batch 5:  Loss:     0.3459 Validation Accuracy: 0.656200
Epoch 82, CIFAR-10 Batch 1:  Loss:     0.4343 Validation Accuracy: 0.656600
Epoch 82, CIFAR-10 Batch 2:  Loss:     0.3476 Validation Accuracy: 0.661600
Epoch 82, CIFAR-10 Batch 3:  Loss:     0.3406 Validation Accuracy: 0.652400
Epoch 82, CIFAR-10 Batch 4:  Loss:     0.3033 Validation Accuracy: 0.656600
Epoch 82, CIFAR-10 Batch 5:  Loss:     0.3623 Validation Accuracy: 0.661400
Epoch 83, CIFAR-10 Batch 1:  Loss:     0.4315 Validation Accuracy: 0.660000
Epoch 83, CIFAR-10 Batch 2:  Loss:     0.3522 Validation Accuracy: 0.656200
Epoch 83, CIFAR-10 Batch 3:  Loss:     0.3294 Validation Accuracy: 0.663800
Epoch 83, CIFAR-10 Batch 4:  Loss:     0.3084 Validation Accuracy: 0.655200
Epoch 83, CIFAR-10 Batch 5:  Loss:     0.3356 Validation Accuracy: 0.662800
Epoch 84, CIFAR-10 Batch 1:  Loss:     0.4287 Validation Accuracy: 0.654800
Epoch 84, CIFAR-10 Batch 2:  Loss:     0.3514 Validation Accuracy: 0.659200
Epoch 84, CIFAR-10 Batch 3:  Loss:     0.3217 Validation Accuracy: 0.661800
Epoch 84, CIFAR-10 Batch 4:  Loss:     0.3218 Validation Accuracy: 0.649200
Epoch 84, CIFAR-10 Batch 5:  Loss:     0.3436 Validation Accuracy: 0.655000
Epoch 85, CIFAR-10 Batch 1:  Loss:     0.4306 Validation Accuracy: 0.662800
Epoch 85, CIFAR-10 Batch 2:  Loss:     0.3263 Validation Accuracy: 0.655200
Epoch 85, CIFAR-10 Batch 3:  Loss:     0.3312 Validation Accuracy: 0.656200
Epoch 85, CIFAR-10 Batch 4:  Loss:     0.3324 Validation Accuracy: 0.654600
Epoch 85, CIFAR-10 Batch 5:  Loss:     0.3128 Validation Accuracy: 0.656000
Epoch 86, CIFAR-10 Batch 1:  Loss:     0.4156 Validation Accuracy: 0.656000
Epoch 86, CIFAR-10 Batch 2:  Loss:     0.3606 Validation Accuracy: 0.656400
Epoch 86, CIFAR-10 Batch 3:  Loss:     0.3016 Validation Accuracy: 0.657400
Epoch 86, CIFAR-10 Batch 4:  Loss:     0.3136 Validation Accuracy: 0.653400
Epoch 86, CIFAR-10 Batch 5:  Loss:     0.3328 Validation Accuracy: 0.654000
Epoch 87, CIFAR-10 Batch 1:  Loss:     0.4101 Validation Accuracy: 0.654600
Epoch 87, CIFAR-10 Batch 2:  Loss:     0.3200 Validation Accuracy: 0.648400
Epoch 87, CIFAR-10 Batch 3:  Loss:     0.3454 Validation Accuracy: 0.658400
Epoch 87, CIFAR-10 Batch 4:  Loss:     0.3219 Validation Accuracy: 0.654200
Epoch 87, CIFAR-10 Batch 5:  Loss:     0.3272 Validation Accuracy: 0.659200
Epoch 88, CIFAR-10 Batch 1:  Loss:     0.4109 Validation Accuracy: 0.666400
Epoch 88, CIFAR-10 Batch 2:  Loss:     0.3198 Validation Accuracy: 0.661000
Epoch 88, CIFAR-10 Batch 3:  Loss:     0.3312 Validation Accuracy: 0.650800
Epoch 88, CIFAR-10 Batch 4:  Loss:     0.3168 Validation Accuracy: 0.668600
Epoch 88, CIFAR-10 Batch 5:  Loss:     0.3380 Validation Accuracy: 0.657000
Epoch 89, CIFAR-10 Batch 1:  Loss:     0.3960 Validation Accuracy: 0.662400
Epoch 89, CIFAR-10 Batch 2:  Loss:     0.3107 Validation Accuracy: 0.662000
Epoch 89, CIFAR-10 Batch 3:  Loss:     0.2973 Validation Accuracy: 0.659200
Epoch 89, CIFAR-10 Batch 4:  Loss:     0.3062 Validation Accuracy: 0.659200
Epoch 89, CIFAR-10 Batch 5:  Loss:     0.3154 Validation Accuracy: 0.663200
Epoch 90, CIFAR-10 Batch 1:  Loss:     0.4221 Validation Accuracy: 0.657400
Epoch 90, CIFAR-10 Batch 2:  Loss:     0.3250 Validation Accuracy: 0.656400
Epoch 90, CIFAR-10 Batch 3:  Loss:     0.3303 Validation Accuracy: 0.655000
Epoch 90, CIFAR-10 Batch 4:  Loss:     0.2967 Validation Accuracy: 0.657200
Epoch 90, CIFAR-10 Batch 5:  Loss:     0.3409 Validation Accuracy: 0.655600
Epoch 91, CIFAR-10 Batch 1:  Loss:     0.3733 Validation Accuracy: 0.659800
Epoch 91, CIFAR-10 Batch 2:  Loss:     0.3313 Validation Accuracy: 0.658600
Epoch 91, CIFAR-10 Batch 3:  Loss:     0.3094 Validation Accuracy: 0.660200
Epoch 91, CIFAR-10 Batch 4:  Loss:     0.2990 Validation Accuracy: 0.664000
Epoch 91, CIFAR-10 Batch 5:  Loss:     0.3166 Validation Accuracy: 0.649600
Epoch 92, CIFAR-10 Batch 1:  Loss:     0.3992 Validation Accuracy: 0.665000
Epoch 92, CIFAR-10 Batch 2:  Loss:     0.3219 Validation Accuracy: 0.658800
Epoch 92, CIFAR-10 Batch 3:  Loss:     0.3182 Validation Accuracy: 0.655200
Epoch 92, CIFAR-10 Batch 4:  Loss:     0.2839 Validation Accuracy: 0.663200
Epoch 92, CIFAR-10 Batch 5:  Loss:     0.2903 Validation Accuracy: 0.655000
Epoch 93, CIFAR-10 Batch 1:  Loss:     0.3895 Validation Accuracy: 0.658400
Epoch 93, CIFAR-10 Batch 2:  Loss:     0.3498 Validation Accuracy: 0.658400
Epoch 93, CIFAR-10 Batch 3:  Loss:     0.3236 Validation Accuracy: 0.660800
Epoch 93, CIFAR-10 Batch 4:  Loss:     0.3063 Validation Accuracy: 0.660400
Epoch 93, CIFAR-10 Batch 5:  Loss:     0.3014 Validation Accuracy: 0.659400
Epoch 94, CIFAR-10 Batch 1:  Loss:     0.4026 Validation Accuracy: 0.660800
Epoch 94, CIFAR-10 Batch 2:  Loss:     0.3233 Validation Accuracy: 0.661800
Epoch 94, CIFAR-10 Batch 3:  Loss:     0.3182 Validation Accuracy: 0.662600
Epoch 94, CIFAR-10 Batch 4:  Loss:     0.2754 Validation Accuracy: 0.650400
Epoch 94, CIFAR-10 Batch 5:  Loss:     0.3445 Validation Accuracy: 0.661200
Epoch 95, CIFAR-10 Batch 1:  Loss:     0.3799 Validation Accuracy: 0.660200
Epoch 95, CIFAR-10 Batch 2:  Loss:     0.3237 Validation Accuracy: 0.657800
Epoch 95, CIFAR-10 Batch 3:  Loss:     0.3249 Validation Accuracy: 0.660400
Epoch 95, CIFAR-10 Batch 4:  Loss:     0.3011 Validation Accuracy: 0.654000
Epoch 95, CIFAR-10 Batch 5:  Loss:     0.3114 Validation Accuracy: 0.660200
Epoch 96, CIFAR-10 Batch 1:  Loss:     0.3852 Validation Accuracy: 0.663000
Epoch 96, CIFAR-10 Batch 2:  Loss:     0.3297 Validation Accuracy: 0.662200
Epoch 96, CIFAR-10 Batch 3:  Loss:     0.3103 Validation Accuracy: 0.658800
Epoch 96, CIFAR-10 Batch 4:  Loss:     0.3246 Validation Accuracy: 0.658600
Epoch 96, CIFAR-10 Batch 5:  Loss:     0.3046 Validation Accuracy: 0.657600
Epoch 97, CIFAR-10 Batch 1:  Loss:     0.3849 Validation Accuracy: 0.660800
Epoch 97, CIFAR-10 Batch 2:  Loss:     0.3030 Validation Accuracy: 0.657200
Epoch 97, CIFAR-10 Batch 3:  Loss:     0.3002 Validation Accuracy: 0.662800
Epoch 97, CIFAR-10 Batch 4:  Loss:     0.2958 Validation Accuracy: 0.661200
Epoch 97, CIFAR-10 Batch 5:  Loss:     0.2962 Validation Accuracy: 0.658800
Epoch 98, CIFAR-10 Batch 1:  Loss:     0.3788 Validation Accuracy: 0.661200
Epoch 98, CIFAR-10 Batch 2:  Loss:     0.3069 Validation Accuracy: 0.657400
Epoch 98, CIFAR-10 Batch 3:  Loss:     0.2910 Validation Accuracy: 0.662800
Epoch 98, CIFAR-10 Batch 4:  Loss:     0.3062 Validation Accuracy: 0.669800
Epoch 98, CIFAR-10 Batch 5:  Loss:     0.3035 Validation Accuracy: 0.659600
Epoch 99, CIFAR-10 Batch 1:  Loss:     0.3799 Validation Accuracy: 0.661000
Epoch 99, CIFAR-10 Batch 2:  Loss:     0.3309 Validation Accuracy: 0.657800
Epoch 99, CIFAR-10 Batch 3:  Loss:     0.2997 Validation Accuracy: 0.653000
Epoch 99, CIFAR-10 Batch 4:  Loss:     0.3129 Validation Accuracy: 0.659600
Epoch 99, CIFAR-10 Batch 5:  Loss:     0.3104 Validation Accuracy: 0.654000
Epoch 100, CIFAR-10 Batch 1:  Loss:     0.4037 Validation Accuracy: 0.662600
Epoch 100, CIFAR-10 Batch 2:  Loss:     0.3551 Validation Accuracy: 0.664000
Epoch 100, CIFAR-10 Batch 3:  Loss:     0.3024 Validation Accuracy: 0.659000
Epoch 100, CIFAR-10 Batch 4:  Loss:     0.3108 Validation Accuracy: 0.659400
Epoch 100, CIFAR-10 Batch 5:  Loss:     0.2868 Validation Accuracy: 0.659600
Epoch 101, CIFAR-10 Batch 1:  Loss:     0.3925 Validation Accuracy: 0.655200
Epoch 101, CIFAR-10 Batch 2:  Loss:     0.3229 Validation Accuracy: 0.659000
Epoch 101, CIFAR-10 Batch 3:  Loss:     0.3334 Validation Accuracy: 0.663600
Epoch 101, CIFAR-10 Batch 4:  Loss:     0.2781 Validation Accuracy: 0.662400
Epoch 101, CIFAR-10 Batch 5:  Loss:     0.2855 Validation Accuracy: 0.670000
Epoch 102, CIFAR-10 Batch 1:  Loss:     0.3651 Validation Accuracy: 0.668800
Epoch 102, CIFAR-10 Batch 2:  Loss:     0.3096 Validation Accuracy: 0.665000
Epoch 102, CIFAR-10 Batch 3:  Loss:     0.3006 Validation Accuracy: 0.664600
Epoch 102, CIFAR-10 Batch 4:  Loss:     0.2900 Validation Accuracy: 0.658400
Epoch 102, CIFAR-10 Batch 5:  Loss:     0.3149 Validation Accuracy: 0.660400
Epoch 103, CIFAR-10 Batch 1:  Loss:     0.3760 Validation Accuracy: 0.665400
Epoch 103, CIFAR-10 Batch 2:  Loss:     0.3424 Validation Accuracy: 0.664800
Epoch 103, CIFAR-10 Batch 3:  Loss:     0.2953 Validation Accuracy: 0.665200
Epoch 103, CIFAR-10 Batch 4:  Loss:     0.2610 Validation Accuracy: 0.657000
Epoch 103, CIFAR-10 Batch 5:  Loss:     0.3254 Validation Accuracy: 0.659800
Epoch 104, CIFAR-10 Batch 1:  Loss:     0.3745 Validation Accuracy: 0.663800
Epoch 104, CIFAR-10 Batch 2:  Loss:     0.3254 Validation Accuracy: 0.659800
Epoch 104, CIFAR-10 Batch 3:  Loss:     0.3058 Validation Accuracy: 0.653400
Epoch 104, CIFAR-10 Batch 4:  Loss:     0.2626 Validation Accuracy: 0.656800
Epoch 104, CIFAR-10 Batch 5:  Loss:     0.2932 Validation Accuracy: 0.662800
Epoch 105, CIFAR-10 Batch 1:  Loss:     0.3726 Validation Accuracy: 0.661600
Epoch 105, CIFAR-10 Batch 2:  Loss:     0.2891 Validation Accuracy: 0.657600
Epoch 105, CIFAR-10 Batch 3:  Loss:     0.3040 Validation Accuracy: 0.661600
Epoch 105, CIFAR-10 Batch 4:  Loss:     0.2618 Validation Accuracy: 0.662000
Epoch 105, CIFAR-10 Batch 5:  Loss:     0.3000 Validation Accuracy: 0.659600
Epoch 106, CIFAR-10 Batch 1:  Loss:     0.3573 Validation Accuracy: 0.657600
Epoch 106, CIFAR-10 Batch 2:  Loss:     0.3092 Validation Accuracy: 0.657400
Epoch 106, CIFAR-10 Batch 3:  Loss:     0.2916 Validation Accuracy: 0.656600
Epoch 106, CIFAR-10 Batch 4:  Loss:     0.2886 Validation Accuracy: 0.656400
Epoch 106, CIFAR-10 Batch 5:  Loss:     0.2979 Validation Accuracy: 0.655800
Epoch 107, CIFAR-10 Batch 1:  Loss:     0.3596 Validation Accuracy: 0.666600
Epoch 107, CIFAR-10 Batch 2:  Loss:     0.3259 Validation Accuracy: 0.658600
Epoch 107, CIFAR-10 Batch 3:  Loss:     0.2921 Validation Accuracy: 0.656800
Epoch 107, CIFAR-10 Batch 4:  Loss:     0.2752 Validation Accuracy: 0.657600
Epoch 107, CIFAR-10 Batch 5:  Loss:     0.3049 Validation Accuracy: 0.660400
Epoch 108, CIFAR-10 Batch 1:  Loss:     0.3672 Validation Accuracy: 0.660200
Epoch 108, CIFAR-10 Batch 2:  Loss:     0.3163 Validation Accuracy: 0.663400
Epoch 108, CIFAR-10 Batch 3:  Loss:     0.3267 Validation Accuracy: 0.651800
Epoch 108, CIFAR-10 Batch 4:  Loss:     0.2703 Validation Accuracy: 0.654800
Epoch 108, CIFAR-10 Batch 5:  Loss:     0.2880 Validation Accuracy: 0.658800
Epoch 109, CIFAR-10 Batch 1:  Loss:     0.3460 Validation Accuracy: 0.662200
Epoch 109, CIFAR-10 Batch 2:  Loss:     0.2965 Validation Accuracy: 0.660600
Epoch 109, CIFAR-10 Batch 3:  Loss:     0.2932 Validation Accuracy: 0.657200
Epoch 109, CIFAR-10 Batch 4:  Loss:     0.2822 Validation Accuracy: 0.658200
Epoch 109, CIFAR-10 Batch 5:  Loss:     0.2923 Validation Accuracy: 0.662000
Epoch 110, CIFAR-10 Batch 1:  Loss:     0.3651 Validation Accuracy: 0.659200
Epoch 110, CIFAR-10 Batch 2:  Loss:     0.3017 Validation Accuracy: 0.662600
Epoch 110, CIFAR-10 Batch 3:  Loss:     0.3039 Validation Accuracy: 0.651400
Epoch 110, CIFAR-10 Batch 4:  Loss:     0.2800 Validation Accuracy: 0.656400
Epoch 110, CIFAR-10 Batch 5:  Loss:     0.2862 Validation Accuracy: 0.657400
Epoch 111, CIFAR-10 Batch 1:  Loss:     0.3680 Validation Accuracy: 0.665400
Epoch 111, CIFAR-10 Batch 2:  Loss:     0.2895 Validation Accuracy: 0.659200
Epoch 111, CIFAR-10 Batch 3:  Loss:     0.2810 Validation Accuracy: 0.660600
Epoch 111, CIFAR-10 Batch 4:  Loss:     0.2681 Validation Accuracy: 0.665000
Epoch 111, CIFAR-10 Batch 5:  Loss:     0.3151 Validation Accuracy: 0.662000
Epoch 112, CIFAR-10 Batch 1:  Loss:     0.3509 Validation Accuracy: 0.669800
Epoch 112, CIFAR-10 Batch 2:  Loss:     0.3033 Validation Accuracy: 0.663200
Epoch 112, CIFAR-10 Batch 3:  Loss:     0.3061 Validation Accuracy: 0.660000
Epoch 112, CIFAR-10 Batch 4:  Loss:     0.2738 Validation Accuracy: 0.666800
Epoch 112, CIFAR-10 Batch 5:  Loss:     0.2846 Validation Accuracy: 0.666800
Epoch 113, CIFAR-10 Batch 1:  Loss:     0.3357 Validation Accuracy: 0.661200
Epoch 113, CIFAR-10 Batch 2:  Loss:     0.2705 Validation Accuracy: 0.660000
Epoch 113, CIFAR-10 Batch 3:  Loss:     0.2721 Validation Accuracy: 0.662000
Epoch 113, CIFAR-10 Batch 4:  Loss:     0.2694 Validation Accuracy: 0.668000
Epoch 113, CIFAR-10 Batch 5:  Loss:     0.2667 Validation Accuracy: 0.663800
Epoch 114, CIFAR-10 Batch 1:  Loss:     0.3342 Validation Accuracy: 0.664400
Epoch 114, CIFAR-10 Batch 2:  Loss:     0.2935 Validation Accuracy: 0.653800
Epoch 114, CIFAR-10 Batch 3:  Loss:     0.2751 Validation Accuracy: 0.659800
Epoch 114, CIFAR-10 Batch 4:  Loss:     0.2595 Validation Accuracy: 0.662000
Epoch 114, CIFAR-10 Batch 5:  Loss:     0.2839 Validation Accuracy: 0.662800
Epoch 115, CIFAR-10 Batch 1:  Loss:     0.3447 Validation Accuracy: 0.668600
Epoch 115, CIFAR-10 Batch 2:  Loss:     0.3080 Validation Accuracy: 0.661000
Epoch 115, CIFAR-10 Batch 3:  Loss:     0.2664 Validation Accuracy: 0.662800
Epoch 115, CIFAR-10 Batch 4:  Loss:     0.2534 Validation Accuracy: 0.658200
Epoch 115, CIFAR-10 Batch 5:  Loss:     0.2709 Validation Accuracy: 0.667600
Epoch 116, CIFAR-10 Batch 1:  Loss:     0.3426 Validation Accuracy: 0.656200
Epoch 116, CIFAR-10 Batch 2:  Loss:     0.2957 Validation Accuracy: 0.657000
Epoch 116, CIFAR-10 Batch 3:  Loss:     0.2692 Validation Accuracy: 0.662000
Epoch 116, CIFAR-10 Batch 4:  Loss:     0.2586 Validation Accuracy: 0.656200
Epoch 116, CIFAR-10 Batch 5:  Loss:     0.2701 Validation Accuracy: 0.658600
Epoch 117, CIFAR-10 Batch 1:  Loss:     0.3506 Validation Accuracy: 0.657600
Epoch 117, CIFAR-10 Batch 2:  Loss:     0.2794 Validation Accuracy: 0.656800
Epoch 117, CIFAR-10 Batch 3:  Loss:     0.2702 Validation Accuracy: 0.658200
Epoch 117, CIFAR-10 Batch 4:  Loss:     0.2304 Validation Accuracy: 0.651800
Epoch 117, CIFAR-10 Batch 5:  Loss:     0.2873 Validation Accuracy: 0.667400
Epoch 118, CIFAR-10 Batch 1:  Loss:     0.3397 Validation Accuracy: 0.662600
Epoch 118, CIFAR-10 Batch 2:  Loss:     0.2761 Validation Accuracy: 0.657600
Epoch 118, CIFAR-10 Batch 3:  Loss:     0.2738 Validation Accuracy: 0.657000
Epoch 118, CIFAR-10 Batch 4:  Loss:     0.2577 Validation Accuracy: 0.660000
Epoch 118, CIFAR-10 Batch 5:  Loss:     0.2568 Validation Accuracy: 0.663400
Epoch 119, CIFAR-10 Batch 1:  Loss:     0.3292 Validation Accuracy: 0.669400
Epoch 119, CIFAR-10 Batch 2:  Loss:     0.2696 Validation Accuracy: 0.663000
Epoch 119, CIFAR-10 Batch 3:  Loss:     0.2700 Validation Accuracy: 0.662200
Epoch 119, CIFAR-10 Batch 4:  Loss:     0.2395 Validation Accuracy: 0.658600
Epoch 119, CIFAR-10 Batch 5:  Loss:     0.2810 Validation Accuracy: 0.659000
Epoch 120, CIFAR-10 Batch 1:  Loss:     0.3293 Validation Accuracy: 0.668600
Epoch 120, CIFAR-10 Batch 2:  Loss:     0.2576 Validation Accuracy: 0.655600
Epoch 120, CIFAR-10 Batch 3:  Loss:     0.2521 Validation Accuracy: 0.656000
Epoch 120, CIFAR-10 Batch 4:  Loss:     0.2601 Validation Accuracy: 0.661600
Epoch 120, CIFAR-10 Batch 5:  Loss:     0.2727 Validation Accuracy: 0.665400
Epoch 121, CIFAR-10 Batch 1:  Loss:     0.3445 Validation Accuracy: 0.663600
Epoch 121, CIFAR-10 Batch 2:  Loss:     0.2737 Validation Accuracy: 0.661000
Epoch 121, CIFAR-10 Batch 3:  Loss:     0.2755 Validation Accuracy: 0.657600
Epoch 121, CIFAR-10 Batch 4:  Loss:     0.2624 Validation Accuracy: 0.663000
Epoch 121, CIFAR-10 Batch 5:  Loss:     0.2475 Validation Accuracy: 0.662200
Epoch 122, CIFAR-10 Batch 1:  Loss:     0.3513 Validation Accuracy: 0.669600
Epoch 122, CIFAR-10 Batch 2:  Loss:     0.2767 Validation Accuracy: 0.662400
Epoch 122, CIFAR-10 Batch 3:  Loss:     0.2659 Validation Accuracy: 0.661200
Epoch 122, CIFAR-10 Batch 4:  Loss:     0.2597 Validation Accuracy: 0.662400
Epoch 122, CIFAR-10 Batch 5:  Loss:     0.2696 Validation Accuracy: 0.653800
Epoch 123, CIFAR-10 Batch 1:  Loss:     0.3426 Validation Accuracy: 0.668600
Epoch 123, CIFAR-10 Batch 2:  Loss:     0.2707 Validation Accuracy: 0.666200
Epoch 123, CIFAR-10 Batch 3:  Loss:     0.2646 Validation Accuracy: 0.662800
Epoch 123, CIFAR-10 Batch 4:  Loss:     0.2471 Validation Accuracy: 0.658200
Epoch 123, CIFAR-10 Batch 5:  Loss:     0.2675 Validation Accuracy: 0.670400
Epoch 124, CIFAR-10 Batch 1:  Loss:     0.3360 Validation Accuracy: 0.666600
Epoch 124, CIFAR-10 Batch 2:  Loss:     0.2622 Validation Accuracy: 0.658600
Epoch 124, CIFAR-10 Batch 3:  Loss:     0.2591 Validation Accuracy: 0.661400
Epoch 124, CIFAR-10 Batch 4:  Loss:     0.2523 Validation Accuracy: 0.662200
Epoch 124, CIFAR-10 Batch 5:  Loss:     0.2677 Validation Accuracy: 0.658200
Epoch 125, CIFAR-10 Batch 1:  Loss:     0.3212 Validation Accuracy: 0.666200
Epoch 125, CIFAR-10 Batch 2:  Loss:     0.2896 Validation Accuracy: 0.667200
Epoch 125, CIFAR-10 Batch 3:  Loss:     0.2654 Validation Accuracy: 0.659600
Epoch 125, CIFAR-10 Batch 4:  Loss:     0.2408 Validation Accuracy: 0.659200
Epoch 125, CIFAR-10 Batch 5:  Loss:     0.2446 Validation Accuracy: 0.660800
Epoch 126, CIFAR-10 Batch 1:  Loss:     0.3193 Validation Accuracy: 0.659000
Epoch 126, CIFAR-10 Batch 2:  Loss:     0.2760 Validation Accuracy: 0.656400
Epoch 126, CIFAR-10 Batch 3:  Loss:     0.2521 Validation Accuracy: 0.656000
Epoch 126, CIFAR-10 Batch 4:  Loss:     0.2360 Validation Accuracy: 0.663200
Epoch 126, CIFAR-10 Batch 5:  Loss:     0.2740 Validation Accuracy: 0.658200
Epoch 127, CIFAR-10 Batch 1:  Loss:     0.3168 Validation Accuracy: 0.671200
Epoch 127, CIFAR-10 Batch 2:  Loss:     0.2588 Validation Accuracy: 0.663400
Epoch 127, CIFAR-10 Batch 3:  Loss:     0.2513 Validation Accuracy: 0.659200
Epoch 127, CIFAR-10 Batch 4:  Loss:     0.2384 Validation Accuracy: 0.652800
Epoch 127, CIFAR-10 Batch 5:  Loss:     0.2598 Validation Accuracy: 0.661400
Epoch 128, CIFAR-10 Batch 1:  Loss:     0.3207 Validation Accuracy: 0.663400
Epoch 128, CIFAR-10 Batch 2:  Loss:     0.2555 Validation Accuracy: 0.665400
Epoch 128, CIFAR-10 Batch 3:  Loss:     0.2568 Validation Accuracy: 0.655600
Epoch 128, CIFAR-10 Batch 4:  Loss:     0.2384 Validation Accuracy: 0.659200
Epoch 128, CIFAR-10 Batch 5:  Loss:     0.2368 Validation Accuracy: 0.659000

Checkpoint

The model has been saved to disk.

Test Model

Test your model against the test dataset. This is your final accuracy, and it should be greater than 50%. If it isn't, keep tweaking the model architecture and parameters.


In [29]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
%matplotlib inline
%config InlineBackend.figure_format = 'retina'

import tensorflow as tf
import pickle
import helper
import random

# Set batch size if not already set
try:
    if batch_size:
        pass
except NameError:
    batch_size = 64

save_model_path = './image_classification'
n_samples = 4
top_n_predictions = 3

def test_model():
    """
    Test the saved model against the test dataset
    """

    test_features, test_labels = pickle.load(open('preprocess_test.p', mode='rb'))
    loaded_graph = tf.Graph()

    with tf.Session(graph=loaded_graph) as sess:
        # Load model
        loader = tf.train.import_meta_graph(save_model_path + '.meta')
        loader.restore(sess, save_model_path)

        # Get Tensors from loaded model
        loaded_x = loaded_graph.get_tensor_by_name('x:0')
        loaded_y = loaded_graph.get_tensor_by_name('y:0')
        loaded_keep_prob = loaded_graph.get_tensor_by_name('keep_prob:0')
        loaded_logits = loaded_graph.get_tensor_by_name('logits:0')
        loaded_acc = loaded_graph.get_tensor_by_name('accuracy:0')
        
        # Get accuracy in batches for memory limitations
        test_batch_acc_total = 0
        test_batch_count = 0
        
        for test_feature_batch, test_label_batch in helper.batch_features_labels(test_features, test_labels, batch_size):
            test_batch_acc_total += sess.run(
                loaded_acc,
                feed_dict={loaded_x: test_feature_batch, loaded_y: test_label_batch, loaded_keep_prob: 1.0})
            test_batch_count += 1

        print('Testing Accuracy: {}\n'.format(test_batch_acc_total/test_batch_count))

        # Print Random Samples
        random_test_features, random_test_labels = tuple(zip(*random.sample(list(zip(test_features, test_labels)), n_samples)))
        random_test_predictions = sess.run(
            tf.nn.top_k(tf.nn.softmax(loaded_logits), top_n_predictions),
            feed_dict={loaded_x: random_test_features, loaded_y: random_test_labels, loaded_keep_prob: 1.0})
        helper.display_image_predictions(random_test_features, random_test_labels, random_test_predictions)


test_model()


INFO:tensorflow:Restoring parameters from ./image_classification
Testing Accuracy: 0.6513671875
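The cell above computes accuracy in batches via `helper.batch_features_labels`, which simply slices the features and labels into fixed-size chunks. A minimal sketch of that generator (the real helper may differ in details) looks like:

```python
def batch_features_labels(features, labels, batch_size):
    """Yield (features, labels) chunks of at most batch_size items each."""
    for start in range(0, len(features), batch_size):
        end = start + batch_size
        yield features[start:end], labels[start:end]

# Example: 10 samples with batch size 4 produce batches of 4, 4, and 2
feats = list(range(10))
labs = list(range(10))
sizes = [len(f) for f, _ in batch_features_labels(feats, labs, 4)]
print(sizes)  # [4, 4, 2]
```

Note that because the last batch can be smaller, averaging per-batch accuracies (as the test cell does) gives the final batch slightly more weight per sample; with a batch size that divides the test set evenly, the average is exact.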

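The sample-display step asks `tf.nn.top_k` for the three highest-probability classes per image. The same idea in plain NumPy (a sketch for illustration, not the TensorFlow implementation) is:

```python
import numpy as np

def top_k(probs, k):
    """Return the k largest values and their indices per row, high to low."""
    idx = np.argsort(probs, axis=1)[:, ::-1][:, :k]
    vals = np.take_along_axis(probs, idx, axis=1)
    return vals, idx

probs = np.array([[0.1, 0.6, 0.3],
                  [0.5, 0.2, 0.3]])
vals, idx = top_k(probs, 2)
print(idx)  # [[1 2]
            #  [0 2]]
```

`helper.display_image_predictions` then plots these top-k class names and probabilities next to each sample image.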
Why 50-80% Accuracy?

You might be wondering why you can't push the accuracy much higher. First, 50% isn't bad for a simple CNN: with ten classes, pure guessing would get you only 10%. That said, you may have noticed people scoring well above 80%. That's because we haven't yet covered everything there is to know about neural networks; a few more techniques are still to come.

Submitting This Project

When submitting this project, make sure to run all the cells before saving the notebook. Save the notebook file as "dlnd_image_classification.ipynb", and also save it as an HTML file via "File" -> "Download as". Include the "helper.py" and "problem_unittests.py" files in your submission.


In [ ]: