Image Classification

In this project, you'll classify images from the CIFAR-10 dataset. The dataset consists of airplanes, dogs, cats, and other objects. You'll preprocess the images, then train a convolutional neural network on all the samples. The images need to be normalized and the labels need to be one-hot encoded. You'll get to apply what you learned and build a convolutional, max pooling, dropout, and fully connected layers. At the end, you'll get to see your neural network's predictions on the sample images.

Get the Data

Run the following cell to download the CIFAR-10 dataset for python.


In [1]:
"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
from urllib.request import urlretrieve
from os.path import isfile, isdir
from tqdm import tqdm
import problem_unittests as tests
import tarfile

cifar10_dataset_folder_path = 'cifar-10-batches-py'

class DLProgress(tqdm):
    last_block = 0

    def hook(self, block_num=1, block_size=1, total_size=None):
        self.total = total_size
        self.update((block_num - self.last_block) * block_size)
        self.last_block = block_num

if not isfile('cifar-10-python.tar.gz'):
    with DLProgress(unit='B', unit_scale=True, miniters=1, desc='CIFAR-10 Dataset') as pbar:
        urlretrieve(
            'https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz',
            'cifar-10-python.tar.gz',
            pbar.hook)

if not isdir(cifar10_dataset_folder_path):
    with tarfile.open('cifar-10-python.tar.gz') as tar:
        tar.extractall()
        tar.close()


tests.test_folder_path(cifar10_dataset_folder_path)


CIFAR-10 Dataset: 171MB [09:38, 295KB/s]                               
All files found!

Explore the Data

The dataset is broken into batches to prevent your machine from running out of memory. The CIFAR-10 dataset consists of 5 batches, named data_batch_1, data_batch_2, etc.. Each batch contains the labels and images that are one of the following:

  • airplane
  • automobile
  • bird
  • cat
  • deer
  • dog
  • frog
  • horse
  • ship
  • truck

Understanding a dataset is part of making predictions on the data. Play around with the code cell below by changing the batch_id and sample_id. The batch_id is the id for a batch (1-5). The sample_id is the id for a image and label pair in the batch.

Ask yourself "What are all possible labels?", "What is the range of values for the image data?", "Are the labels in order or random?". Answers to questions like these will help you preprocess the data and end up with better predictions.


In [2]:
%matplotlib inline
%config InlineBackend.figure_format = 'retina'

import helper
import numpy as np

# Explore the dataset
batch_id = 1
sample_id = 5
helper.display_stats(cifar10_dataset_folder_path, batch_id, sample_id)


Stats of batch 1:
Samples: 10000
Label Counts: {0: 1005, 1: 974, 2: 1032, 3: 1016, 4: 999, 5: 937, 6: 1030, 7: 1001, 8: 1025, 9: 981}
First 20 Labels: [6, 9, 9, 4, 1, 1, 2, 7, 8, 3, 4, 7, 7, 2, 9, 9, 9, 3, 2, 6]

Example of Image 5:
Image - Min Value: 0 Max Value: 252
Image - Shape: (32, 32, 3)
Label - Label Id: 1 Name: automobile

Implement Preprocess Functions

Normalize

In the cell below, implement the normalize function to take in image data, x, and return it as a normalized Numpy array. The values should be in the range of 0 to 1, inclusive. The return object should be the same shape as x.


In [3]:
def normalize(x):
    """
    Normalize a list of sample image data in the range of 0 to 1
    : x: List of image data.  The image shape is (32, 32, 3)
    : return: Numpy array of normalize data
    """
    # TODO: Implement Function
    return x / 255


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_normalize(normalize)


Tests Passed

One-hot encode

Just like the previous code cell, you'll be implementing a function for preprocessing. This time, you'll implement the one_hot_encode function. The input, x, are a list of labels. Implement the function to return the list of labels as One-Hot encoded Numpy array. The possible values for labels are 0 to 9. The one-hot encoding function should return the same encoding for each value between each call to one_hot_encode. Make sure to save the map of encodings outside the function.

Hint: Don't reinvent the wheel.


In [4]:
def one_hot_encode(x):
    """
    One hot encode a list of sample labels. Return a one-hot encoded vector for each label.
    : x: List of sample Labels
    : return: Numpy array of one-hot encoded labels
    """
    # TODO: Implement Function
    one_hot = np.zeros((len(x), 10))
    for i in range(len(x)):
        one_hot[i][x[i]] = 1
    return one_hot


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_one_hot_encode(one_hot_encode)


Tests Passed

Randomize Data

As you saw from exploring the data above, the order of the samples are randomized. It doesn't hurt to randomize it again, but you don't need to for this dataset.

Preprocess all the data and save it

Running the code cell below will preprocess all the CIFAR-10 data and save it to file. The code below also uses 10% of the training data for validation.


In [5]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
# Preprocess Training, Validation, and Testing Data
helper.preprocess_and_save_data(cifar10_dataset_folder_path, normalize, one_hot_encode)

Check Point

This is your first checkpoint. If you ever decide to come back to this notebook or have to restart the notebook, you can start from here. The preprocessed data has been saved to disk.


In [6]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import pickle
import problem_unittests as tests
import helper

# Load the Preprocessed Validation data
valid_features, valid_labels = pickle.load(open('preprocess_validation.p', mode='rb'))

Build the network

For the neural network, you'll build each layer into a function. Most of the code you've seen has been outside of functions. To test your code more thoroughly, we require that you put each layer in a function. This allows us to give you better feedback and test for simple mistakes using our unittests before you submit your project.

If you're finding it hard to dedicate enough time for this course a week, we've provided a small shortcut to this part of the project. In the next couple of problems, you'll have the option to use TensorFlow Layers or TensorFlow Layers (contrib) to build each layer, except "Convolutional & Max Pooling" layer. TF Layers is similar to Keras's and TFLearn's abstraction to layers, so it's easy to pickup.

If you would like to get the most of this course, try to solve all the problems without TF Layers. Let's begin!

Input

The neural network needs to read the image data, one-hot encoded labels, and dropout keep probability. Implement the following functions

  • Implement neural_net_image_input
    • Return a TF Placeholder
    • Set the shape using image_shape with batch size set to None.
    • Name the TensorFlow placeholder "x" using the TensorFlow name parameter in the TF Placeholder.
  • Implement neural_net_label_input
    • Return a TF Placeholder
    • Set the shape using n_classes with batch size set to None.
    • Name the TensorFlow placeholder "y" using the TensorFlow name parameter in the TF Placeholder.
  • Implement neural_net_keep_prob_input
    • Return a TF Placeholder for dropout keep probability.
    • Name the TensorFlow placeholder "keep_prob" using the TensorFlow name parameter in the TF Placeholder.

These names will be used at the end of the project to load your saved model.

Note: None for shapes in TensorFlow allow for a dynamic size.


In [7]:
import tensorflow as tf

def neural_net_image_input(image_shape):
    """
    Return a Tensor for a batch of image input
    : image_shape: Shape of the images
    : return: Tensor for image input.
    """
    # TODO: Implement Function
    return tf.placeholder(tf.float32,
                          shape = [None, image_shape[0], image_shape[1], image_shape[2]],
                          name = 'x')


def neural_net_label_input(n_classes):
    """
    Return a Tensor for a batch of label input
    : n_classes: Number of classes
    : return: Tensor for label input.
    """
    # TODO: Implement Function
    return tf.placeholder(tf.float32,
                          shape = [None, n_classes],
                          name = 'y')


def neural_net_keep_prob_input():
    """
    Return a Tensor for keep probability
    : return: Tensor for keep probability.
    """
    # TODO: Implement Function
    return tf.placeholder(tf.float32, name = 'keep_prob')


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tf.reset_default_graph()
tests.test_nn_image_inputs(neural_net_image_input)
tests.test_nn_label_inputs(neural_net_label_input)
tests.test_nn_keep_prob_inputs(neural_net_keep_prob_input)


Image Input Tests Passed.
Label Input Tests Passed.
Keep Prob Tests Passed.

Convolution and Max Pooling Layer

Convolution layers have a lot of success with images. For this code cell, you should implement the function conv2d_maxpool to apply convolution then max pooling:

  • Create the weight and bias using conv_ksize, conv_num_outputs and the shape of x_tensor.
  • Apply a convolution to x_tensor using weight and conv_strides.
    • We recommend you use same padding, but you're welcome to use any padding.
  • Add bias
  • Add a nonlinear activation to the convolution.
  • Apply Max Pooling using pool_ksize and pool_strides.
    • We recommend you use same padding, but you're welcome to use any padding.

Note: You can't use TensorFlow Layers or TensorFlow Layers (contrib) for this layer. You're free to use any TensorFlow package for all the other layers.


In [8]:
def conv2d_maxpool(x_tensor, conv_num_outputs, conv_ksize, conv_strides, pool_ksize, pool_strides):
    """
    Apply convolution then max pooling to x_tensor
    :param x_tensor: TensorFlow Tensor
    :param conv_num_outputs: Number of outputs for the convolutional layer
    :param conv_strides: Stride 2-D Tuple for convolution
    :param pool_ksize: kernal size 2-D Tuple for pool
    :param pool_strides: Stride 2-D Tuple for pool
    : return: A tensor that represents convolution and max pooling of x_tensor
    """
    # TODO: Implement Function
    
    # Input/Image
    input = x_tensor
    
    # Weight and bias
    weight = tf.Variable(tf.truncated_normal(
        [conv_ksize[0], conv_ksize[1], x_tensor.get_shape().as_list()[-1], conv_num_outputs], stddev=0.1))
    bias = tf.Variable(tf.zeros(conv_num_outputs))
    
    # Apply Convolution
    conv_layer = tf.nn.conv2d(input, weight, strides=[1, conv_strides[0], conv_strides[1], 1], padding='SAME')
    # Add bias
    conv_layer = tf.nn.bias_add(conv_layer, bias)
    # Apply activation function
    conv_layer = tf.nn.relu(conv_layer)
    
    # Apply maxpooling
    conv_layer = tf.nn.max_pool(
        conv_layer,
        ksize=[1, pool_ksize[0], pool_ksize[1], 1],
        strides=[1, pool_strides[0], pool_strides[1], 1],
        padding='SAME')
    return conv_layer 


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_con_pool(conv2d_maxpool)


Tests Passed

Flatten Layer

Implement the flatten function to change the dimension of x_tensor from a 4-D tensor to a 2-D tensor. The output should be the shape (Batch Size, Flattened Image Size). You can use TensorFlow Layers or TensorFlow Layers (contrib) for this layer.


In [9]:
def flatten(x_tensor):
    """
    Flatten x_tensor to (Batch Size, Flattened Image Size)
    : x_tensor: A tensor of size (Batch Size, ...), where ... are the image dimensions.
    : return: A tensor of size (Batch Size, Flattened Image Size).
    """
    # TODO: Implement Function
    batch_size = x_tensor.get_shape().as_list()[0]# i tried as_list()[] 
    width = x_tensor.get_shape().as_list()[1]
    height = x_tensor.get_shape().as_list()[2]
    depth = x_tensor.get_shape().as_list()[3]

    image_flat_size = width * height * depth

    return tf.contrib.layers.flatten(x_tensor, [batch_size, image_flat_size])


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_flatten(flatten)


Tests Passed

Fully-Connected Layer

Implement the fully_conn function to apply a fully connected layer to x_tensor with the shape (Batch Size, num_outputs). You can use TensorFlow Layers or TensorFlow Layers (contrib) for this layer.


In [10]:
def fully_conn(x_tensor, num_outputs):
    """
    Apply a fully connected layer to x_tensor using weight and bias
    : x_tensor: A 2-D tensor where the first dimension is batch size.
    : num_outputs: The number of output that the new tensor should be.
    : return: A 2-D tensor where the second dimension is num_outputs.
    """
    # TODO: Implement Function
    return tf.contrib.layers.fully_connected(x_tensor, num_outputs)


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_fully_conn(fully_conn)


Tests Passed

Output Layer

Implement the output function to apply a fully connected layer to x_tensor with the shape (Batch Size, num_outputs). You can use TensorFlow Layers or TensorFlow Layers (contrib) for this layer.

Note: Activation, softmax, or cross entropy shouldn't be applied to this.


In [11]:
def output(x_tensor, num_outputs):
    """
    Apply a output layer to x_tensor using weight and bias
    : x_tensor: A 2-D tensor where the first dimension is batch size.
    : num_outputs: The number of output that the new tensor should be.
    : return: A 2-D tensor where the second dimension is num_outputs.
    """
    # TODO: Implement Function
    weights = tf.Variable(tf.truncated_normal([x_tensor.get_shape().as_list()[1], num_outputs])) 
    mul = tf.matmul(x_tensor, weights, name='mul')
    bias = tf.Variable(tf.zeros(num_outputs))
    
    return tf.add(mul, bias)

#     y = tf.add(mul, bias)
#     fc = tf.contrib.layers.fully_connected(y, num_outputs)
#     return fc

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_output(output)


Tests Passed

Create Convolutional Model

Implement the function conv_net to create a convolutional neural network model. The function takes in a batch of images, x, and outputs logits. Use the layers you created above to create this model:

  • Apply 1, 2, or 3 Convolution and Max Pool layers
  • Apply a Flatten Layer
  • Apply 1, 2, or 3 Fully Connected Layers
  • Apply an Output Layer
  • Return the output
  • Apply TensorFlow's Dropout to one or more layers in the model using keep_prob.

In [12]:
def conv_net(x, keep_prob):
    """
    Create a convolutional neural network model
    : x: Placeholder tensor that holds image data.
    : keep_prob: Placeholder tensor that hold dropout keep probability.
    : return: Tensor that represents logits
    """
    # TODO: Apply 1, 2, or 3 Convolution and Max Pool layers
    #    Play around with different number of outputs, kernel size and stride
    # Function Definition from Above:
    #    conv2d_maxpool(x_tensor, conv_num_outputs, conv_ksize, conv_strides, pool_ksize, pool_strides)
    conv_num_outputs = 10
    conv_ksize = [2, 2]
    conv_strides = [2, 2]
    pool_ksize = [2, 2]
    pool_strides = [2, 2]
    conv_layer = conv2d_maxpool(x, conv_num_outputs, conv_ksize, conv_strides, pool_ksize, pool_strides)

    # TODO: Apply a Flatten Layer
    # Function Definition from Above:
    #   flatten(x_tensor)
    conv_layer = flatten(conv_layer)
    

    # TODO: Apply 1, 2, or 3 Fully Connected Layers
    #    Play around with different number of outputs
    # Function Definition from Above:
    #   fully_conn(x_tensor, num_outputs)
    num_outputs = 10
    conv_layer = fully_conn(conv_layer, num_outputs)
    
    # TODO: Apply an Output Layer
    #    Set this to the number of classes
    # Function Definition from Above:
    #   output(x_tensor, num_outputs)
    num_classes = 10
    conv_layer = output(conv_layer, num_classes)
    
    # TODO: return output
    return conv_layer


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""

##############################
## Build the Neural Network ##
##############################

# Remove previous weights, bias, inputs, etc..
tf.reset_default_graph()

# Inputs
x = neural_net_image_input((32, 32, 3))
y = neural_net_label_input(10)
keep_prob = neural_net_keep_prob_input()

# Model
logits = conv_net(x, keep_prob)

# Name logits Tensor, so that is can be loaded from disk after training
logits = tf.identity(logits, name='logits')

# Loss and Optimizer
cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=y))
optimizer = tf.train.AdamOptimizer().minimize(cost)

# Accuracy
correct_pred = tf.equal(tf.argmax(logits, 1), tf.argmax(y, 1))
accuracy = tf.reduce_mean(tf.cast(correct_pred, tf.float32), name='accuracy')

tests.test_conv_net(conv_net)


Neural Network Built!

Train the Neural Network

Single Optimization

Implement the function train_neural_network to do a single optimization. The optimization should use optimizer to optimize in session with a feed_dict of the following:

  • x for image input
  • y for labels
  • keep_prob for keep probability for dropout

This function will be called for each batch, so tf.global_variables_initializer() has already been called.

Note: Nothing needs to be returned. This function is only optimizing the neural network.


In [13]:
def train_neural_network(session, optimizer, keep_probability, feature_batch, label_batch):
    """
    Optimize the session on a batch of images and labels
    : session: Current TensorFlow session
    : optimizer: TensorFlow optimizer function
    : keep_probability: keep probability
    : feature_batch: Batch of Numpy image data
    : label_batch: Batch of Numpy label data
    """
    # TODO: Implement Function
    session.run(optimizer, feed_dict={x: feature_batch, y: label_batch, keep_prob: keep_probability})


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_train_nn(train_neural_network)


Tests Passed

Show Stats

Implement the function print_stats to print loss and validation accuracy. Use the global variables valid_features and valid_labels to calculate validation accuracy. Use a keep probability of 1.0 to calculate the loss and validation accuracy.


In [14]:
def print_stats(session, feature_batch, label_batch, cost, accuracy):
    """
    Print information about loss and validation accuracy
    : session: Current TensorFlow session
    : feature_batch: Batch of Numpy image data
    : label_batch: Batch of Numpy label data
    : cost: TensorFlow cost function
    : accuracy: TensorFlow accuracy function
    """
    # TODO: Implement Function
    loss = session.run(cost, feed_dict={x: feature_batch, y: label_batch, keep_prob: 1.0})
    valid_acc = session.run(accuracy, feed_dict={x: valid_features, y: valid_labels, keep_prob: 1.0})
    print('Loss: {:>10.4f} Accuracy: {:.6f}'.format(loss, valid_acc))

Hyperparameters

Tune the following parameters:

  • Set epochs to the number of iterations until the network stops learning or start overfitting
  • Set batch_size to the highest number that your machine has memory for. Most people set them to common sizes of memory:
    • 64
    • 128
    • 256
    • ...
  • Set keep_probability to the probability of keeping a node using dropout

In [23]:
# TODO: Tune Parameters
epochs = 100
batch_size = 256
keep_probability = 0.2

Train on a Single CIFAR-10 Batch

Instead of training the neural network on all the CIFAR-10 batches of data, let's use a single batch. This should save time while you iterate on the model to get a better accuracy. Once the final validation accuracy is 50% or greater, run the model on all the data in the next section.


In [24]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
print('Checking the Training on a Single Batch...')
with tf.Session() as sess:
    # Initializing the variables
    sess.run(tf.global_variables_initializer())
    
    # Training cycle
    for epoch in range(epochs):
        batch_i = 1
        for batch_features, batch_labels in helper.load_preprocess_training_batch(batch_i, batch_size):
            train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)
        print('Epoch {:>2}, CIFAR-10 Batch {}:  '.format(epoch + 1, batch_i), end='')
        print_stats(sess, batch_features, batch_labels, cost, accuracy)


Checking the Training on a Single Batch...
Epoch  1, CIFAR-10 Batch 1:  Loss:     2.2889 Accuracy: 0.176600
Epoch  2, CIFAR-10 Batch 1:  Loss:     2.2921 Accuracy: 0.207600
Epoch  3, CIFAR-10 Batch 1:  Loss:     2.2796 Accuracy: 0.215200
Epoch  4, CIFAR-10 Batch 1:  Loss:     2.2641 Accuracy: 0.217800
Epoch  5, CIFAR-10 Batch 1:  Loss:     2.2354 Accuracy: 0.225000
Epoch  6, CIFAR-10 Batch 1:  Loss:     2.2033 Accuracy: 0.236400
Epoch  7, CIFAR-10 Batch 1:  Loss:     2.1698 Accuracy: 0.249200
Epoch  8, CIFAR-10 Batch 1:  Loss:     2.1310 Accuracy: 0.268400
Epoch  9, CIFAR-10 Batch 1:  Loss:     2.0925 Accuracy: 0.293600
Epoch 10, CIFAR-10 Batch 1:  Loss:     2.0588 Accuracy: 0.308400
Epoch 11, CIFAR-10 Batch 1:  Loss:     2.0222 Accuracy: 0.330400
Epoch 12, CIFAR-10 Batch 1:  Loss:     1.9836 Accuracy: 0.339400
Epoch 13, CIFAR-10 Batch 1:  Loss:     1.9462 Accuracy: 0.350200
Epoch 14, CIFAR-10 Batch 1:  Loss:     1.9072 Accuracy: 0.361600
Epoch 15, CIFAR-10 Batch 1:  Loss:     1.8769 Accuracy: 0.366200
Epoch 16, CIFAR-10 Batch 1:  Loss:     1.8475 Accuracy: 0.368800
Epoch 17, CIFAR-10 Batch 1:  Loss:     1.8163 Accuracy: 0.375400
Epoch 18, CIFAR-10 Batch 1:  Loss:     1.7928 Accuracy: 0.378400
Epoch 19, CIFAR-10 Batch 1:  Loss:     1.7659 Accuracy: 0.381400
Epoch 20, CIFAR-10 Batch 1:  Loss:     1.7407 Accuracy: 0.383200
Epoch 21, CIFAR-10 Batch 1:  Loss:     1.7198 Accuracy: 0.385000
Epoch 22, CIFAR-10 Batch 1:  Loss:     1.6975 Accuracy: 0.386000
Epoch 23, CIFAR-10 Batch 1:  Loss:     1.6767 Accuracy: 0.387400
Epoch 24, CIFAR-10 Batch 1:  Loss:     1.6559 Accuracy: 0.387800
Epoch 25, CIFAR-10 Batch 1:  Loss:     1.6376 Accuracy: 0.388600
Epoch 26, CIFAR-10 Batch 1:  Loss:     1.6202 Accuracy: 0.390600
Epoch 27, CIFAR-10 Batch 1:  Loss:     1.6020 Accuracy: 0.392200
Epoch 28, CIFAR-10 Batch 1:  Loss:     1.5810 Accuracy: 0.395000
Epoch 29, CIFAR-10 Batch 1:  Loss:     1.5646 Accuracy: 0.392800
Epoch 30, CIFAR-10 Batch 1:  Loss:     1.5471 Accuracy: 0.394400
Epoch 31, CIFAR-10 Batch 1:  Loss:     1.5312 Accuracy: 0.397600
Epoch 32, CIFAR-10 Batch 1:  Loss:     1.5164 Accuracy: 0.399400
Epoch 33, CIFAR-10 Batch 1:  Loss:     1.5018 Accuracy: 0.399800
Epoch 34, CIFAR-10 Batch 1:  Loss:     1.4879 Accuracy: 0.400200
Epoch 35, CIFAR-10 Batch 1:  Loss:     1.4758 Accuracy: 0.400800
Epoch 36, CIFAR-10 Batch 1:  Loss:     1.4633 Accuracy: 0.400400
Epoch 37, CIFAR-10 Batch 1:  Loss:     1.4516 Accuracy: 0.401400
Epoch 38, CIFAR-10 Batch 1:  Loss:     1.4396 Accuracy: 0.402600
Epoch 39, CIFAR-10 Batch 1:  Loss:     1.4282 Accuracy: 0.404000
Epoch 40, CIFAR-10 Batch 1:  Loss:     1.4169 Accuracy: 0.404800
Epoch 41, CIFAR-10 Batch 1:  Loss:     1.4069 Accuracy: 0.406800
Epoch 42, CIFAR-10 Batch 1:  Loss:     1.3957 Accuracy: 0.407400
Epoch 43, CIFAR-10 Batch 1:  Loss:     1.3857 Accuracy: 0.408400
Epoch 44, CIFAR-10 Batch 1:  Loss:     1.3758 Accuracy: 0.408000
Epoch 45, CIFAR-10 Batch 1:  Loss:     1.3662 Accuracy: 0.409600
Epoch 46, CIFAR-10 Batch 1:  Loss:     1.3563 Accuracy: 0.409800
Epoch 47, CIFAR-10 Batch 1:  Loss:     1.3475 Accuracy: 0.410400
Epoch 48, CIFAR-10 Batch 1:  Loss:     1.3398 Accuracy: 0.410000
Epoch 49, CIFAR-10 Batch 1:  Loss:     1.3302 Accuracy: 0.410600
Epoch 50, CIFAR-10 Batch 1:  Loss:     1.3215 Accuracy: 0.410600
Epoch 51, CIFAR-10 Batch 1:  Loss:     1.3130 Accuracy: 0.411600
Epoch 52, CIFAR-10 Batch 1:  Loss:     1.3049 Accuracy: 0.413000
Epoch 53, CIFAR-10 Batch 1:  Loss:     1.2959 Accuracy: 0.413000
Epoch 54, CIFAR-10 Batch 1:  Loss:     1.2869 Accuracy: 0.412800
Epoch 55, CIFAR-10 Batch 1:  Loss:     1.2789 Accuracy: 0.412200
Epoch 56, CIFAR-10 Batch 1:  Loss:     1.2711 Accuracy: 0.413400
Epoch 57, CIFAR-10 Batch 1:  Loss:     1.2638 Accuracy: 0.413200
Epoch 58, CIFAR-10 Batch 1:  Loss:     1.2564 Accuracy: 0.413400
Epoch 59, CIFAR-10 Batch 1:  Loss:     1.2490 Accuracy: 0.413400
Epoch 60, CIFAR-10 Batch 1:  Loss:     1.2419 Accuracy: 0.415000
Epoch 61, CIFAR-10 Batch 1:  Loss:     1.2350 Accuracy: 0.414800
Epoch 62, CIFAR-10 Batch 1:  Loss:     1.2286 Accuracy: 0.415400
Epoch 63, CIFAR-10 Batch 1:  Loss:     1.2220 Accuracy: 0.417000
Epoch 64, CIFAR-10 Batch 1:  Loss:     1.2149 Accuracy: 0.418200
Epoch 65, CIFAR-10 Batch 1:  Loss:     1.2085 Accuracy: 0.418600
Epoch 66, CIFAR-10 Batch 1:  Loss:     1.2020 Accuracy: 0.420400
Epoch 67, CIFAR-10 Batch 1:  Loss:     1.1955 Accuracy: 0.421600
Epoch 68, CIFAR-10 Batch 1:  Loss:     1.1901 Accuracy: 0.422400
Epoch 69, CIFAR-10 Batch 1:  Loss:     1.1848 Accuracy: 0.423200
Epoch 70, CIFAR-10 Batch 1:  Loss:     1.1795 Accuracy: 0.423800
Epoch 71, CIFAR-10 Batch 1:  Loss:     1.1746 Accuracy: 0.424400
Epoch 72, CIFAR-10 Batch 1:  Loss:     1.1698 Accuracy: 0.424800
Epoch 73, CIFAR-10 Batch 1:  Loss:     1.1645 Accuracy: 0.426400
Epoch 74, CIFAR-10 Batch 1:  Loss:     1.1591 Accuracy: 0.426800
Epoch 75, CIFAR-10 Batch 1:  Loss:     1.1545 Accuracy: 0.428600
Epoch 76, CIFAR-10 Batch 1:  Loss:     1.1489 Accuracy: 0.429200
Epoch 77, CIFAR-10 Batch 1:  Loss:     1.1438 Accuracy: 0.430000
Epoch 78, CIFAR-10 Batch 1:  Loss:     1.1386 Accuracy: 0.431200
Epoch 79, CIFAR-10 Batch 1:  Loss:     1.1337 Accuracy: 0.430800
Epoch 80, CIFAR-10 Batch 1:  Loss:     1.1283 Accuracy: 0.431600
Epoch 81, CIFAR-10 Batch 1:  Loss:     1.1248 Accuracy: 0.431200
Epoch 82, CIFAR-10 Batch 1:  Loss:     1.1198 Accuracy: 0.431800
Epoch 83, CIFAR-10 Batch 1:  Loss:     1.1146 Accuracy: 0.432400
Epoch 84, CIFAR-10 Batch 1:  Loss:     1.1099 Accuracy: 0.433800
Epoch 85, CIFAR-10 Batch 1:  Loss:     1.1059 Accuracy: 0.434200
Epoch 86, CIFAR-10 Batch 1:  Loss:     1.1012 Accuracy: 0.435000
Epoch 87, CIFAR-10 Batch 1:  Loss:     1.0972 Accuracy: 0.436000
Epoch 88, CIFAR-10 Batch 1:  Loss:     1.0932 Accuracy: 0.435600
Epoch 89, CIFAR-10 Batch 1:  Loss:     1.0893 Accuracy: 0.436600
Epoch 90, CIFAR-10 Batch 1:  Loss:     1.0855 Accuracy: 0.436200
Epoch 91, CIFAR-10 Batch 1:  Loss:     1.0819 Accuracy: 0.436800
Epoch 92, CIFAR-10 Batch 1:  Loss:     1.0779 Accuracy: 0.436800
Epoch 93, CIFAR-10 Batch 1:  Loss:     1.0742 Accuracy: 0.436600
Epoch 94, CIFAR-10 Batch 1:  Loss:     1.0710 Accuracy: 0.436000
Epoch 95, CIFAR-10 Batch 1:  Loss:     1.0675 Accuracy: 0.436800
Epoch 96, CIFAR-10 Batch 1:  Loss:     1.0638 Accuracy: 0.436600
Epoch 97, CIFAR-10 Batch 1:  Loss:     1.0605 Accuracy: 0.437600
Epoch 98, CIFAR-10 Batch 1:  Loss:     1.0570 Accuracy: 0.438200
Epoch 99, CIFAR-10 Batch 1:  Loss:     1.0537 Accuracy: 0.438400
Epoch 100, CIFAR-10 Batch 1:  Loss:     1.0504 Accuracy: 0.438800

Fully Train the Model

Now that you got a good accuracy with a single CIFAR-10 batch, try it with all five batches.


In [25]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
save_model_path = './image_classification'

print('Training...')
with tf.Session() as sess:
    # Initializing the variables
    sess.run(tf.global_variables_initializer())
    
    # Training cycle
    for epoch in range(epochs):
        # Loop over all batches
        n_batches = 5
        for batch_i in range(1, n_batches + 1):
            for batch_features, batch_labels in helper.load_preprocess_training_batch(batch_i, batch_size):
                train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)
            print('Epoch {:>2}, CIFAR-10 Batch {}:  '.format(epoch + 1, batch_i), end='')
            print_stats(sess, batch_features, batch_labels, cost, accuracy)
            
    # Save Model
    saver = tf.train.Saver()
    save_path = saver.save(sess, save_model_path)


Training...
Epoch  1, CIFAR-10 Batch 1:  Loss:     2.3132 Accuracy: 0.188600
Epoch  1, CIFAR-10 Batch 2:  Loss:     2.1482 Accuracy: 0.217600
Epoch  1, CIFAR-10 Batch 3:  Loss:     2.0624 Accuracy: 0.243200
Epoch  1, CIFAR-10 Batch 4:  Loss:     2.0247 Accuracy: 0.248200
Epoch  1, CIFAR-10 Batch 5:  Loss:     2.0147 Accuracy: 0.275800
Epoch  2, CIFAR-10 Batch 1:  Loss:     2.2152 Accuracy: 0.281200
Epoch  2, CIFAR-10 Batch 2:  Loss:     1.8937 Accuracy: 0.300000
Epoch  2, CIFAR-10 Batch 3:  Loss:     1.7888 Accuracy: 0.320200
Epoch  2, CIFAR-10 Batch 4:  Loss:     1.8190 Accuracy: 0.316800
Epoch  2, CIFAR-10 Batch 5:  Loss:     1.8558 Accuracy: 0.333800
Epoch  3, CIFAR-10 Batch 1:  Loss:     2.0746 Accuracy: 0.357600
Epoch  3, CIFAR-10 Batch 2:  Loss:     1.6814 Accuracy: 0.378400
Epoch  3, CIFAR-10 Batch 3:  Loss:     1.6008 Accuracy: 0.385000
Epoch  3, CIFAR-10 Batch 4:  Loss:     1.6722 Accuracy: 0.378600
Epoch  3, CIFAR-10 Batch 5:  Loss:     1.7148 Accuracy: 0.384600
Epoch  4, CIFAR-10 Batch 1:  Loss:     1.9897 Accuracy: 0.396000
Epoch  4, CIFAR-10 Batch 2:  Loss:     1.5266 Accuracy: 0.414600
Epoch  4, CIFAR-10 Batch 3:  Loss:     1.4601 Accuracy: 0.418000
Epoch  4, CIFAR-10 Batch 4:  Loss:     1.6010 Accuracy: 0.408000
Epoch  4, CIFAR-10 Batch 5:  Loss:     1.6073 Accuracy: 0.411400
Epoch  5, CIFAR-10 Batch 1:  Loss:     1.9175 Accuracy: 0.425400
Epoch  5, CIFAR-10 Batch 2:  Loss:     1.4216 Accuracy: 0.434800
Epoch  5, CIFAR-10 Batch 3:  Loss:     1.3714 Accuracy: 0.434400
Epoch  5, CIFAR-10 Batch 4:  Loss:     1.5623 Accuracy: 0.434200
Epoch  5, CIFAR-10 Batch 5:  Loss:     1.5272 Accuracy: 0.432600
Epoch  6, CIFAR-10 Batch 1:  Loss:     1.8535 Accuracy: 0.444600
Epoch  6, CIFAR-10 Batch 2:  Loss:     1.3413 Accuracy: 0.443400
Epoch  6, CIFAR-10 Batch 3:  Loss:     1.3150 Accuracy: 0.445400
Epoch  6, CIFAR-10 Batch 4:  Loss:     1.5374 Accuracy: 0.448400
Epoch  6, CIFAR-10 Batch 5:  Loss:     1.4769 Accuracy: 0.440600
Epoch  7, CIFAR-10 Batch 1:  Loss:     1.7994 Accuracy: 0.447600
Epoch  7, CIFAR-10 Batch 2:  Loss:     1.2855 Accuracy: 0.451000
Epoch  7, CIFAR-10 Batch 3:  Loss:     1.2728 Accuracy: 0.451200
Epoch  7, CIFAR-10 Batch 4:  Loss:     1.5155 Accuracy: 0.460000
Epoch  7, CIFAR-10 Batch 5:  Loss:     1.4327 Accuracy: 0.450200
Epoch  8, CIFAR-10 Batch 1:  Loss:     1.7642 Accuracy: 0.455000
Epoch  8, CIFAR-10 Batch 2:  Loss:     1.2439 Accuracy: 0.453800
Epoch  8, CIFAR-10 Batch 3:  Loss:     1.2411 Accuracy: 0.457600
Epoch  8, CIFAR-10 Batch 4:  Loss:     1.4983 Accuracy: 0.471000
Epoch  8, CIFAR-10 Batch 5:  Loss:     1.3905 Accuracy: 0.456600
Epoch  9, CIFAR-10 Batch 1:  Loss:     1.7360 Accuracy: 0.466200
Epoch  9, CIFAR-10 Batch 2:  Loss:     1.2086 Accuracy: 0.456600
Epoch  9, CIFAR-10 Batch 3:  Loss:     1.2144 Accuracy: 0.466800
Epoch  9, CIFAR-10 Batch 4:  Loss:     1.4749 Accuracy: 0.475200
Epoch  9, CIFAR-10 Batch 5:  Loss:     1.3590 Accuracy: 0.461600
Epoch 10, CIFAR-10 Batch 1:  Loss:     1.7132 Accuracy: 0.472400
Epoch 10, CIFAR-10 Batch 2:  Loss:     1.1783 Accuracy: 0.462200
Epoch 10, CIFAR-10 Batch 3:  Loss:     1.1938 Accuracy: 0.470000
Epoch 10, CIFAR-10 Batch 4:  Loss:     1.4573 Accuracy: 0.482400
Epoch 10, CIFAR-10 Batch 5:  Loss:     1.3381 Accuracy: 0.470200
Epoch 11, CIFAR-10 Batch 1:  Loss:     1.6973 Accuracy: 0.477200
Epoch 11, CIFAR-10 Batch 2:  Loss:     1.1506 Accuracy: 0.469800
Epoch 11, CIFAR-10 Batch 3:  Loss:     1.1736 Accuracy: 0.472200
Epoch 11, CIFAR-10 Batch 4:  Loss:     1.4375 Accuracy: 0.486400
Epoch 11, CIFAR-10 Batch 5:  Loss:     1.3220 Accuracy: 0.474800
Epoch 12, CIFAR-10 Batch 1:  Loss:     1.6798 Accuracy: 0.477200
Epoch 12, CIFAR-10 Batch 2:  Loss:     1.1264 Accuracy: 0.475800
Epoch 12, CIFAR-10 Batch 3:  Loss:     1.1549 Accuracy: 0.475400
Epoch 12, CIFAR-10 Batch 4:  Loss:     1.4159 Accuracy: 0.488400
Epoch 12, CIFAR-10 Batch 5:  Loss:     1.3094 Accuracy: 0.476000
Epoch 13, CIFAR-10 Batch 1:  Loss:     1.6659 Accuracy: 0.480600
Epoch 13, CIFAR-10 Batch 2:  Loss:     1.1040 Accuracy: 0.480000
Epoch 13, CIFAR-10 Batch 3:  Loss:     1.1383 Accuracy: 0.480000
Epoch 13, CIFAR-10 Batch 4:  Loss:     1.3930 Accuracy: 0.491800
Epoch 13, CIFAR-10 Batch 5:  Loss:     1.2978 Accuracy: 0.480400
Epoch 14, CIFAR-10 Batch 1:  Loss:     1.6515 Accuracy: 0.484600
Epoch 14, CIFAR-10 Batch 2:  Loss:     1.0855 Accuracy: 0.484800
Epoch 14, CIFAR-10 Batch 3:  Loss:     1.1210 Accuracy: 0.483000
Epoch 14, CIFAR-10 Batch 4:  Loss:     1.3716 Accuracy: 0.493200
Epoch 14, CIFAR-10 Batch 5:  Loss:     1.2892 Accuracy: 0.483400
Epoch 15, CIFAR-10 Batch 1:  Loss:     1.6396 Accuracy: 0.488400
Epoch 15, CIFAR-10 Batch 2:  Loss:     1.0705 Accuracy: 0.485000
Epoch 15, CIFAR-10 Batch 3:  Loss:     1.1039 Accuracy: 0.487400
Epoch 15, CIFAR-10 Batch 4:  Loss:     1.3537 Accuracy: 0.494600
Epoch 15, CIFAR-10 Batch 5:  Loss:     1.2814 Accuracy: 0.486200
Epoch 16, CIFAR-10 Batch 1:  Loss:     1.6267 Accuracy: 0.488400
Epoch 16, CIFAR-10 Batch 2:  Loss:     1.0557 Accuracy: 0.487800
Epoch 16, CIFAR-10 Batch 3:  Loss:     1.0899 Accuracy: 0.490200
Epoch 16, CIFAR-10 Batch 4:  Loss:     1.3364 Accuracy: 0.497200
Epoch 16, CIFAR-10 Batch 5:  Loss:     1.2749 Accuracy: 0.487400
Epoch 17, CIFAR-10 Batch 1:  Loss:     1.6168 Accuracy: 0.490600
Epoch 17, CIFAR-10 Batch 2:  Loss:     1.0453 Accuracy: 0.489000
Epoch 17, CIFAR-10 Batch 3:  Loss:     1.0760 Accuracy: 0.492400
Epoch 17, CIFAR-10 Batch 4:  Loss:     1.3193 Accuracy: 0.500200
Epoch 17, CIFAR-10 Batch 5:  Loss:     1.2694 Accuracy: 0.492800
Epoch 18, CIFAR-10 Batch 1:  Loss:     1.6073 Accuracy: 0.489400
Epoch 18, CIFAR-10 Batch 2:  Loss:     1.0364 Accuracy: 0.490000
Epoch 18, CIFAR-10 Batch 3:  Loss:     1.0634 Accuracy: 0.496200
Epoch 18, CIFAR-10 Batch 4:  Loss:     1.3034 Accuracy: 0.501800
Epoch 18, CIFAR-10 Batch 5:  Loss:     1.2650 Accuracy: 0.497000
Epoch 19, CIFAR-10 Batch 1:  Loss:     1.6020 Accuracy: 0.495200
Epoch 19, CIFAR-10 Batch 2:  Loss:     1.0284 Accuracy: 0.492600
Epoch 19, CIFAR-10 Batch 3:  Loss:     1.0524 Accuracy: 0.500400
Epoch 19, CIFAR-10 Batch 4:  Loss:     1.2888 Accuracy: 0.502000
Epoch 19, CIFAR-10 Batch 5:  Loss:     1.2592 Accuracy: 0.499600
Epoch 20, CIFAR-10 Batch 1:  Loss:     1.5915 Accuracy: 0.497400
Epoch 20, CIFAR-10 Batch 2:  Loss:     1.0204 Accuracy: 0.494200
Epoch 20, CIFAR-10 Batch 3:  Loss:     1.0412 Accuracy: 0.502400
Epoch 20, CIFAR-10 Batch 4:  Loss:     1.2762 Accuracy: 0.504600
Epoch 20, CIFAR-10 Batch 5:  Loss:     1.2516 Accuracy: 0.500400
Epoch 21, CIFAR-10 Batch 1:  Loss:     1.5835 Accuracy: 0.505000
Epoch 21, CIFAR-10 Batch 2:  Loss:     1.0111 Accuracy: 0.498600
Epoch 21, CIFAR-10 Batch 3:  Loss:     1.0320 Accuracy: 0.505800
Epoch 21, CIFAR-10 Batch 4:  Loss:     1.2635 Accuracy: 0.507800
Epoch 21, CIFAR-10 Batch 5:  Loss:     1.2449 Accuracy: 0.503400
Epoch 22, CIFAR-10 Batch 1:  Loss:     1.5779 Accuracy: 0.508200
Epoch 22, CIFAR-10 Batch 2:  Loss:     1.0046 Accuracy: 0.503400
Epoch 22, CIFAR-10 Batch 3:  Loss:     1.0235 Accuracy: 0.507000
Epoch 22, CIFAR-10 Batch 4:  Loss:     1.2519 Accuracy: 0.507600
Epoch 22, CIFAR-10 Batch 5:  Loss:     1.2376 Accuracy: 0.506200
Epoch 23, CIFAR-10 Batch 1:  Loss:     1.5702 Accuracy: 0.511400
Epoch 23, CIFAR-10 Batch 2:  Loss:     0.9981 Accuracy: 0.501600
Epoch 23, CIFAR-10 Batch 3:  Loss:     1.0167 Accuracy: 0.509600
Epoch 23, CIFAR-10 Batch 4:  Loss:     1.2422 Accuracy: 0.509000
Epoch 23, CIFAR-10 Batch 5:  Loss:     1.2316 Accuracy: 0.505800
Epoch 24, CIFAR-10 Batch 1:  Loss:     1.5633 Accuracy: 0.513600
Epoch 24, CIFAR-10 Batch 2:  Loss:     0.9926 Accuracy: 0.503400
Epoch 24, CIFAR-10 Batch 3:  Loss:     1.0094 Accuracy: 0.511600
Epoch 24, CIFAR-10 Batch 4:  Loss:     1.2314 Accuracy: 0.509600
Epoch 24, CIFAR-10 Batch 5:  Loss:     1.2258 Accuracy: 0.507400
Epoch 25, CIFAR-10 Batch 1:  Loss:     1.5574 Accuracy: 0.516000
Epoch 25, CIFAR-10 Batch 2:  Loss:     0.9874 Accuracy: 0.506600
Epoch 25, CIFAR-10 Batch 3:  Loss:     1.0019 Accuracy: 0.514600
Epoch 25, CIFAR-10 Batch 4:  Loss:     1.2213 Accuracy: 0.511600
Epoch 25, CIFAR-10 Batch 5:  Loss:     1.2188 Accuracy: 0.508400
Epoch 26, CIFAR-10 Batch 1:  Loss:     1.5506 Accuracy: 0.516000
Epoch 26, CIFAR-10 Batch 2:  Loss:     0.9817 Accuracy: 0.508400
Epoch 26, CIFAR-10 Batch 3:  Loss:     0.9949 Accuracy: 0.515000
Epoch 26, CIFAR-10 Batch 4:  Loss:     1.2119 Accuracy: 0.514000
Epoch 26, CIFAR-10 Batch 5:  Loss:     1.2120 Accuracy: 0.512200
Epoch 27, CIFAR-10 Batch 1:  Loss:     1.5442 Accuracy: 0.516400
Epoch 27, CIFAR-10 Batch 2:  Loss:     0.9765 Accuracy: 0.509800
Epoch 27, CIFAR-10 Batch 3:  Loss:     0.9879 Accuracy: 0.516800
Epoch 27, CIFAR-10 Batch 4:  Loss:     1.2036 Accuracy: 0.515600
Epoch 27, CIFAR-10 Batch 5:  Loss:     1.2039 Accuracy: 0.513000
Epoch 28, CIFAR-10 Batch 1:  Loss:     1.5389 Accuracy: 0.519400
Epoch 28, CIFAR-10 Batch 2:  Loss:     0.9725 Accuracy: 0.511200
Epoch 28, CIFAR-10 Batch 3:  Loss:     0.9807 Accuracy: 0.518800
Epoch 28, CIFAR-10 Batch 4:  Loss:     1.1943 Accuracy: 0.519200
Epoch 28, CIFAR-10 Batch 5:  Loss:     1.1975 Accuracy: 0.514200
Epoch 29, CIFAR-10 Batch 1:  Loss:     1.5347 Accuracy: 0.521400
Epoch 29, CIFAR-10 Batch 2:  Loss:     0.9662 Accuracy: 0.512800
Epoch 29, CIFAR-10 Batch 3:  Loss:     0.9742 Accuracy: 0.520600
Epoch 29, CIFAR-10 Batch 4:  Loss:     1.1872 Accuracy: 0.519600
Epoch 29, CIFAR-10 Batch 5:  Loss:     1.1895 Accuracy: 0.516000
Epoch 30, CIFAR-10 Batch 1:  Loss:     1.5297 Accuracy: 0.522200
Epoch 30, CIFAR-10 Batch 2:  Loss:     0.9611 Accuracy: 0.514000
Epoch 30, CIFAR-10 Batch 3:  Loss:     0.9691 Accuracy: 0.522800
Epoch 30, CIFAR-10 Batch 4:  Loss:     1.1797 Accuracy: 0.520600
Epoch 30, CIFAR-10 Batch 5:  Loss:     1.1823 Accuracy: 0.516800
Epoch 31, CIFAR-10 Batch 1:  Loss:     1.5247 Accuracy: 0.523000
Epoch 31, CIFAR-10 Batch 2:  Loss:     0.9554 Accuracy: 0.516000
Epoch 31, CIFAR-10 Batch 3:  Loss:     0.9641 Accuracy: 0.523000
Epoch 31, CIFAR-10 Batch 4:  Loss:     1.1716 Accuracy: 0.523000
Epoch 31, CIFAR-10 Batch 5:  Loss:     1.1765 Accuracy: 0.517200
Epoch 32, CIFAR-10 Batch 1:  Loss:     1.5203 Accuracy: 0.525000
Epoch 32, CIFAR-10 Batch 2:  Loss:     0.9498 Accuracy: 0.517200
Epoch 32, CIFAR-10 Batch 3:  Loss:     0.9590 Accuracy: 0.524000
Epoch 32, CIFAR-10 Batch 4:  Loss:     1.1639 Accuracy: 0.524800
Epoch 32, CIFAR-10 Batch 5:  Loss:     1.1700 Accuracy: 0.517800
Epoch 33, CIFAR-10 Batch 1:  Loss:     1.5163 Accuracy: 0.523800
Epoch 33, CIFAR-10 Batch 2:  Loss:     0.9450 Accuracy: 0.517600
Epoch 33, CIFAR-10 Batch 3:  Loss:     0.9544 Accuracy: 0.526000
Epoch 33, CIFAR-10 Batch 4:  Loss:     1.1566 Accuracy: 0.525200
Epoch 33, CIFAR-10 Batch 5:  Loss:     1.1626 Accuracy: 0.519000
Epoch 34, CIFAR-10 Batch 1:  Loss:     1.5132 Accuracy: 0.524600
Epoch 34, CIFAR-10 Batch 2:  Loss:     0.9405 Accuracy: 0.518000
Epoch 34, CIFAR-10 Batch 3:  Loss:     0.9484 Accuracy: 0.527000
Epoch 34, CIFAR-10 Batch 4:  Loss:     1.1493 Accuracy: 0.526600
Epoch 34, CIFAR-10 Batch 5:  Loss:     1.1567 Accuracy: 0.518800
Epoch 35, CIFAR-10 Batch 1:  Loss:     1.5103 Accuracy: 0.526400
Epoch 35, CIFAR-10 Batch 2:  Loss:     0.9356 Accuracy: 0.519200
Epoch 35, CIFAR-10 Batch 3:  Loss:     0.9435 Accuracy: 0.531600
Epoch 35, CIFAR-10 Batch 4:  Loss:     1.1421 Accuracy: 0.528000
Epoch 35, CIFAR-10 Batch 5:  Loss:     1.1512 Accuracy: 0.522400
Epoch 36, CIFAR-10 Batch 1:  Loss:     1.5075 Accuracy: 0.527200
Epoch 36, CIFAR-10 Batch 2:  Loss:     0.9303 Accuracy: 0.520200
Epoch 36, CIFAR-10 Batch 3:  Loss:     0.9373 Accuracy: 0.532600
Epoch 36, CIFAR-10 Batch 4:  Loss:     1.1357 Accuracy: 0.530800
Epoch 36, CIFAR-10 Batch 5:  Loss:     1.1453 Accuracy: 0.524000
Epoch 37, CIFAR-10 Batch 1:  Loss:     1.5033 Accuracy: 0.527800
Epoch 37, CIFAR-10 Batch 2:  Loss:     0.9255 Accuracy: 0.521800
Epoch 37, CIFAR-10 Batch 3:  Loss:     0.9332 Accuracy: 0.533600
Epoch 37, CIFAR-10 Batch 4:  Loss:     1.1299 Accuracy: 0.530600
Epoch 37, CIFAR-10 Batch 5:  Loss:     1.1397 Accuracy: 0.525400
Epoch 38, CIFAR-10 Batch 1:  Loss:     1.4989 Accuracy: 0.529800
Epoch 38, CIFAR-10 Batch 2:  Loss:     0.9216 Accuracy: 0.523000
Epoch 38, CIFAR-10 Batch 3:  Loss:     0.9294 Accuracy: 0.533600
Epoch 38, CIFAR-10 Batch 4:  Loss:     1.1246 Accuracy: 0.532200
Epoch 38, CIFAR-10 Batch 5:  Loss:     1.1333 Accuracy: 0.525200
Epoch 39, CIFAR-10 Batch 1:  Loss:     1.4944 Accuracy: 0.530400
Epoch 39, CIFAR-10 Batch 2:  Loss:     0.9175 Accuracy: 0.523200
Epoch 39, CIFAR-10 Batch 3:  Loss:     0.9261 Accuracy: 0.533800
Epoch 39, CIFAR-10 Batch 4:  Loss:     1.1188 Accuracy: 0.533400
Epoch 39, CIFAR-10 Batch 5:  Loss:     1.1273 Accuracy: 0.526600
Epoch 40, CIFAR-10 Batch 1:  Loss:     1.4913 Accuracy: 0.531200
Epoch 40, CIFAR-10 Batch 2:  Loss:     0.9135 Accuracy: 0.523600
Epoch 40, CIFAR-10 Batch 3:  Loss:     0.9224 Accuracy: 0.534800
Epoch 40, CIFAR-10 Batch 4:  Loss:     1.1141 Accuracy: 0.533000
Epoch 40, CIFAR-10 Batch 5:  Loss:     1.1222 Accuracy: 0.525200
Epoch 41, CIFAR-10 Batch 1:  Loss:     1.4872 Accuracy: 0.533800
Epoch 41, CIFAR-10 Batch 2:  Loss:     0.9102 Accuracy: 0.526000
Epoch 41, CIFAR-10 Batch 3:  Loss:     0.9188 Accuracy: 0.535400
Epoch 41, CIFAR-10 Batch 4:  Loss:     1.1092 Accuracy: 0.534400
Epoch 41, CIFAR-10 Batch 5:  Loss:     1.1161 Accuracy: 0.526600
Epoch 42, CIFAR-10 Batch 1:  Loss:     1.4824 Accuracy: 0.535200
Epoch 42, CIFAR-10 Batch 2:  Loss:     0.9074 Accuracy: 0.526800
Epoch 42, CIFAR-10 Batch 3:  Loss:     0.9143 Accuracy: 0.536600
Epoch 42, CIFAR-10 Batch 4:  Loss:     1.1047 Accuracy: 0.533000
Epoch 42, CIFAR-10 Batch 5:  Loss:     1.1114 Accuracy: 0.527600
Epoch 43, CIFAR-10 Batch 1:  Loss:     1.4784 Accuracy: 0.537000
Epoch 43, CIFAR-10 Batch 2:  Loss:     0.9045 Accuracy: 0.527200
Epoch 43, CIFAR-10 Batch 3:  Loss:     0.9100 Accuracy: 0.538200
Epoch 43, CIFAR-10 Batch 4:  Loss:     1.1002 Accuracy: 0.534600
Epoch 43, CIFAR-10 Batch 5:  Loss:     1.1067 Accuracy: 0.527400
Epoch 44, CIFAR-10 Batch 1:  Loss:     1.4733 Accuracy: 0.537800
Epoch 44, CIFAR-10 Batch 2:  Loss:     0.9010 Accuracy: 0.527000
Epoch 44, CIFAR-10 Batch 3:  Loss:     0.9067 Accuracy: 0.539200
Epoch 44, CIFAR-10 Batch 4:  Loss:     1.0967 Accuracy: 0.535200
Epoch 44, CIFAR-10 Batch 5:  Loss:     1.1015 Accuracy: 0.527800
Epoch 45, CIFAR-10 Batch 1:  Loss:     1.4683 Accuracy: 0.537000
Epoch 45, CIFAR-10 Batch 2:  Loss:     0.8980 Accuracy: 0.526800
Epoch 45, CIFAR-10 Batch 3:  Loss:     0.9030 Accuracy: 0.539400
Epoch 45, CIFAR-10 Batch 4:  Loss:     1.0932 Accuracy: 0.536600
Epoch 45, CIFAR-10 Batch 5:  Loss:     1.0963 Accuracy: 0.527000
Epoch 46, CIFAR-10 Batch 1:  Loss:     1.4627 Accuracy: 0.537600
Epoch 46, CIFAR-10 Batch 2:  Loss:     0.8948 Accuracy: 0.527400
Epoch 46, CIFAR-10 Batch 3:  Loss:     0.8991 Accuracy: 0.540400
Epoch 46, CIFAR-10 Batch 4:  Loss:     1.0898 Accuracy: 0.537200
Epoch 46, CIFAR-10 Batch 5:  Loss:     1.0921 Accuracy: 0.527000
Epoch 47, CIFAR-10 Batch 1:  Loss:     1.4571 Accuracy: 0.538000
Epoch 47, CIFAR-10 Batch 2:  Loss:     0.8917 Accuracy: 0.528000
Epoch 47, CIFAR-10 Batch 3:  Loss:     0.8961 Accuracy: 0.541000
Epoch 47, CIFAR-10 Batch 4:  Loss:     1.0864 Accuracy: 0.537200
Epoch 47, CIFAR-10 Batch 5:  Loss:     1.0877 Accuracy: 0.528400
Epoch 48, CIFAR-10 Batch 1:  Loss:     1.4530 Accuracy: 0.537800
Epoch 48, CIFAR-10 Batch 2:  Loss:     0.8883 Accuracy: 0.529400
Epoch 48, CIFAR-10 Batch 3:  Loss:     0.8929 Accuracy: 0.542200
Epoch 48, CIFAR-10 Batch 4:  Loss:     1.0838 Accuracy: 0.537200
Epoch 48, CIFAR-10 Batch 5:  Loss:     1.0825 Accuracy: 0.529200
Epoch 49, CIFAR-10 Batch 1:  Loss:     1.4473 Accuracy: 0.538200
Epoch 49, CIFAR-10 Batch 2:  Loss:     0.8847 Accuracy: 0.529800
Epoch 49, CIFAR-10 Batch 3:  Loss:     0.8897 Accuracy: 0.540200
Epoch 49, CIFAR-10 Batch 4:  Loss:     1.0809 Accuracy: 0.537000
Epoch 49, CIFAR-10 Batch 5:  Loss:     1.0776 Accuracy: 0.530400
Epoch 50, CIFAR-10 Batch 1:  Loss:     1.4418 Accuracy: 0.539000
Epoch 50, CIFAR-10 Batch 2:  Loss:     0.8819 Accuracy: 0.530400
Epoch 50, CIFAR-10 Batch 3:  Loss:     0.8869 Accuracy: 0.540800
Epoch 50, CIFAR-10 Batch 4:  Loss:     1.0781 Accuracy: 0.538000
Epoch 50, CIFAR-10 Batch 5:  Loss:     1.0735 Accuracy: 0.529200
Epoch 51, CIFAR-10 Batch 1:  Loss:     1.4367 Accuracy: 0.537600
Epoch 51, CIFAR-10 Batch 2:  Loss:     0.8792 Accuracy: 0.530800
Epoch 51, CIFAR-10 Batch 3:  Loss:     0.8850 Accuracy: 0.543000
Epoch 51, CIFAR-10 Batch 4:  Loss:     1.0761 Accuracy: 0.537800
Epoch 51, CIFAR-10 Batch 5:  Loss:     1.0690 Accuracy: 0.531800
Epoch 52, CIFAR-10 Batch 1:  Loss:     1.4322 Accuracy: 0.537800
Epoch 52, CIFAR-10 Batch 2:  Loss:     0.8769 Accuracy: 0.530600
Epoch 52, CIFAR-10 Batch 3:  Loss:     0.8830 Accuracy: 0.542200
Epoch 52, CIFAR-10 Batch 4:  Loss:     1.0740 Accuracy: 0.539200
Epoch 52, CIFAR-10 Batch 5:  Loss:     1.0644 Accuracy: 0.532000
Epoch 53, CIFAR-10 Batch 1:  Loss:     1.4277 Accuracy: 0.538600
Epoch 53, CIFAR-10 Batch 2:  Loss:     0.8747 Accuracy: 0.532200
Epoch 53, CIFAR-10 Batch 3:  Loss:     0.8807 Accuracy: 0.543400
Epoch 53, CIFAR-10 Batch 4:  Loss:     1.0720 Accuracy: 0.540000
Epoch 53, CIFAR-10 Batch 5:  Loss:     1.0606 Accuracy: 0.533200
Epoch 54, CIFAR-10 Batch 1:  Loss:     1.4232 Accuracy: 0.538000
Epoch 54, CIFAR-10 Batch 2:  Loss:     0.8727 Accuracy: 0.533600
Epoch 54, CIFAR-10 Batch 3:  Loss:     0.8792 Accuracy: 0.543800
Epoch 54, CIFAR-10 Batch 4:  Loss:     1.0705 Accuracy: 0.540000
Epoch 54, CIFAR-10 Batch 5:  Loss:     1.0563 Accuracy: 0.533400
Epoch 55, CIFAR-10 Batch 1:  Loss:     1.4165 Accuracy: 0.538200
Epoch 55, CIFAR-10 Batch 2:  Loss:     0.8691 Accuracy: 0.534400
Epoch 55, CIFAR-10 Batch 3:  Loss:     0.8773 Accuracy: 0.544600
Epoch 55, CIFAR-10 Batch 4:  Loss:     1.0696 Accuracy: 0.542000
Epoch 55, CIFAR-10 Batch 5:  Loss:     1.0526 Accuracy: 0.534800
Epoch 56, CIFAR-10 Batch 1:  Loss:     1.4113 Accuracy: 0.538800
Epoch 56, CIFAR-10 Batch 2:  Loss:     0.8669 Accuracy: 0.535000
Epoch 56, CIFAR-10 Batch 3:  Loss:     0.8758 Accuracy: 0.545000
Epoch 56, CIFAR-10 Batch 4:  Loss:     1.0687 Accuracy: 0.543000
Epoch 56, CIFAR-10 Batch 5:  Loss:     1.0478 Accuracy: 0.534400
Epoch 57, CIFAR-10 Batch 1:  Loss:     1.4058 Accuracy: 0.539600
Epoch 57, CIFAR-10 Batch 2:  Loss:     0.8641 Accuracy: 0.535200
Epoch 57, CIFAR-10 Batch 3:  Loss:     0.8744 Accuracy: 0.546000
Epoch 57, CIFAR-10 Batch 4:  Loss:     1.0673 Accuracy: 0.542800
Epoch 57, CIFAR-10 Batch 5:  Loss:     1.0441 Accuracy: 0.535000
Epoch 58, CIFAR-10 Batch 1:  Loss:     1.4010 Accuracy: 0.540200
Epoch 58, CIFAR-10 Batch 2:  Loss:     0.8618 Accuracy: 0.534200
Epoch 58, CIFAR-10 Batch 3:  Loss:     0.8726 Accuracy: 0.546800
Epoch 58, CIFAR-10 Batch 4:  Loss:     1.0662 Accuracy: 0.543400
Epoch 58, CIFAR-10 Batch 5:  Loss:     1.0400 Accuracy: 0.534000
Epoch 59, CIFAR-10 Batch 1:  Loss:     1.3956 Accuracy: 0.539800
Epoch 59, CIFAR-10 Batch 2:  Loss:     0.8601 Accuracy: 0.534800
Epoch 59, CIFAR-10 Batch 3:  Loss:     0.8704 Accuracy: 0.547200
Epoch 59, CIFAR-10 Batch 4:  Loss:     1.0649 Accuracy: 0.544000
Epoch 59, CIFAR-10 Batch 5:  Loss:     1.0361 Accuracy: 0.533400
Epoch 60, CIFAR-10 Batch 1:  Loss:     1.3921 Accuracy: 0.541600
Epoch 60, CIFAR-10 Batch 2:  Loss:     0.8579 Accuracy: 0.535800
Epoch 60, CIFAR-10 Batch 3:  Loss:     0.8689 Accuracy: 0.547600
Epoch 60, CIFAR-10 Batch 4:  Loss:     1.0634 Accuracy: 0.545400
Epoch 60, CIFAR-10 Batch 5:  Loss:     1.0316 Accuracy: 0.532800
Epoch 61, CIFAR-10 Batch 1:  Loss:     1.3868 Accuracy: 0.542800
Epoch 61, CIFAR-10 Batch 2:  Loss:     0.8560 Accuracy: 0.537400
Epoch 61, CIFAR-10 Batch 3:  Loss:     0.8673 Accuracy: 0.546600
Epoch 61, CIFAR-10 Batch 4:  Loss:     1.0618 Accuracy: 0.544800
Epoch 61, CIFAR-10 Batch 5:  Loss:     1.0272 Accuracy: 0.533600
Epoch 62, CIFAR-10 Batch 1:  Loss:     1.3830 Accuracy: 0.541200
Epoch 62, CIFAR-10 Batch 2:  Loss:     0.8537 Accuracy: 0.537600
Epoch 62, CIFAR-10 Batch 3:  Loss:     0.8657 Accuracy: 0.546600
Epoch 62, CIFAR-10 Batch 4:  Loss:     1.0605 Accuracy: 0.546200
Epoch 62, CIFAR-10 Batch 5:  Loss:     1.0231 Accuracy: 0.534200
Epoch 63, CIFAR-10 Batch 1:  Loss:     1.3784 Accuracy: 0.541200
Epoch 63, CIFAR-10 Batch 2:  Loss:     0.8514 Accuracy: 0.538000
Epoch 63, CIFAR-10 Batch 3:  Loss:     0.8645 Accuracy: 0.546000
Epoch 63, CIFAR-10 Batch 4:  Loss:     1.0594 Accuracy: 0.547800
Epoch 63, CIFAR-10 Batch 5:  Loss:     1.0196 Accuracy: 0.535000
Epoch 64, CIFAR-10 Batch 1:  Loss:     1.3749 Accuracy: 0.541400
Epoch 64, CIFAR-10 Batch 2:  Loss:     0.8488 Accuracy: 0.538200
Epoch 64, CIFAR-10 Batch 3:  Loss:     0.8636 Accuracy: 0.547600
Epoch 64, CIFAR-10 Batch 4:  Loss:     1.0579 Accuracy: 0.548000
Epoch 64, CIFAR-10 Batch 5:  Loss:     1.0169 Accuracy: 0.535800
Epoch 65, CIFAR-10 Batch 1:  Loss:     1.3712 Accuracy: 0.543600
Epoch 65, CIFAR-10 Batch 2:  Loss:     0.8467 Accuracy: 0.537400
Epoch 65, CIFAR-10 Batch 3:  Loss:     0.8625 Accuracy: 0.547200
Epoch 65, CIFAR-10 Batch 4:  Loss:     1.0568 Accuracy: 0.548800
Epoch 65, CIFAR-10 Batch 5:  Loss:     1.0129 Accuracy: 0.535000
Epoch 66, CIFAR-10 Batch 1:  Loss:     1.3680 Accuracy: 0.545000
Epoch 66, CIFAR-10 Batch 2:  Loss:     0.8445 Accuracy: 0.538000
Epoch 66, CIFAR-10 Batch 3:  Loss:     0.8617 Accuracy: 0.547400
Epoch 66, CIFAR-10 Batch 4:  Loss:     1.0557 Accuracy: 0.550400
Epoch 66, CIFAR-10 Batch 5:  Loss:     1.0089 Accuracy: 0.535000
Epoch 67, CIFAR-10 Batch 1:  Loss:     1.3649 Accuracy: 0.545000
Epoch 67, CIFAR-10 Batch 2:  Loss:     0.8422 Accuracy: 0.538000
Epoch 67, CIFAR-10 Batch 3:  Loss:     0.8606 Accuracy: 0.548200
Epoch 67, CIFAR-10 Batch 4:  Loss:     1.0549 Accuracy: 0.550800
Epoch 67, CIFAR-10 Batch 5:  Loss:     1.0057 Accuracy: 0.535600
Epoch 68, CIFAR-10 Batch 1:  Loss:     1.3621 Accuracy: 0.545400
Epoch 68, CIFAR-10 Batch 2:  Loss:     0.8397 Accuracy: 0.539000
Epoch 68, CIFAR-10 Batch 3:  Loss:     0.8593 Accuracy: 0.548800
Epoch 68, CIFAR-10 Batch 4:  Loss:     1.0537 Accuracy: 0.549800
Epoch 68, CIFAR-10 Batch 5:  Loss:     1.0022 Accuracy: 0.536200
Epoch 69, CIFAR-10 Batch 1:  Loss:     1.3583 Accuracy: 0.546400
Epoch 69, CIFAR-10 Batch 2:  Loss:     0.8374 Accuracy: 0.540000
Epoch 69, CIFAR-10 Batch 3:  Loss:     0.8583 Accuracy: 0.549400
Epoch 69, CIFAR-10 Batch 4:  Loss:     1.0529 Accuracy: 0.549400
Epoch 69, CIFAR-10 Batch 5:  Loss:     0.9999 Accuracy: 0.536200
Epoch 70, CIFAR-10 Batch 1:  Loss:     1.3546 Accuracy: 0.546400
Epoch 70, CIFAR-10 Batch 2:  Loss:     0.8354 Accuracy: 0.540400
Epoch 70, CIFAR-10 Batch 3:  Loss:     0.8577 Accuracy: 0.549000
Epoch 70, CIFAR-10 Batch 4:  Loss:     1.0522 Accuracy: 0.550600
Epoch 70, CIFAR-10 Batch 5:  Loss:     0.9965 Accuracy: 0.536600
Epoch 71, CIFAR-10 Batch 1:  Loss:     1.3517 Accuracy: 0.547600
Epoch 71, CIFAR-10 Batch 2:  Loss:     0.8334 Accuracy: 0.542200
Epoch 71, CIFAR-10 Batch 3:  Loss:     0.8563 Accuracy: 0.548400
Epoch 71, CIFAR-10 Batch 4:  Loss:     1.0515 Accuracy: 0.549600
Epoch 71, CIFAR-10 Batch 5:  Loss:     0.9933 Accuracy: 0.537600
Epoch 72, CIFAR-10 Batch 1:  Loss:     1.3486 Accuracy: 0.546800
Epoch 72, CIFAR-10 Batch 2:  Loss:     0.8311 Accuracy: 0.542600
Epoch 72, CIFAR-10 Batch 3:  Loss:     0.8557 Accuracy: 0.548400
Epoch 72, CIFAR-10 Batch 4:  Loss:     1.0505 Accuracy: 0.549800
Epoch 72, CIFAR-10 Batch 5:  Loss:     0.9904 Accuracy: 0.538400
Epoch 73, CIFAR-10 Batch 1:  Loss:     1.3463 Accuracy: 0.547600
Epoch 73, CIFAR-10 Batch 2:  Loss:     0.8289 Accuracy: 0.544200
Epoch 73, CIFAR-10 Batch 3:  Loss:     0.8546 Accuracy: 0.548800
Epoch 73, CIFAR-10 Batch 4:  Loss:     1.0498 Accuracy: 0.550000
Epoch 73, CIFAR-10 Batch 5:  Loss:     0.9879 Accuracy: 0.538600
Epoch 74, CIFAR-10 Batch 1:  Loss:     1.3447 Accuracy: 0.547200
Epoch 74, CIFAR-10 Batch 2:  Loss:     0.8268 Accuracy: 0.543800
Epoch 74, CIFAR-10 Batch 3:  Loss:     0.8534 Accuracy: 0.549800
Epoch 74, CIFAR-10 Batch 4:  Loss:     1.0492 Accuracy: 0.549800
Epoch 74, CIFAR-10 Batch 5:  Loss:     0.9845 Accuracy: 0.538000
Epoch 75, CIFAR-10 Batch 1:  Loss:     1.3415 Accuracy: 0.547800
Epoch 75, CIFAR-10 Batch 2:  Loss:     0.8248 Accuracy: 0.544000
Epoch 75, CIFAR-10 Batch 3:  Loss:     0.8524 Accuracy: 0.549800
Epoch 75, CIFAR-10 Batch 4:  Loss:     1.0487 Accuracy: 0.550000
Epoch 75, CIFAR-10 Batch 5:  Loss:     0.9818 Accuracy: 0.538200
Epoch 76, CIFAR-10 Batch 1:  Loss:     1.3377 Accuracy: 0.547800
Epoch 76, CIFAR-10 Batch 2:  Loss:     0.8224 Accuracy: 0.544800
Epoch 76, CIFAR-10 Batch 3:  Loss:     0.8518 Accuracy: 0.549600
Epoch 76, CIFAR-10 Batch 4:  Loss:     1.0477 Accuracy: 0.549400
Epoch 76, CIFAR-10 Batch 5:  Loss:     0.9789 Accuracy: 0.539200
Epoch 77, CIFAR-10 Batch 1:  Loss:     1.3356 Accuracy: 0.548000
Epoch 77, CIFAR-10 Batch 2:  Loss:     0.8209 Accuracy: 0.545400
Epoch 77, CIFAR-10 Batch 3:  Loss:     0.8515 Accuracy: 0.550200
Epoch 77, CIFAR-10 Batch 4:  Loss:     1.0463 Accuracy: 0.549200
Epoch 77, CIFAR-10 Batch 5:  Loss:     0.9763 Accuracy: 0.539800
Epoch 78, CIFAR-10 Batch 1:  Loss:     1.3328 Accuracy: 0.547600
Epoch 78, CIFAR-10 Batch 2:  Loss:     0.8188 Accuracy: 0.544800
Epoch 78, CIFAR-10 Batch 3:  Loss:     0.8500 Accuracy: 0.550600
Epoch 78, CIFAR-10 Batch 4:  Loss:     1.0450 Accuracy: 0.550600
Epoch 78, CIFAR-10 Batch 5:  Loss:     0.9742 Accuracy: 0.539200
Epoch 79, CIFAR-10 Batch 1:  Loss:     1.3305 Accuracy: 0.548400
Epoch 79, CIFAR-10 Batch 2:  Loss:     0.8168 Accuracy: 0.545400
Epoch 79, CIFAR-10 Batch 3:  Loss:     0.8499 Accuracy: 0.551400
Epoch 79, CIFAR-10 Batch 4:  Loss:     1.0447 Accuracy: 0.550200
Epoch 79, CIFAR-10 Batch 5:  Loss:     0.9721 Accuracy: 0.539800
Epoch 80, CIFAR-10 Batch 1:  Loss:     1.3279 Accuracy: 0.548200
Epoch 80, CIFAR-10 Batch 2:  Loss:     0.8150 Accuracy: 0.544600
Epoch 80, CIFAR-10 Batch 3:  Loss:     0.8491 Accuracy: 0.551400
Epoch 80, CIFAR-10 Batch 4:  Loss:     1.0438 Accuracy: 0.550800
Epoch 80, CIFAR-10 Batch 5:  Loss:     0.9693 Accuracy: 0.540200
Epoch 81, CIFAR-10 Batch 1:  Loss:     1.3256 Accuracy: 0.547600
Epoch 81, CIFAR-10 Batch 2:  Loss:     0.8137 Accuracy: 0.545000
Epoch 81, CIFAR-10 Batch 3:  Loss:     0.8494 Accuracy: 0.551000
Epoch 81, CIFAR-10 Batch 4:  Loss:     1.0430 Accuracy: 0.552200
Epoch 81, CIFAR-10 Batch 5:  Loss:     0.9668 Accuracy: 0.540600
Epoch 82, CIFAR-10 Batch 1:  Loss:     1.3228 Accuracy: 0.548200
Epoch 82, CIFAR-10 Batch 2:  Loss:     0.8118 Accuracy: 0.544800
Epoch 82, CIFAR-10 Batch 3:  Loss:     0.8482 Accuracy: 0.551000
Epoch 82, CIFAR-10 Batch 4:  Loss:     1.0423 Accuracy: 0.552400
Epoch 82, CIFAR-10 Batch 5:  Loss:     0.9647 Accuracy: 0.541600
Epoch 83, CIFAR-10 Batch 1:  Loss:     1.3208 Accuracy: 0.549000
Epoch 83, CIFAR-10 Batch 2:  Loss:     0.8103 Accuracy: 0.545400
Epoch 83, CIFAR-10 Batch 3:  Loss:     0.8476 Accuracy: 0.551000
Epoch 83, CIFAR-10 Batch 4:  Loss:     1.0417 Accuracy: 0.552200
Epoch 83, CIFAR-10 Batch 5:  Loss:     0.9625 Accuracy: 0.540600
Epoch 84, CIFAR-10 Batch 1:  Loss:     1.3181 Accuracy: 0.547000
Epoch 84, CIFAR-10 Batch 2:  Loss:     0.8088 Accuracy: 0.545800
Epoch 84, CIFAR-10 Batch 3:  Loss:     0.8473 Accuracy: 0.550600
Epoch 84, CIFAR-10 Batch 4:  Loss:     1.0417 Accuracy: 0.553600
Epoch 84, CIFAR-10 Batch 5:  Loss:     0.9603 Accuracy: 0.542600
Epoch 85, CIFAR-10 Batch 1:  Loss:     1.3165 Accuracy: 0.546600
Epoch 85, CIFAR-10 Batch 2:  Loss:     0.8068 Accuracy: 0.545200
Epoch 85, CIFAR-10 Batch 3:  Loss:     0.8461 Accuracy: 0.550000
Epoch 85, CIFAR-10 Batch 4:  Loss:     1.0413 Accuracy: 0.552800
Epoch 85, CIFAR-10 Batch 5:  Loss:     0.9582 Accuracy: 0.543600
Epoch 86, CIFAR-10 Batch 1:  Loss:     1.3142 Accuracy: 0.546000
Epoch 86, CIFAR-10 Batch 2:  Loss:     0.8052 Accuracy: 0.545600
Epoch 86, CIFAR-10 Batch 3:  Loss:     0.8451 Accuracy: 0.550800
Epoch 86, CIFAR-10 Batch 4:  Loss:     1.0408 Accuracy: 0.552800
Epoch 86, CIFAR-10 Batch 5:  Loss:     0.9563 Accuracy: 0.543400
Epoch 87, CIFAR-10 Batch 1:  Loss:     1.3118 Accuracy: 0.546400
Epoch 87, CIFAR-10 Batch 2:  Loss:     0.8037 Accuracy: 0.546600
Epoch 87, CIFAR-10 Batch 3:  Loss:     0.8448 Accuracy: 0.551000
Epoch 87, CIFAR-10 Batch 4:  Loss:     1.0406 Accuracy: 0.552600
Epoch 87, CIFAR-10 Batch 5:  Loss:     0.9542 Accuracy: 0.543200
Epoch 88, CIFAR-10 Batch 1:  Loss:     1.3090 Accuracy: 0.546600
Epoch 88, CIFAR-10 Batch 2:  Loss:     0.8018 Accuracy: 0.545400
Epoch 88, CIFAR-10 Batch 3:  Loss:     0.8445 Accuracy: 0.551000
Epoch 88, CIFAR-10 Batch 4:  Loss:     1.0398 Accuracy: 0.553200
Epoch 88, CIFAR-10 Batch 5:  Loss:     0.9525 Accuracy: 0.543800
Epoch 89, CIFAR-10 Batch 1:  Loss:     1.3063 Accuracy: 0.547400
Epoch 89, CIFAR-10 Batch 2:  Loss:     0.8006 Accuracy: 0.546800
Epoch 89, CIFAR-10 Batch 3:  Loss:     0.8443 Accuracy: 0.552200
Epoch 89, CIFAR-10 Batch 4:  Loss:     1.0398 Accuracy: 0.553400
Epoch 89, CIFAR-10 Batch 5:  Loss:     0.9504 Accuracy: 0.544000
Epoch 90, CIFAR-10 Batch 1:  Loss:     1.3052 Accuracy: 0.547600
Epoch 90, CIFAR-10 Batch 2:  Loss:     0.7985 Accuracy: 0.546600
Epoch 90, CIFAR-10 Batch 3:  Loss:     0.8438 Accuracy: 0.552600
Epoch 90, CIFAR-10 Batch 4:  Loss:     1.0400 Accuracy: 0.553200
Epoch 90, CIFAR-10 Batch 5:  Loss:     0.9485 Accuracy: 0.544600
Epoch 91, CIFAR-10 Batch 1:  Loss:     1.3028 Accuracy: 0.548000
Epoch 91, CIFAR-10 Batch 2:  Loss:     0.7969 Accuracy: 0.545200
Epoch 91, CIFAR-10 Batch 3:  Loss:     0.8436 Accuracy: 0.552600
Epoch 91, CIFAR-10 Batch 4:  Loss:     1.0402 Accuracy: 0.552800
Epoch 91, CIFAR-10 Batch 5:  Loss:     0.9463 Accuracy: 0.544400
Epoch 92, CIFAR-10 Batch 1:  Loss:     1.3012 Accuracy: 0.548600
Epoch 92, CIFAR-10 Batch 2:  Loss:     0.7954 Accuracy: 0.545400
Epoch 92, CIFAR-10 Batch 3:  Loss:     0.8432 Accuracy: 0.551800
Epoch 92, CIFAR-10 Batch 4:  Loss:     1.0400 Accuracy: 0.553000
Epoch 92, CIFAR-10 Batch 5:  Loss:     0.9444 Accuracy: 0.545000
Epoch 93, CIFAR-10 Batch 1:  Loss:     1.2993 Accuracy: 0.548600
Epoch 93, CIFAR-10 Batch 2:  Loss:     0.7939 Accuracy: 0.545800
Epoch 93, CIFAR-10 Batch 3:  Loss:     0.8432 Accuracy: 0.552200
Epoch 93, CIFAR-10 Batch 4:  Loss:     1.0401 Accuracy: 0.553600
Epoch 93, CIFAR-10 Batch 5:  Loss:     0.9420 Accuracy: 0.545800
Epoch 94, CIFAR-10 Batch 1:  Loss:     1.2981 Accuracy: 0.548200
Epoch 94, CIFAR-10 Batch 2:  Loss:     0.7918 Accuracy: 0.546400
Epoch 94, CIFAR-10 Batch 3:  Loss:     0.8434 Accuracy: 0.552800
Epoch 94, CIFAR-10 Batch 4:  Loss:     1.0399 Accuracy: 0.553200
Epoch 94, CIFAR-10 Batch 5:  Loss:     0.9404 Accuracy: 0.545400
Epoch 95, CIFAR-10 Batch 1:  Loss:     1.2958 Accuracy: 0.548800
Epoch 95, CIFAR-10 Batch 2:  Loss:     0.7904 Accuracy: 0.546200
Epoch 95, CIFAR-10 Batch 3:  Loss:     0.8429 Accuracy: 0.552600
Epoch 95, CIFAR-10 Batch 4:  Loss:     1.0395 Accuracy: 0.553400
Epoch 95, CIFAR-10 Batch 5:  Loss:     0.9389 Accuracy: 0.546800
Epoch 96, CIFAR-10 Batch 1:  Loss:     1.2944 Accuracy: 0.548400
Epoch 96, CIFAR-10 Batch 2:  Loss:     0.7893 Accuracy: 0.546200
Epoch 96, CIFAR-10 Batch 3:  Loss:     0.8425 Accuracy: 0.552800
Epoch 96, CIFAR-10 Batch 4:  Loss:     1.0396 Accuracy: 0.554000
Epoch 96, CIFAR-10 Batch 5:  Loss:     0.9372 Accuracy: 0.547000
Epoch 97, CIFAR-10 Batch 1:  Loss:     1.2927 Accuracy: 0.548000
Epoch 97, CIFAR-10 Batch 2:  Loss:     0.7880 Accuracy: 0.547000
Epoch 97, CIFAR-10 Batch 3:  Loss:     0.8422 Accuracy: 0.554200
Epoch 97, CIFAR-10 Batch 4:  Loss:     1.0396 Accuracy: 0.554000
Epoch 97, CIFAR-10 Batch 5:  Loss:     0.9355 Accuracy: 0.547800
Epoch 98, CIFAR-10 Batch 1:  Loss:     1.2911 Accuracy: 0.548800
Epoch 98, CIFAR-10 Batch 2:  Loss:     0.7865 Accuracy: 0.547400
Epoch 98, CIFAR-10 Batch 3:  Loss:     0.8422 Accuracy: 0.553800
Epoch 98, CIFAR-10 Batch 4:  Loss:     1.0398 Accuracy: 0.553400
Epoch 98, CIFAR-10 Batch 5:  Loss:     0.9336 Accuracy: 0.548000
Epoch 99, CIFAR-10 Batch 1:  Loss:     1.2895 Accuracy: 0.548800
Epoch 99, CIFAR-10 Batch 2:  Loss:     0.7851 Accuracy: 0.547000
Epoch 99, CIFAR-10 Batch 3:  Loss:     0.8419 Accuracy: 0.554600
Epoch 99, CIFAR-10 Batch 4:  Loss:     1.0395 Accuracy: 0.553600
Epoch 99, CIFAR-10 Batch 5:  Loss:     0.9321 Accuracy: 0.548400
Epoch 100, CIFAR-10 Batch 1:  Loss:     1.2874 Accuracy: 0.549000
Epoch 100, CIFAR-10 Batch 2:  Loss:     0.7840 Accuracy: 0.547200
Epoch 100, CIFAR-10 Batch 3:  Loss:     0.8413 Accuracy: 0.554200
Epoch 100, CIFAR-10 Batch 4:  Loss:     1.0396 Accuracy: 0.553600
Epoch 100, CIFAR-10 Batch 5:  Loss:     0.9303 Accuracy: 0.548400

Checkpoint

The model has been saved to disk.

Test Model

Test your model against the test dataset. This will be your final accuracy. You should have an accuracy greater than 50%. If you don't, keep tweaking the model architecture and parameters.


In [26]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
%matplotlib inline
%config InlineBackend.figure_format = 'retina'

import tensorflow as tf
import pickle
import helper
import random

# Set batch size if not already set
try:
    if batch_size:
        pass
except NameError:
    batch_size = 64

save_model_path = './image_classification'
n_samples = 4
top_n_predictions = 3

def test_model():
    """
    Test the saved model against the test dataset
    """

    test_features, test_labels = pickle.load(open('preprocess_training.p', mode='rb'))
    loaded_graph = tf.Graph()

    with tf.Session(graph=loaded_graph) as sess:
        # Load model
        loader = tf.train.import_meta_graph(save_model_path + '.meta')
        loader.restore(sess, save_model_path)

        # Get Tensors from loaded model
        loaded_x = loaded_graph.get_tensor_by_name('x:0')
        loaded_y = loaded_graph.get_tensor_by_name('y:0')
        loaded_keep_prob = loaded_graph.get_tensor_by_name('keep_prob:0')
        loaded_logits = loaded_graph.get_tensor_by_name('logits:0')
        loaded_acc = loaded_graph.get_tensor_by_name('accuracy:0')
        
        # Get accuracy in batches for memory limitations
        test_batch_acc_total = 0
        test_batch_count = 0
        
        for train_feature_batch, train_label_batch in helper.batch_features_labels(test_features, test_labels, batch_size):
            test_batch_acc_total += sess.run(
                loaded_acc,
                feed_dict={loaded_x: train_feature_batch, loaded_y: train_label_batch, loaded_keep_prob: 1.0})
            test_batch_count += 1

        print('Testing Accuracy: {}\n'.format(test_batch_acc_total/test_batch_count))

        # Print Random Samples
        random_test_features, random_test_labels = tuple(zip(*random.sample(list(zip(test_features, test_labels)), n_samples)))
        random_test_predictions = sess.run(
            tf.nn.top_k(tf.nn.softmax(loaded_logits), top_n_predictions),
            feed_dict={loaded_x: random_test_features, loaded_y: random_test_labels, loaded_keep_prob: 1.0})
        helper.display_image_predictions(random_test_features, random_test_labels, random_test_predictions)


test_model()


Testing Accuracy: 0.5505859375

Why 50-70% Accuracy?

You might be wondering why you can't get an accuracy any higher. First things first, 50% isn't bad for a simple CNN. Pure guessing would get you 10% accuracy. However, you might notice people are getting scores well above 70%. That's because we haven't taught you all there is to know about neural networks. We still need to cover a few more techniques.

Submitting This Project

When submitting this project, make sure to run all the cells before saving the notebook. Save the notebook file as "dlnd_image_classification.ipynb" and save it as a HTML file under "File" -> "Download as". Include the "helper.py" and "problem_unittests.py" files in your submission.