Image Classification

In this project, you'll classify images from the CIFAR-10 dataset. The dataset consists of airplanes, dogs, cats, and other objects. You'll preprocess the images, then train a convolutional neural network on all the samples. The images need to be normalized and the labels need to be one-hot encoded. You'll apply what you've learned to build convolutional, max pooling, dropout, and fully connected layers. At the end, you'll see your neural network's predictions on the sample images.

Get the Data

Run the following cell to download the CIFAR-10 dataset for Python.


In [22]:
from urllib.request import urlretrieve
from os.path import isfile, isdir
from tqdm import tqdm
import problem_unittests as tests
import tarfile

cifar10_dataset_folder_path = 'cifar-10-batches-py'

# Use Floyd's cifar-10 dataset if present
floyd_cifar10_location = '/input/cifar-10/python.tar.gz'
if isfile(floyd_cifar10_location):
    tar_gz_path = floyd_cifar10_location
else:
    tar_gz_path = 'cifar-10-python.tar.gz'

class DLProgress(tqdm):
    last_block = 0

    def hook(self, block_num=1, block_size=1, total_size=None):
        self.total = total_size
        self.update((block_num - self.last_block) * block_size)
        self.last_block = block_num

if not isfile(tar_gz_path):
    with DLProgress(unit='B', unit_scale=True, miniters=1, desc='CIFAR-10 Dataset') as pbar:
        urlretrieve(
            'https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz',
            tar_gz_path,
            pbar.hook)

if not isdir(cifar10_dataset_folder_path):
    with tarfile.open(tar_gz_path) as tar:
        tar.extractall()


tests.test_folder_path(cifar10_dataset_folder_path)


All files found!

Explore the Data

The dataset is broken into batches to prevent your machine from running out of memory. The CIFAR-10 dataset consists of 5 batches, named data_batch_1, data_batch_2, etc. Each batch contains labels and images, where each image belongs to one of the following classes:

  • airplane
  • automobile
  • bird
  • cat
  • deer
  • dog
  • frog
  • horse
  • ship
  • truck

Understanding a dataset is part of making predictions on the data. Play around with the code cell below by changing the batch_id and sample_id. The batch_id is the id for a batch (1-5). The sample_id is the id for an image and label pair in the batch.

Ask yourself "What are all possible labels?", "What is the range of values for the image data?", "Are the labels in order or random?". Answers to questions like these will help you preprocess the data and end up with better predictions.


In [121]:
%matplotlib inline
%config InlineBackend.figure_format = 'retina'

import helper
import numpy as np

# Explore the dataset
batch_id = 4
sample_id = 9
helper.display_stats(cifar10_dataset_folder_path, batch_id, sample_id)


Stats of batch 4:
Samples: 10000
Label Counts: {0: 1003, 1: 963, 2: 1041, 3: 976, 4: 1004, 5: 1021, 6: 1004, 7: 981, 8: 1024, 9: 983}
First 20 Labels: [0, 6, 0, 2, 7, 2, 1, 2, 4, 1, 5, 6, 6, 3, 1, 3, 5, 5, 8, 1]

Example of Image 9:
Image - Min Value: 22 Max Value: 255
Image - Shape: (32, 32, 3)
Label - Label Id: 1 Name: automobile
  • "What are all possible labels?"

0: airplane, 1: automobile, 2: bird, 3: cat, 4: deer, 5: dog, 6: frog, 7: horse, 8: ship, 9: truck

  • "What is the range of values for the image data?"

Each image is 32 x 32 x 3, with pixel values in the range 0 to 255.

  • "Are the labels in order or random?"

random
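
To double-check these answers against the raw files, here is a minimal, illustrative sketch that reads one pickled batch directly. It assumes the standard CIFAR-10 python format, where each batch file is a pickled dict holding a 10000 x 3072 uint8 'data' array and a 'labels' list:

import pickle
import numpy as np

# Load one raw batch file and inspect it (CIFAR-10 python format).
with open(cifar10_dataset_folder_path + '/data_batch_4', mode='rb') as file:
    batch = pickle.load(file, encoding='latin1')

data = np.array(batch['data'])      # shape (10000, 3072), dtype uint8
labels = np.array(batch['labels'])  # shape (10000,)

print('Pixel range:', data.min(), '-', data.max())        # 0 - 255
print('Distinct labels:', sorted(set(labels.tolist())))   # 0..9
print('First 20 labels:', labels[:20])                     # not sorted, so the order is random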

Implement Preprocess Functions

Normalize

In the cell below, implement the normalize function to take in image data, x, and return it as a normalized Numpy array. The values should be in the range of 0 to 1, inclusive. The return object should be the same shape as x.


In [24]:
def normalize(x):
    """
    Normalize a list of sample image data in the range of 0 to 1
    : x: List of image data.  The image shape is (32, 32, 3)
    : return: Numpy array of normalized data
    """
    return np.array(x) / 255

tests.test_normalize(normalize)


Tests Passed
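
As a quick sanity check (illustrative only, separate from the project's unit tests), the values should land in [0, 1] and the shape should be unchanged:

# Sanity-check sketch: pixel values map into [0, 1] and the shape is preserved.
sample = np.random.randint(0, 256, size=(2, 32, 32, 3), dtype=np.uint8)
normed = normalize(sample)
print(normed.min(), normed.max(), normed.shape)  # >= 0.0, <= 1.0, (2, 32, 32, 3)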

One-hot encode

Just like the previous code cell, you'll be implementing a function for preprocessing. This time, you'll implement the one_hot_encode function. The input, x, is a list of labels. Implement the function to return the list of labels as a one-hot encoded Numpy array. The possible values for labels are 0 to 9. The one-hot encoding function should return the same encoding for each value across calls to one_hot_encode. Make sure to save the map of encodings outside the function.


In [25]:
from sklearn.preprocessing import LabelBinarizer

# Fit the encoder once, outside the function, so the same label-to-vector
# mapping is reused on every call.
encoder = LabelBinarizer()
encoder.fit(list(range(10)))

def one_hot_encode(x):
    """
    One hot encode a list of sample labels. Return a one-hot encoded vector for each label.
    : x: List of sample Labels
    : return: Numpy array of one-hot encoded labels
    """
    return encoder.transform(x)

tests.test_one_hot_encode(one_hot_encode)


Tests Passed
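
For example (illustrative only), label 3 should map to a vector with a single 1 at index 3, and the mapping should be identical on every call:

print(one_hot_encode([3, 0, 9]))
# [[0 0 0 1 0 0 0 0 0 0]
#  [1 0 0 0 0 0 0 0 0 0]
#  [0 0 0 0 0 0 0 0 0 1]]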

Randomize Data

As you saw from exploring the data above, the order of the samples is randomized. It doesn't hurt to randomize it again, but you don't need to for this dataset.
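
If you did want to reshuffle at some point, the sketch below shows one way to do it without breaking the pairing between images and labels (illustrative only; it isn't used elsewhere in this notebook):

import numpy as np

def shuffle_together(features, labels, seed=0):
    """Shuffle two Numpy arrays with a shared permutation so each image stays paired with its label."""
    rng = np.random.RandomState(seed)
    idx = rng.permutation(len(features))
    return features[idx], labels[idx]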

Preprocess all the data and save it

Running the code cell below will preprocess all the CIFAR-10 data and save it to file. The code below also uses 10% of the training data for validation.


In [26]:
# Preprocess Training, Validation, and Testing Data
helper.preprocess_and_save_data(cifar10_dataset_folder_path, normalize, one_hot_encode)

Check Point

This is your first checkpoint. If you ever decide to come back to this notebook or have to restart the notebook, you can start from here. The preprocessed data has been saved to disk.


In [27]:
import pickle
import problem_unittests as tests
import helper

# Load the Preprocessed Validation data
valid_features, valid_labels = pickle.load(open('preprocess_validation.p', mode='rb'))

In [28]:
valid_labels


Out[28]:
array([[0, 0, 0, ..., 0, 0, 0],
       [0, 0, 1, ..., 0, 0, 0],
       [0, 0, 1, ..., 0, 0, 0],
       ..., 
       [0, 0, 0, ..., 0, 0, 1],
       [0, 1, 0, ..., 0, 0, 0],
       [0, 1, 0, ..., 0, 0, 0]])

Build the network

For the neural network, you'll build each layer into a function. Most of the code you've seen has been outside of functions. To test your code more thoroughly, we require that you put each layer in a function. This allows us to give you better feedback and test for simple mistakes using our unittests before you submit your project.

Note: If you're finding it hard to dedicate enough time for this course each week, we've provided a small shortcut to this part of the project. In the next couple of problems, you'll have the option to use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages to build each layer, except the layers you build in the "Convolution and Max Pooling Layer" section. TF Layers is similar to Keras's and TFLearn's abstraction of layers, so it's easy to pick up.

However, if you would like to get the most out of this course, try to solve all the problems without using anything from the TF Layers packages. You can still use classes from other packages that happen to have the same name as ones you find in TF Layers! For example, instead of using the TF Layers version of the conv2d class, tf.layers.conv2d, you would want to use the TF Neural Network version of conv2d, tf.nn.conv2d.
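
To make the distinction concrete, here is an illustrative side-by-side of the two TF 1.x APIs (not part of the project code): the shortcut layer API creates its own weights, while the tf.nn op expects you to supply them.

import tensorflow as tf

# Illustration only: the same convolution written both ways.
x_demo = tf.placeholder(tf.float32, [None, 32, 32, 3])

# Shortcut option: tf.layers.conv2d creates and manages the filter weights.
shortcut = tf.layers.conv2d(x_demo, filters=16, kernel_size=(5, 5),
                            strides=(1, 1), padding='same',
                            activation=tf.nn.relu)

# Lower-level option: tf.nn.conv2d needs an explicit weight Variable.
W_demo = tf.Variable(tf.truncated_normal([5, 5, 3, 16], stddev=0.1))
manual = tf.nn.relu(tf.nn.conv2d(x_demo, W_demo,
                                 strides=[1, 1, 1, 1], padding='SAME'))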

Let's begin!

Input

The neural network needs to read the image data, one-hot encoded labels, and dropout keep probability. Implement the following functions:

  • Implement neural_net_image_input
    • Return a TF Placeholder
    • Set the shape using image_shape with batch size set to None.
    • Name the TensorFlow placeholder "x" using the TensorFlow name parameter in the TF Placeholder.
  • Implement neural_net_label_input
    • Return a TF Placeholder
    • Set the shape using n_classes with batch size set to None.
    • Name the TensorFlow placeholder "y" using the TensorFlow name parameter in the TF Placeholder.
  • Implement neural_net_keep_prob_input
    • Return a TF Placeholder for dropout keep probability.
    • Name the TensorFlow placeholder "keep_prob" using the TensorFlow name parameter in the TF Placeholder.

These names will be used at the end of the project to load your saved model.

Note: None for shapes in TensorFlow allows for a dynamic size.


In [29]:
import tensorflow as tf

def neural_net_image_input(image_shape):
    """
    Return a Tensor for a batch of image input
    : image_shape: Shape of the images
    : return: Tensor for image input.
    """
    x = tf.placeholder(tf.float32,
                       [None, image_shape[0], image_shape[1], image_shape[2]],
                       name='x')
    return x


def neural_net_label_input(n_classes):
    """
    Return a Tensor for a batch of label input
    : n_classes: Number of classes
    : return: Tensor for label input.
    """
    y = tf.placeholder(tf.int32,
                       [None, n_classes],
                       name='y')
    return y


def neural_net_keep_prob_input():
    """
    Return a Tensor for keep probability
    : return: Tensor for keep probability.
    """
    return tf.placeholder(tf.float32, name='keep_prob')


tf.reset_default_graph()
tests.test_nn_image_inputs(neural_net_image_input)
tests.test_nn_label_inputs(neural_net_label_input)
tests.test_nn_keep_prob_inputs(neural_net_keep_prob_input)


Image Input Tests Passed.
Label Input Tests Passed.
Keep Prob Tests Passed.

Convolution and Max Pooling Layer

Convolution layers have a lot of success with images. For this code cell, you should implement the function conv2d_maxpool to apply convolution then max pooling:

  • Create the weight and bias using conv_ksize, conv_num_outputs and the shape of x_tensor.
  • Apply a convolution to x_tensor using weight and conv_strides.
    • We recommend you use same padding, but you're welcome to use any padding.
  • Add bias
  • Add a nonlinear activation to the convolution.
  • Apply Max Pooling using pool_ksize and pool_strides.
    • We recommend you use same padding, but you're welcome to use any padding.

Note: You can't use TensorFlow Layers or TensorFlow Layers (contrib) for this layer, but you can still use TensorFlow's Neural Network package. You may still use the shortcut option for all the other layers.


In [87]:
def conv2d_maxpool(x_tensor, conv_num_outputs, conv_ksize, conv_strides, pool_ksize, pool_strides):
    """
    Apply convolution then max pooling to x_tensor
    :param x_tensor: TensorFlow Tensor
    :param conv_num_outputs: Number of outputs for the convolutional layer
    :param conv_ksize: kernel size 2-D Tuple for the convolutional layer
    :param conv_strides: Stride 2-D Tuple for convolution
    :param pool_ksize: kernel size 2-D Tuple for pool
    :param pool_strides: Stride 2-D Tuple for pool
    : return: A tensor that represents convolution and max pooling of x_tensor
    """
    # Scale the initial weight deviation by the approximate number of output units
    num_units = (int(x_tensor.shape[2]) / conv_strides[0] / pool_strides[0])**2 * conv_num_outputs
    deviation = 1/np.sqrt(num_units)

    W = tf.Variable(tf.truncated_normal([conv_ksize[0], conv_ksize[1],
                                        int(x_tensor.shape[3]), conv_num_outputs],
                                        mean=0.0, stddev=deviation))
    b = tf.Variable(tf.truncated_normal([conv_num_outputs],
                                        mean=0.0, stddev=deviation))
    
    x_conved = tf.nn.conv2d(x_tensor, W,
                            strides=[1, conv_strides[0], conv_strides[1], 1],
                            padding='SAME')
    x_biased = tf.nn.bias_add(x_conved, b)
    x_rected = tf.nn.relu(x_biased)
    return tf.nn.max_pool(x_rected,
                          ksize=[1, pool_ksize[0], pool_ksize[1], 1],
                          strides=[1, pool_strides[0], pool_strides[1], 1],
                          padding='SAME')


tests.test_con_pool(conv2d_maxpool)


Tests Passed
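
As an illustrative shape check (assuming the 'SAME' padding used above), a stride-1 convolution keeps the 32x32 spatial size and a 2x2, stride-2 max pool halves it:

# Illustrative shape check, separate from the project's unit tests.
sample_in = tf.placeholder(tf.float32, [None, 32, 32, 3])
sample_out = conv2d_maxpool(sample_in, 16, (5, 5), (1, 1), (2, 2), (2, 2))
print(sample_out.get_shape().as_list())  # expect [None, 16, 16, 16]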

Flatten Layer

Implement the flatten function to change the dimension of x_tensor from a 4-D tensor to a 2-D tensor. The output should be the shape (Batch Size, Flattened Image Size). Shortcut option: you can use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages for this layer. For more of a challenge, only use other TensorFlow packages.


In [88]:
def flatten(x_tensor):
    """
    Flatten x_tensor to (Batch Size, Flattened Image Size)
    : x_tensor: A tensor of size (Batch Size, ...), where ... are the image dimensions.
    : return: A tensor of size (Batch Size, Flattened Image Size).
    """
    height = int(x_tensor.shape[1])
    width = int(x_tensor.shape[2])
    depth = int(x_tensor.shape[3])
    volume = width * height * depth
    return tf.reshape(x_tensor, [-1, volume])

tests.test_flatten(flatten)


Tests Passed
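
For example (illustrative only), the (Batch Size, 8, 8, 32) tensor produced later by the second convolution/pool block should flatten to (Batch Size, 2048):

# Illustrative shape check: 8 * 8 * 32 = 2048 flattened features per image.
flat_in = tf.placeholder(tf.float32, [None, 8, 8, 32])
print(flatten(flat_in).get_shape().as_list())  # expect [None, 2048]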

Fully-Connected Layer

Implement the fully_conn function to apply a fully connected layer to x_tensor with the shape (Batch Size, num_outputs). Shortcut option: you can use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages for this layer. For more of a challenge, only use other TensorFlow packages.


In [96]:
def fully_conn(x_tensor, num_outputs):
    """
    Apply a fully connected layer to x_tensor using weight and bias
    : x_tensor: A 2-D tensor where the first dimension is batch size.
    : num_outputs: The number of outputs that the new tensor should have.
    : return: A 2-D tensor where the second dimension is num_outputs.
    """
    height = int(x_tensor.shape[1])
    deviation = 1/np.sqrt(num_outputs)
    
    W = tf.Variable(tf.truncated_normal([height, num_outputs],
                                        mean=0.0, stddev=deviation))
    b = tf.Variable(tf.truncated_normal([num_outputs],
                                        mean=0.0, stddev=deviation))
    fc1 = tf.add(tf.matmul(x_tensor, W), b)
    return tf.nn.relu(fc1)

tests.test_fully_conn(fully_conn)


Tests Passed

Output Layer

Implement the output function to apply a fully connected layer to x_tensor with the shape (Batch Size, num_outputs). Shortcut option: you can use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages for this layer. For more of a challenge, only use other TensorFlow packages.

Note: Activation, softmax, or cross entropy should not be applied to this layer; the loss defined later uses tf.nn.softmax_cross_entropy_with_logits, which applies the softmax internally.


In [97]:
def output(x_tensor, num_outputs):
    """
    Apply a output layer to x_tensor using weight and bias
    : x_tensor: A 2-D tensor where the first dimension is batch size.
    : num_outputs: The number of outputs that the new tensor should have.
    : return: A 2-D tensor where the second dimension is num_outputs.
    """
    height = int(x_tensor.shape[1])
    deviation = 1/np.sqrt(num_outputs)
    
    W = tf.Variable(tf.truncated_normal([height, num_outputs],
                                        mean=0.0, stddev=deviation))
    b = tf.Variable(tf.truncated_normal([num_outputs],
                                        mean=0.0, stddev=deviation))
    return tf.add(tf.matmul(x_tensor, W), b)

tests.test_output(output)


Tests Passed

Create Convolutional Model

Implement the function conv_net to create a convolutional neural network model. The function takes in a batch of images, x, and outputs logits. Use the layers you created above to create this model:

  • Apply 1, 2, or 3 Convolution and Max Pool layers
  • Apply a Flatten Layer
  • Apply 1, 2, or 3 Fully Connected Layers
  • Apply an Output Layer
  • Return the output
  • Apply TensorFlow's Dropout to one or more layers in the model using keep_prob.

In [99]:
def conv_net(x, keep_prob):
    """
    Create a convolutional neural network model
    : x: Placeholder tensor that holds image data.
    : keep_prob: Placeholder tensor that holds dropout keep probability.
    : return: Tensor that represents logits
    """
    # Play around with different number of outputs, kernel size and stride
    
    # 32x32x3
    conv1 = conv2d_maxpool(x, 16,
                           [5, 5], [1, 1],
                           [2, 2], [2, 2])
    # 16x16x16
    conv2 = conv2d_maxpool(conv1, 32,
                           [3, 3], [1, 1],
                           [2, 2], [2, 2])
    # 8x8x32
    conv2f = flatten(conv2)
    
    # 2048
    fc1 = fully_conn(conv2f, 256)
    fc1d = tf.nn.dropout(fc1, keep_prob)
    
    # 256
    logits = output(fc1d, 10)
    
    #10
    return logits


##############################
## Build the Neural Network ##
##############################

# Remove previous weights, bias, inputs, etc.
tf.reset_default_graph()

# Inputs
x = neural_net_image_input((32, 32, 3))
y = neural_net_label_input(10)
keep_prob = neural_net_keep_prob_input()

# Model
logits = conv_net(x, keep_prob)

# Name logits Tensor, so that it can be loaded from disk after training
logits = tf.identity(logits, name='logits')

# Loss and Optimizer
cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=y))
optimizer = tf.train.AdamOptimizer().minimize(cost)

# Accuracy
correct_pred = tf.equal(tf.argmax(logits, 1), tf.argmax(y, 1))
accuracy = tf.reduce_mean(tf.cast(correct_pred, tf.float32), name='accuracy')

tests.test_conv_net(conv_net)


Neural Network Built!

Train the Neural Network

Single Optimization

Implement the function train_neural_network to do a single optimization. The optimization should use optimizer to optimize in session with a feed_dict of the following:

  • x for image input
  • y for labels
  • keep_prob for keep probability for dropout

This function will be called for each batch, so tf.global_variables_initializer() has already been called.

Note: Nothing needs to be returned. This function is only optimizing the neural network.


In [100]:
def train_neural_network(session, optimizer, keep_probability, feature_batch, label_batch):
    """
    Optimize the session on a batch of images and labels
    : session: Current TensorFlow session
    : optimizer: TensorFlow optimizer function
    : keep_probability: keep probability
    : feature_batch: Batch of Numpy image data
    : label_batch: Batch of Numpy label data
    """
    session.run(optimizer, feed_dict={x: feature_batch,
                                      y: label_batch,
                                      keep_prob: keep_probability})

tests.test_train_nn(train_neural_network)


Tests Passed

Show Stats

Implement the function print_stats to print loss and validation accuracy. Use the global variables valid_features and valid_labels to calculate validation accuracy. Use a keep probability of 1.0 to calculate the loss and validation accuracy.


In [101]:
def print_stats(session, feature_batch, label_batch, cost, accuracy):
    """
    Print information about loss and validation accuracy
    : session: Current TensorFlow session
    : feature_batch: Batch of Numpy image data
    : label_batch: Batch of Numpy label data
    : cost: TensorFlow cost function
    : accuracy: TensorFlow accuracy function
    """

    tr_cost, tr_acc = session.run([cost, accuracy],
                                  feed_dict={x: feature_batch,
                                             y: label_batch,
                                             keep_prob: 1.0})
    va_cost, va_acc = session.run([cost, accuracy],
                                  feed_dict={x: valid_features,
                                             y: valid_labels,
                                             keep_prob: 1.0})
    print("Losses: {:>7.1f}  {:>7.1f}\tAccuracies: {:>5.3f}  {:>5.3f}".format(tr_cost, va_cost, tr_acc, va_acc))

Hyperparameters

Tune the following parameters:

  • Set epochs to the number of iterations until the network stops learning or starts overfitting
  • Set batch_size to the highest number that your machine has memory for. Most people set it to a common power of two:
    • 64
    • 128
    • 256
    • ...
  • Set keep_probability to the probability of keeping a node using dropout

In [127]:
# TODO: Tune Parameters
epochs = 100
batch_size = 4096
keep_probability = 0.5

Train on a Single CIFAR-10 Batch

Instead of training the neural network on all the CIFAR-10 batches of data, let's use a single batch. This should save time while you iterate on the model to get a better accuracy. Once the final validation accuracy is 50% or greater, run the model on all the data in the next section.


In [128]:
print('Checking the Training on a Single Batch...')
with tf.Session() as sess:
    # Initializing the variables
    sess.run(tf.global_variables_initializer())
    
    # Training cycle
    for epoch in range(epochs):
        batch_i = 1
        for batch_features, batch_labels in helper.load_preprocess_training_batch(batch_i, batch_size):
            train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)
        print('Epoch {:>2}, CIFAR-10 Batch {}:  '.format(epoch + 1, batch_i), end='')
        print_stats(sess, batch_features, batch_labels, cost, accuracy)


Checking the Training on a Single Batch...
Epoch  1, CIFAR-10 Batch 1:  Losses:     2.3      2.3	Accuracies: 0.132  0.133
Epoch  2, CIFAR-10 Batch 1:  Losses:     2.3      2.3	Accuracies: 0.200  0.178
Epoch  3, CIFAR-10 Batch 1:  Losses:     2.2      2.2	Accuracies: 0.224  0.219
Epoch  4, CIFAR-10 Batch 1:  Losses:     2.2      2.2	Accuracies: 0.218  0.221
Epoch  5, CIFAR-10 Batch 1:  Losses:     2.1      2.1	Accuracies: 0.233  0.232
Epoch  6, CIFAR-10 Batch 1:  Losses:     2.1      2.1	Accuracies: 0.267  0.256
Epoch  7, CIFAR-10 Batch 1:  Losses:     2.0      2.0	Accuracies: 0.255  0.281
Epoch  8, CIFAR-10 Batch 1:  Losses:     2.0      2.0	Accuracies: 0.277  0.292
Epoch  9, CIFAR-10 Batch 1:  Losses:     2.0      2.0	Accuracies: 0.316  0.299
Epoch 10, CIFAR-10 Batch 1:  Losses:     1.9      1.9	Accuracies: 0.330  0.311
Epoch 11, CIFAR-10 Batch 1:  Losses:     1.9      1.9	Accuracies: 0.350  0.329
Epoch 12, CIFAR-10 Batch 1:  Losses:     1.9      1.9	Accuracies: 0.369  0.350
Epoch 13, CIFAR-10 Batch 1:  Losses:     1.8      1.8	Accuracies: 0.389  0.350
Epoch 14, CIFAR-10 Batch 1:  Losses:     1.8      1.8	Accuracies: 0.377  0.350
Epoch 15, CIFAR-10 Batch 1:  Losses:     1.7      1.8	Accuracies: 0.413  0.371
Epoch 16, CIFAR-10 Batch 1:  Losses:     1.7      1.8	Accuracies: 0.433  0.390
Epoch 17, CIFAR-10 Batch 1:  Losses:     1.7      1.7	Accuracies: 0.429  0.401
Epoch 18, CIFAR-10 Batch 1:  Losses:     1.7      1.7	Accuracies: 0.439  0.388
Epoch 19, CIFAR-10 Batch 1:  Losses:     1.6      1.7	Accuracies: 0.453  0.401
Epoch 20, CIFAR-10 Batch 1:  Losses:     1.6      1.7	Accuracies: 0.467  0.419
Epoch 21, CIFAR-10 Batch 1:  Losses:     1.6      1.6	Accuracies: 0.474  0.427
Epoch 22, CIFAR-10 Batch 1:  Losses:     1.5      1.6	Accuracies: 0.486  0.427
Epoch 23, CIFAR-10 Batch 1:  Losses:     1.5      1.6	Accuracies: 0.475  0.427
Epoch 24, CIFAR-10 Batch 1:  Losses:     1.5      1.6	Accuracies: 0.488  0.447
Epoch 25, CIFAR-10 Batch 1:  Losses:     1.5      1.6	Accuracies: 0.510  0.455
Epoch 26, CIFAR-10 Batch 1:  Losses:     1.4      1.6	Accuracies: 0.509  0.449
Epoch 27, CIFAR-10 Batch 1:  Losses:     1.4      1.6	Accuracies: 0.527  0.452
Epoch 28, CIFAR-10 Batch 1:  Losses:     1.4      1.5	Accuracies: 0.541  0.456
Epoch 29, CIFAR-10 Batch 1:  Losses:     1.4      1.5	Accuracies: 0.543  0.460
Epoch 30, CIFAR-10 Batch 1:  Losses:     1.4      1.5	Accuracies: 0.533  0.460
Epoch 31, CIFAR-10 Batch 1:  Losses:     1.3      1.5	Accuracies: 0.559  0.464
Epoch 32, CIFAR-10 Batch 1:  Losses:     1.3      1.5	Accuracies: 0.571  0.461
Epoch 33, CIFAR-10 Batch 1:  Losses:     1.3      1.5	Accuracies: 0.579  0.472
Epoch 34, CIFAR-10 Batch 1:  Losses:     1.3      1.5	Accuracies: 0.578  0.471
Epoch 35, CIFAR-10 Batch 1:  Losses:     1.3      1.5	Accuracies: 0.583  0.475
Epoch 36, CIFAR-10 Batch 1:  Losses:     1.2      1.5	Accuracies: 0.600  0.480
Epoch 37, CIFAR-10 Batch 1:  Losses:     1.2      1.4	Accuracies: 0.603  0.482
Epoch 38, CIFAR-10 Batch 1:  Losses:     1.2      1.4	Accuracies: 0.604  0.485
Epoch 39, CIFAR-10 Batch 1:  Losses:     1.2      1.4	Accuracies: 0.606  0.489
Epoch 40, CIFAR-10 Batch 1:  Losses:     1.2      1.4	Accuracies: 0.619  0.488
Epoch 41, CIFAR-10 Batch 1:  Losses:     1.2      1.4	Accuracies: 0.625  0.487
Epoch 42, CIFAR-10 Batch 1:  Losses:     1.1      1.4	Accuracies: 0.635  0.486
Epoch 43, CIFAR-10 Batch 1:  Losses:     1.1      1.4	Accuracies: 0.640  0.493
Epoch 44, CIFAR-10 Batch 1:  Losses:     1.1      1.4	Accuracies: 0.634  0.489
Epoch 45, CIFAR-10 Batch 1:  Losses:     1.1      1.4	Accuracies: 0.658  0.495
Epoch 46, CIFAR-10 Batch 1:  Losses:     1.1      1.4	Accuracies: 0.657  0.505
Epoch 47, CIFAR-10 Batch 1:  Losses:     1.0      1.4	Accuracies: 0.677  0.504
Epoch 48, CIFAR-10 Batch 1:  Losses:     1.0      1.4	Accuracies: 0.678  0.501
Epoch 49, CIFAR-10 Batch 1:  Losses:     1.0      1.4	Accuracies: 0.673  0.506
Epoch 50, CIFAR-10 Batch 1:  Losses:     1.0      1.4	Accuracies: 0.693  0.495
Epoch 51, CIFAR-10 Batch 1:  Losses:     1.0      1.4	Accuracies: 0.693  0.506
Epoch 52, CIFAR-10 Batch 1:  Losses:     1.0      1.4	Accuracies: 0.702  0.505
Epoch 53, CIFAR-10 Batch 1:  Losses:     1.0      1.4	Accuracies: 0.709  0.505
Epoch 54, CIFAR-10 Batch 1:  Losses:     0.9      1.4	Accuracies: 0.718  0.515
Epoch 55, CIFAR-10 Batch 1:  Losses:     0.9      1.3	Accuracies: 0.725  0.516
Epoch 56, CIFAR-10 Batch 1:  Losses:     0.9      1.4	Accuracies: 0.719  0.509
Epoch 57, CIFAR-10 Batch 1:  Losses:     0.9      1.3	Accuracies: 0.723  0.521
Epoch 58, CIFAR-10 Batch 1:  Losses:     0.9      1.3	Accuracies: 0.725  0.517
Epoch 59, CIFAR-10 Batch 1:  Losses:     0.9      1.3	Accuracies: 0.757  0.523
Epoch 60, CIFAR-10 Batch 1:  Losses:     0.9      1.4	Accuracies: 0.752  0.517
Epoch 61, CIFAR-10 Batch 1:  Losses:     0.9      1.3	Accuracies: 0.759  0.516
Epoch 62, CIFAR-10 Batch 1:  Losses:     0.9      1.4	Accuracies: 0.748  0.513
Epoch 63, CIFAR-10 Batch 1:  Losses:     0.8      1.3	Accuracies: 0.759  0.517
Epoch 64, CIFAR-10 Batch 1:  Losses:     0.8      1.3	Accuracies: 0.786  0.528
Epoch 65, CIFAR-10 Batch 1:  Losses:     0.8      1.3	Accuracies: 0.790  0.532
Epoch 66, CIFAR-10 Batch 1:  Losses:     0.8      1.3	Accuracies: 0.776  0.528
Epoch 67, CIFAR-10 Batch 1:  Losses:     0.8      1.3	Accuracies: 0.775  0.527
Epoch 68, CIFAR-10 Batch 1:  Losses:     0.8      1.3	Accuracies: 0.782  0.521
Epoch 69, CIFAR-10 Batch 1:  Losses:     0.8      1.3	Accuracies: 0.809  0.530
Epoch 70, CIFAR-10 Batch 1:  Losses:     0.7      1.3	Accuracies: 0.817  0.535
Epoch 71, CIFAR-10 Batch 1:  Losses:     0.7      1.3	Accuracies: 0.819  0.533
Epoch 72, CIFAR-10 Batch 1:  Losses:     0.7      1.3	Accuracies: 0.804  0.528
Epoch 73, CIFAR-10 Batch 1:  Losses:     0.8      1.4	Accuracies: 0.792  0.521
Epoch 74, CIFAR-10 Batch 1:  Losses:     0.7      1.3	Accuracies: 0.804  0.526
Epoch 75, CIFAR-10 Batch 1:  Losses:     0.7      1.3	Accuracies: 0.842  0.536
Epoch 76, CIFAR-10 Batch 1:  Losses:     0.7      1.3	Accuracies: 0.821  0.533
Epoch 77, CIFAR-10 Batch 1:  Losses:     0.7      1.3	Accuracies: 0.850  0.541
Epoch 78, CIFAR-10 Batch 1:  Losses:     0.7      1.3	Accuracies: 0.850  0.540
Epoch 79, CIFAR-10 Batch 1:  Losses:     0.6      1.3	Accuracies: 0.854  0.534
Epoch 80, CIFAR-10 Batch 1:  Losses:     0.6      1.3	Accuracies: 0.858  0.540
Epoch 81, CIFAR-10 Batch 1:  Losses:     0.6      1.3	Accuracies: 0.861  0.538
Epoch 82, CIFAR-10 Batch 1:  Losses:     0.6      1.3	Accuracies: 0.868  0.543
Epoch 83, CIFAR-10 Batch 1:  Losses:     0.6      1.3	Accuracies: 0.853  0.538
Epoch 84, CIFAR-10 Batch 1:  Losses:     0.6      1.3	Accuracies: 0.871  0.541
Epoch 85, CIFAR-10 Batch 1:  Losses:     0.6      1.3	Accuracies: 0.876  0.544
Epoch 86, CIFAR-10 Batch 1:  Losses:     0.6      1.3	Accuracies: 0.874  0.543
Epoch 87, CIFAR-10 Batch 1:  Losses:     0.6      1.3	Accuracies: 0.892  0.546
Epoch 88, CIFAR-10 Batch 1:  Losses:     0.6      1.3	Accuracies: 0.877  0.540
Epoch 89, CIFAR-10 Batch 1:  Losses:     0.5      1.3	Accuracies: 0.886  0.544
Epoch 90, CIFAR-10 Batch 1:  Losses:     0.5      1.3	Accuracies: 0.896  0.547
Epoch 91, CIFAR-10 Batch 1:  Losses:     0.5      1.3	Accuracies: 0.897  0.550
Epoch 92, CIFAR-10 Batch 1:  Losses:     0.5      1.3	Accuracies: 0.905  0.547
Epoch 93, CIFAR-10 Batch 1:  Losses:     0.5      1.3	Accuracies: 0.903  0.549
Epoch 94, CIFAR-10 Batch 1:  Losses:     0.5      1.3	Accuracies: 0.911  0.545
Epoch 95, CIFAR-10 Batch 1:  Losses:     0.5      1.3	Accuracies: 0.911  0.545
Epoch 96, CIFAR-10 Batch 1:  Losses:     0.5      1.3	Accuracies: 0.916  0.550
Epoch 97, CIFAR-10 Batch 1:  Losses:     0.5      1.3	Accuracies: 0.911  0.549
Epoch 98, CIFAR-10 Batch 1:  Losses:     0.5      1.3	Accuracies: 0.910  0.548
Epoch 99, CIFAR-10 Batch 1:  Losses:     0.5      1.3	Accuracies: 0.908  0.545
Epoch 100, CIFAR-10 Batch 1:  Losses:     0.5      1.3	Accuracies: 0.921  0.547

Fully Train the Model

Now that you've gotten a good accuracy with a single CIFAR-10 batch, try it with all five batches.


In [129]:
save_model_path = './image_classification'

print('Training...')
with tf.Session() as sess:
    # Initializing the variables
    sess.run(tf.global_variables_initializer())
    
    # Training cycle
    for epoch in range(epochs):
        # Loop over all batches
        n_batches = 5
        for batch_i in range(1, n_batches + 1):
            for batch_features, batch_labels in helper.load_preprocess_training_batch(batch_i, batch_size):
                train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)
            print('Epoch {:>2}, CIFAR-10 Batch {}:  '.format(epoch + 1, batch_i), end='')
            print_stats(sess, batch_features, batch_labels, cost, accuracy)
            
    # Save Model
    saver = tf.train.Saver()
    save_path = saver.save(sess, save_model_path)


Training...
Epoch  1, CIFAR-10 Batch 1:  Losses:     2.3      2.3	Accuracies: 0.132  0.139
Epoch  1, CIFAR-10 Batch 2:  Losses:     2.3      2.3	Accuracies: 0.198  0.188
Epoch  1, CIFAR-10 Batch 3:  Losses:     2.2      2.2	Accuracies: 0.262  0.233
Epoch  1, CIFAR-10 Batch 4:  Losses:     2.2      2.2	Accuracies: 0.290  0.275
Epoch  1, CIFAR-10 Batch 5:  Losses:     2.1      2.1	Accuracies: 0.304  0.271
Epoch  2, CIFAR-10 Batch 1:  Losses:     2.1      2.0	Accuracies: 0.290  0.285
Epoch  2, CIFAR-10 Batch 2:  Losses:     2.0      2.0	Accuracies: 0.292  0.281
Epoch  2, CIFAR-10 Batch 3:  Losses:     1.9      2.0	Accuracies: 0.334  0.305
Epoch  2, CIFAR-10 Batch 4:  Losses:     1.9      1.9	Accuracies: 0.342  0.321
Epoch  2, CIFAR-10 Batch 5:  Losses:     1.9      1.9	Accuracies: 0.342  0.327
Epoch  3, CIFAR-10 Batch 1:  Losses:     1.9      1.9	Accuracies: 0.361  0.330
Epoch  3, CIFAR-10 Batch 2:  Losses:     1.8      1.9	Accuracies: 0.370  0.359
Epoch  3, CIFAR-10 Batch 3:  Losses:     1.8      1.8	Accuracies: 0.375  0.366
Epoch  3, CIFAR-10 Batch 4:  Losses:     1.8      1.8	Accuracies: 0.412  0.367
Epoch  3, CIFAR-10 Batch 5:  Losses:     1.8      1.8	Accuracies: 0.366  0.389
Epoch  4, CIFAR-10 Batch 1:  Losses:     1.7      1.7	Accuracies: 0.399  0.394
Epoch  4, CIFAR-10 Batch 2:  Losses:     1.7      1.7	Accuracies: 0.413  0.403
Epoch  4, CIFAR-10 Batch 3:  Losses:     1.7      1.7	Accuracies: 0.417  0.402
Epoch  4, CIFAR-10 Batch 4:  Losses:     1.7      1.7	Accuracies: 0.442  0.409
Epoch  4, CIFAR-10 Batch 5:  Losses:     1.7      1.7	Accuracies: 0.420  0.423
Epoch  5, CIFAR-10 Batch 1:  Losses:     1.6      1.6	Accuracies: 0.444  0.425
Epoch  5, CIFAR-10 Batch 2:  Losses:     1.6      1.6	Accuracies: 0.429  0.425
Epoch  5, CIFAR-10 Batch 3:  Losses:     1.6      1.6	Accuracies: 0.439  0.434
Epoch  5, CIFAR-10 Batch 4:  Losses:     1.6      1.6	Accuracies: 0.458  0.432
Epoch  5, CIFAR-10 Batch 5:  Losses:     1.6      1.6	Accuracies: 0.452  0.444
Epoch  6, CIFAR-10 Batch 1:  Losses:     1.6      1.6	Accuracies: 0.459  0.443
Epoch  6, CIFAR-10 Batch 2:  Losses:     1.5      1.6	Accuracies: 0.460  0.449
Epoch  6, CIFAR-10 Batch 3:  Losses:     1.5      1.6	Accuracies: 0.442  0.442
Epoch  6, CIFAR-10 Batch 4:  Losses:     1.5      1.6	Accuracies: 0.479  0.448
Epoch  6, CIFAR-10 Batch 5:  Losses:     1.5      1.5	Accuracies: 0.474  0.465
Epoch  7, CIFAR-10 Batch 1:  Losses:     1.5      1.5	Accuracies: 0.481  0.459
Epoch  7, CIFAR-10 Batch 2:  Losses:     1.5      1.5	Accuracies: 0.475  0.465
Epoch  7, CIFAR-10 Batch 3:  Losses:     1.5      1.5	Accuracies: 0.458  0.455
Epoch  7, CIFAR-10 Batch 4:  Losses:     1.5      1.5	Accuracies: 0.488  0.455
Epoch  7, CIFAR-10 Batch 5:  Losses:     1.5      1.5	Accuracies: 0.490  0.470
Epoch  8, CIFAR-10 Batch 1:  Losses:     1.5      1.5	Accuracies: 0.504  0.463
Epoch  8, CIFAR-10 Batch 2:  Losses:     1.5      1.5	Accuracies: 0.483  0.472
Epoch  8, CIFAR-10 Batch 3:  Losses:     1.4      1.5	Accuracies: 0.478  0.464
Epoch  8, CIFAR-10 Batch 4:  Losses:     1.4      1.5	Accuracies: 0.505  0.470
Epoch  8, CIFAR-10 Batch 5:  Losses:     1.4      1.5	Accuracies: 0.509  0.474
Epoch  9, CIFAR-10 Batch 1:  Losses:     1.4      1.5	Accuracies: 0.521  0.475
Epoch  9, CIFAR-10 Batch 2:  Losses:     1.4      1.5	Accuracies: 0.509  0.481
Epoch  9, CIFAR-10 Batch 3:  Losses:     1.4      1.5	Accuracies: 0.489  0.480
Epoch  9, CIFAR-10 Batch 4:  Losses:     1.4      1.5	Accuracies: 0.522  0.481
Epoch  9, CIFAR-10 Batch 5:  Losses:     1.4      1.5	Accuracies: 0.516  0.487
Epoch 10, CIFAR-10 Batch 1:  Losses:     1.4      1.4	Accuracies: 0.530  0.488
Epoch 10, CIFAR-10 Batch 2:  Losses:     1.4      1.4	Accuracies: 0.525  0.487
Epoch 10, CIFAR-10 Batch 3:  Losses:     1.4      1.4	Accuracies: 0.521  0.484
Epoch 10, CIFAR-10 Batch 4:  Losses:     1.4      1.4	Accuracies: 0.526  0.485
Epoch 10, CIFAR-10 Batch 5:  Losses:     1.4      1.4	Accuracies: 0.527  0.491
Epoch 11, CIFAR-10 Batch 1:  Losses:     1.4      1.4	Accuracies: 0.543  0.497
Epoch 11, CIFAR-10 Batch 2:  Losses:     1.3      1.4	Accuracies: 0.548  0.499
Epoch 11, CIFAR-10 Batch 3:  Losses:     1.3      1.4	Accuracies: 0.515  0.493
Epoch 11, CIFAR-10 Batch 4:  Losses:     1.3      1.4	Accuracies: 0.550  0.494
Epoch 11, CIFAR-10 Batch 5:  Losses:     1.3      1.4	Accuracies: 0.552  0.505
Epoch 12, CIFAR-10 Batch 1:  Losses:     1.3      1.4	Accuracies: 0.550  0.508
Epoch 12, CIFAR-10 Batch 2:  Losses:     1.3      1.4	Accuracies: 0.558  0.506
Epoch 12, CIFAR-10 Batch 3:  Losses:     1.3      1.4	Accuracies: 0.528  0.505
Epoch 12, CIFAR-10 Batch 4:  Losses:     1.3      1.4	Accuracies: 0.546  0.503
Epoch 12, CIFAR-10 Batch 5:  Losses:     1.3      1.4	Accuracies: 0.566  0.509
Epoch 13, CIFAR-10 Batch 1:  Losses:     1.3      1.4	Accuracies: 0.546  0.513
Epoch 13, CIFAR-10 Batch 2:  Losses:     1.3      1.4	Accuracies: 0.577  0.515
Epoch 13, CIFAR-10 Batch 3:  Losses:     1.3      1.4	Accuracies: 0.553  0.512
Epoch 13, CIFAR-10 Batch 4:  Losses:     1.3      1.4	Accuracies: 0.550  0.509
Epoch 13, CIFAR-10 Batch 5:  Losses:     1.3      1.4	Accuracies: 0.566  0.520
Epoch 14, CIFAR-10 Batch 1:  Losses:     1.3      1.4	Accuracies: 0.564  0.519
Epoch 14, CIFAR-10 Batch 2:  Losses:     1.3      1.4	Accuracies: 0.582  0.523
Epoch 14, CIFAR-10 Batch 3:  Losses:     1.2      1.3	Accuracies: 0.566  0.523
Epoch 14, CIFAR-10 Batch 4:  Losses:     1.2      1.3	Accuracies: 0.559  0.519
Epoch 14, CIFAR-10 Batch 5:  Losses:     1.2      1.3	Accuracies: 0.584  0.524
Epoch 15, CIFAR-10 Batch 1:  Losses:     1.2      1.3	Accuracies: 0.579  0.521
Epoch 15, CIFAR-10 Batch 2:  Losses:     1.2      1.3	Accuracies: 0.592  0.532
Epoch 15, CIFAR-10 Batch 3:  Losses:     1.2      1.3	Accuracies: 0.563  0.531
Epoch 15, CIFAR-10 Batch 4:  Losses:     1.2      1.3	Accuracies: 0.566  0.528
Epoch 15, CIFAR-10 Batch 5:  Losses:     1.2      1.3	Accuracies: 0.594  0.533
Epoch 16, CIFAR-10 Batch 1:  Losses:     1.2      1.3	Accuracies: 0.592  0.535
Epoch 16, CIFAR-10 Batch 2:  Losses:     1.2      1.3	Accuracies: 0.606  0.532
Epoch 16, CIFAR-10 Batch 3:  Losses:     1.2      1.3	Accuracies: 0.571  0.535
Epoch 16, CIFAR-10 Batch 4:  Losses:     1.2      1.3	Accuracies: 0.572  0.534
Epoch 16, CIFAR-10 Batch 5:  Losses:     1.2      1.3	Accuracies: 0.604  0.539
Epoch 17, CIFAR-10 Batch 1:  Losses:     1.2      1.3	Accuracies: 0.603  0.539
Epoch 17, CIFAR-10 Batch 2:  Losses:     1.2      1.3	Accuracies: 0.604  0.539
Epoch 17, CIFAR-10 Batch 3:  Losses:     1.2      1.3	Accuracies: 0.584  0.541
Epoch 17, CIFAR-10 Batch 4:  Losses:     1.2      1.3	Accuracies: 0.580  0.542
Epoch 17, CIFAR-10 Batch 5:  Losses:     1.2      1.3	Accuracies: 0.624  0.546
Epoch 18, CIFAR-10 Batch 1:  Losses:     1.2      1.3	Accuracies: 0.605  0.545
Epoch 18, CIFAR-10 Batch 2:  Losses:     1.2      1.3	Accuracies: 0.618  0.545
Epoch 18, CIFAR-10 Batch 3:  Losses:     1.1      1.3	Accuracies: 0.594  0.546
Epoch 18, CIFAR-10 Batch 4:  Losses:     1.1      1.3	Accuracies: 0.600  0.546
Epoch 18, CIFAR-10 Batch 5:  Losses:     1.2      1.3	Accuracies: 0.621  0.548
Epoch 19, CIFAR-10 Batch 1:  Losses:     1.2      1.3	Accuracies: 0.604  0.549
Epoch 19, CIFAR-10 Batch 2:  Losses:     1.2      1.3	Accuracies: 0.616  0.546
Epoch 19, CIFAR-10 Batch 3:  Losses:     1.1      1.3	Accuracies: 0.603  0.548
Epoch 19, CIFAR-10 Batch 4:  Losses:     1.1      1.3	Accuracies: 0.604  0.547
Epoch 19, CIFAR-10 Batch 5:  Losses:     1.1      1.3	Accuracies: 0.644  0.553
Epoch 20, CIFAR-10 Batch 1:  Losses:     1.1      1.3	Accuracies: 0.623  0.540
Epoch 20, CIFAR-10 Batch 2:  Losses:     1.1      1.3	Accuracies: 0.632  0.553
Epoch 20, CIFAR-10 Batch 3:  Losses:     1.1      1.3	Accuracies: 0.611  0.551
Epoch 20, CIFAR-10 Batch 4:  Losses:     1.1      1.3	Accuracies: 0.618  0.548
Epoch 20, CIFAR-10 Batch 5:  Losses:     1.1      1.3	Accuracies: 0.653  0.562
Epoch 21, CIFAR-10 Batch 1:  Losses:     1.1      1.2	Accuracies: 0.637  0.556
Epoch 21, CIFAR-10 Batch 2:  Losses:     1.1      1.2	Accuracies: 0.651  0.561
Epoch 21, CIFAR-10 Batch 3:  Losses:     1.1      1.2	Accuracies: 0.624  0.556
Epoch 21, CIFAR-10 Batch 4:  Losses:     1.1      1.2	Accuracies: 0.621  0.556
Epoch 21, CIFAR-10 Batch 5:  Losses:     1.1      1.2	Accuracies: 0.657  0.565
Epoch 22, CIFAR-10 Batch 1:  Losses:     1.1      1.2	Accuracies: 0.655  0.558
Epoch 22, CIFAR-10 Batch 2:  Losses:     1.1      1.2	Accuracies: 0.658  0.565
Epoch 22, CIFAR-10 Batch 3:  Losses:     1.0      1.2	Accuracies: 0.631  0.562
Epoch 22, CIFAR-10 Batch 4:  Losses:     1.0      1.2	Accuracies: 0.631  0.566
Epoch 22, CIFAR-10 Batch 5:  Losses:     1.1      1.2	Accuracies: 0.673  0.565
Epoch 23, CIFAR-10 Batch 1:  Losses:     1.1      1.2	Accuracies: 0.672  0.564
Epoch 23, CIFAR-10 Batch 2:  Losses:     1.1      1.2	Accuracies: 0.662  0.568
Epoch 23, CIFAR-10 Batch 3:  Losses:     1.0      1.2	Accuracies: 0.637  0.568
Epoch 23, CIFAR-10 Batch 4:  Losses:     1.0      1.2	Accuracies: 0.647  0.568
Epoch 23, CIFAR-10 Batch 5:  Losses:     1.0      1.2	Accuracies: 0.678  0.568
Epoch 24, CIFAR-10 Batch 1:  Losses:     1.0      1.2	Accuracies: 0.675  0.568
Epoch 24, CIFAR-10 Batch 2:  Losses:     1.0      1.2	Accuracies: 0.663  0.569
Epoch 24, CIFAR-10 Batch 3:  Losses:     1.0      1.2	Accuracies: 0.658  0.572
Epoch 24, CIFAR-10 Batch 4:  Losses:     1.0      1.2	Accuracies: 0.649  0.565
Epoch 24, CIFAR-10 Batch 5:  Losses:     1.0      1.2	Accuracies: 0.684  0.575
Epoch 25, CIFAR-10 Batch 1:  Losses:     1.0      1.2	Accuracies: 0.667  0.569
Epoch 25, CIFAR-10 Batch 2:  Losses:     1.0      1.2	Accuracies: 0.670  0.574
Epoch 25, CIFAR-10 Batch 3:  Losses:     1.0      1.2	Accuracies: 0.658  0.572
Epoch 25, CIFAR-10 Batch 4:  Losses:     1.0      1.2	Accuracies: 0.663  0.571
Epoch 25, CIFAR-10 Batch 5:  Losses:     1.0      1.2	Accuracies: 0.697  0.580
Epoch 26, CIFAR-10 Batch 1:  Losses:     1.0      1.2	Accuracies: 0.697  0.572
Epoch 26, CIFAR-10 Batch 2:  Losses:     1.0      1.2	Accuracies: 0.668  0.580
Epoch 26, CIFAR-10 Batch 3:  Losses:     1.0      1.2	Accuracies: 0.667  0.575
Epoch 26, CIFAR-10 Batch 4:  Losses:     1.0      1.2	Accuracies: 0.672  0.581
Epoch 26, CIFAR-10 Batch 5:  Losses:     1.0      1.2	Accuracies: 0.693  0.578
Epoch 27, CIFAR-10 Batch 1:  Losses:     1.0      1.2	Accuracies: 0.697  0.575
Epoch 27, CIFAR-10 Batch 2:  Losses:     1.0      1.2	Accuracies: 0.691  0.580
Epoch 27, CIFAR-10 Batch 3:  Losses:     1.0      1.2	Accuracies: 0.678  0.580
Epoch 27, CIFAR-10 Batch 4:  Losses:     0.9      1.2	Accuracies: 0.682  0.583
Epoch 27, CIFAR-10 Batch 5:  Losses:     0.9      1.2	Accuracies: 0.704  0.588
Epoch 28, CIFAR-10 Batch 1:  Losses:     1.0      1.2	Accuracies: 0.676  0.576
Epoch 28, CIFAR-10 Batch 2:  Losses:     1.0      1.2	Accuracies: 0.703  0.586
Epoch 28, CIFAR-10 Batch 3:  Losses:     0.9      1.2	Accuracies: 0.686  0.584
Epoch 28, CIFAR-10 Batch 4:  Losses:     0.9      1.2	Accuracies: 0.696  0.580
Epoch 28, CIFAR-10 Batch 5:  Losses:     0.9      1.2	Accuracies: 0.720  0.590
Epoch 29, CIFAR-10 Batch 1:  Losses:     0.9      1.2	Accuracies: 0.717  0.586
Epoch 29, CIFAR-10 Batch 2:  Losses:     0.9      1.2	Accuracies: 0.703  0.586
Epoch 29, CIFAR-10 Batch 3:  Losses:     0.9      1.2	Accuracies: 0.697  0.593
Epoch 29, CIFAR-10 Batch 4:  Losses:     0.9      1.2	Accuracies: 0.693  0.579
Epoch 29, CIFAR-10 Batch 5:  Losses:     0.9      1.2	Accuracies: 0.723  0.595
Epoch 30, CIFAR-10 Batch 1:  Losses:     0.9      1.2	Accuracies: 0.719  0.589
Epoch 30, CIFAR-10 Batch 2:  Losses:     0.9      1.2	Accuracies: 0.698  0.590
Epoch 30, CIFAR-10 Batch 3:  Losses:     0.9      1.2	Accuracies: 0.704  0.591
Epoch 30, CIFAR-10 Batch 4:  Losses:     0.9      1.2	Accuracies: 0.707  0.585
Epoch 30, CIFAR-10 Batch 5:  Losses:     0.9      1.1	Accuracies: 0.734  0.597
Epoch 31, CIFAR-10 Batch 1:  Losses:     0.9      1.1	Accuracies: 0.722  0.594
Epoch 31, CIFAR-10 Batch 2:  Losses:     0.9      1.1	Accuracies: 0.713  0.595
Epoch 31, CIFAR-10 Batch 3:  Losses:     0.9      1.1	Accuracies: 0.709  0.594
Epoch 31, CIFAR-10 Batch 4:  Losses:     0.9      1.1	Accuracies: 0.725  0.589
Epoch 31, CIFAR-10 Batch 5:  Losses:     0.9      1.1	Accuracies: 0.740  0.601
Epoch 32, CIFAR-10 Batch 1:  Losses:     0.9      1.1	Accuracies: 0.733  0.596
Epoch 32, CIFAR-10 Batch 2:  Losses:     0.9      1.1	Accuracies: 0.729  0.598
Epoch 32, CIFAR-10 Batch 3:  Losses:     0.9      1.1	Accuracies: 0.719  0.599
Epoch 32, CIFAR-10 Batch 4:  Losses:     0.9      1.1	Accuracies: 0.733  0.586
Epoch 32, CIFAR-10 Batch 5:  Losses:     0.9      1.1	Accuracies: 0.746  0.603
Epoch 33, CIFAR-10 Batch 1:  Losses:     0.9      1.1	Accuracies: 0.744  0.600
Epoch 33, CIFAR-10 Batch 2:  Losses:     0.9      1.1	Accuracies: 0.726  0.595
Epoch 33, CIFAR-10 Batch 3:  Losses:     0.8      1.1	Accuracies: 0.728  0.597
Epoch 33, CIFAR-10 Batch 4:  Losses:     0.8      1.1	Accuracies: 0.741  0.587
Epoch 33, CIFAR-10 Batch 5:  Losses:     0.8      1.1	Accuracies: 0.754  0.606
Epoch 34, CIFAR-10 Batch 1:  Losses:     0.9      1.1	Accuracies: 0.738  0.604
Epoch 34, CIFAR-10 Batch 2:  Losses:     0.8      1.1	Accuracies: 0.726  0.599
Epoch 34, CIFAR-10 Batch 3:  Losses:     0.8      1.1	Accuracies: 0.741  0.601
Epoch 34, CIFAR-10 Batch 4:  Losses:     0.8      1.1	Accuracies: 0.739  0.600
Epoch 34, CIFAR-10 Batch 5:  Losses:     0.8      1.1	Accuracies: 0.761  0.611
Epoch 35, CIFAR-10 Batch 1:  Losses:     0.8      1.1	Accuracies: 0.760  0.605
Epoch 35, CIFAR-10 Batch 2:  Losses:     0.8      1.1	Accuracies: 0.754  0.605
Epoch 35, CIFAR-10 Batch 3:  Losses:     0.8      1.1	Accuracies: 0.745  0.602
Epoch 35, CIFAR-10 Batch 4:  Losses:     0.8      1.1	Accuracies: 0.760  0.603
Epoch 35, CIFAR-10 Batch 5:  Losses:     0.8      1.1	Accuracies: 0.766  0.613
Epoch 36, CIFAR-10 Batch 1:  Losses:     0.8      1.1	Accuracies: 0.761  0.608
Epoch 36, CIFAR-10 Batch 2:  Losses:     0.8      1.1	Accuracies: 0.762  0.605
Epoch 36, CIFAR-10 Batch 3:  Losses:     0.8      1.1	Accuracies: 0.751  0.602
Epoch 36, CIFAR-10 Batch 4:  Losses:     0.8      1.1	Accuracies: 0.774  0.607
Epoch 36, CIFAR-10 Batch 5:  Losses:     0.8      1.1	Accuracies: 0.782  0.614
Epoch 37, CIFAR-10 Batch 1:  Losses:     0.8      1.1	Accuracies: 0.762  0.614
Epoch 37, CIFAR-10 Batch 2:  Losses:     0.8      1.1	Accuracies: 0.757  0.609
Epoch 37, CIFAR-10 Batch 3:  Losses:     0.8      1.1	Accuracies: 0.761  0.606
Epoch 37, CIFAR-10 Batch 4:  Losses:     0.8      1.1	Accuracies: 0.776  0.607
Epoch 37, CIFAR-10 Batch 5:  Losses:     0.8      1.1	Accuracies: 0.780  0.616
Epoch 38, CIFAR-10 Batch 1:  Losses:     0.8      1.1	Accuracies: 0.771  0.615
Epoch 38, CIFAR-10 Batch 2:  Losses:     0.8      1.1	Accuracies: 0.760  0.611
Epoch 38, CIFAR-10 Batch 3:  Losses:     0.7      1.1	Accuracies: 0.778  0.613
Epoch 38, CIFAR-10 Batch 4:  Losses:     0.7      1.1	Accuracies: 0.775  0.612
Epoch 38, CIFAR-10 Batch 5:  Losses:     0.7      1.1	Accuracies: 0.795  0.622
Epoch 39, CIFAR-10 Batch 1:  Losses:     0.8      1.1	Accuracies: 0.771  0.614
Epoch 39, CIFAR-10 Batch 2:  Losses:     0.8      1.1	Accuracies: 0.766  0.613
Epoch 39, CIFAR-10 Batch 3:  Losses:     0.7      1.1	Accuracies: 0.771  0.612
Epoch 39, CIFAR-10 Batch 4:  Losses:     0.7      1.1	Accuracies: 0.792  0.616
Epoch 39, CIFAR-10 Batch 5:  Losses:     0.7      1.1	Accuracies: 0.791  0.619
Epoch 40, CIFAR-10 Batch 1:  Losses:     0.7      1.1	Accuracies: 0.790  0.620
Epoch 40, CIFAR-10 Batch 2:  Losses:     0.7      1.1	Accuracies: 0.778  0.617
Epoch 40, CIFAR-10 Batch 3:  Losses:     0.7      1.1	Accuracies: 0.762  0.611
Epoch 40, CIFAR-10 Batch 4:  Losses:     0.7      1.1	Accuracies: 0.790  0.613
Epoch 40, CIFAR-10 Batch 5:  Losses:     0.7      1.1	Accuracies: 0.800  0.619
Epoch 41, CIFAR-10 Batch 1:  Losses:     0.7      1.1	Accuracies: 0.783  0.614
Epoch 41, CIFAR-10 Batch 2:  Losses:     0.7      1.1	Accuracies: 0.781  0.615
Epoch 41, CIFAR-10 Batch 3:  Losses:     0.7      1.1	Accuracies: 0.782  0.614
Epoch 41, CIFAR-10 Batch 4:  Losses:     0.7      1.1	Accuracies: 0.795  0.618
Epoch 41, CIFAR-10 Batch 5:  Losses:     0.7      1.1	Accuracies: 0.809  0.630
Epoch 42, CIFAR-10 Batch 1:  Losses:     0.7      1.1	Accuracies: 0.793  0.618
Epoch 42, CIFAR-10 Batch 2:  Losses:     0.7      1.1	Accuracies: 0.792  0.618
Epoch 42, CIFAR-10 Batch 3:  Losses:     0.7      1.1	Accuracies: 0.806  0.620
Epoch 42, CIFAR-10 Batch 4:  Losses:     0.7      1.1	Accuracies: 0.803  0.626
Epoch 42, CIFAR-10 Batch 5:  Losses:     0.7      1.1	Accuracies: 0.808  0.624
Epoch 43, CIFAR-10 Batch 1:  Losses:     0.7      1.1	Accuracies: 0.797  0.619
Epoch 43, CIFAR-10 Batch 2:  Losses:     0.7      1.1	Accuracies: 0.796  0.612
Epoch 43, CIFAR-10 Batch 3:  Losses:     0.7      1.1	Accuracies: 0.808  0.621
Epoch 43, CIFAR-10 Batch 4:  Losses:     0.7      1.1	Accuracies: 0.819  0.617
Epoch 43, CIFAR-10 Batch 5:  Losses:     0.7      1.1	Accuracies: 0.823  0.625
Epoch 44, CIFAR-10 Batch 1:  Losses:     0.7      1.1	Accuracies: 0.802  0.619
Epoch 44, CIFAR-10 Batch 2:  Losses:     0.7      1.1	Accuracies: 0.811  0.620
Epoch 44, CIFAR-10 Batch 3:  Losses:     0.7      1.1	Accuracies: 0.816  0.624
Epoch 44, CIFAR-10 Batch 4:  Losses:     0.7      1.1	Accuracies: 0.816  0.620
Epoch 44, CIFAR-10 Batch 5:  Losses:     0.6      1.1	Accuracies: 0.825  0.633
Epoch 45, CIFAR-10 Batch 1:  Losses:     0.7      1.1	Accuracies: 0.811  0.621
Epoch 45, CIFAR-10 Batch 2:  Losses:     0.7      1.1	Accuracies: 0.800  0.622
Epoch 45, CIFAR-10 Batch 3:  Losses:     0.6      1.1	Accuracies: 0.819  0.631
Epoch 45, CIFAR-10 Batch 4:  Losses:     0.6      1.1	Accuracies: 0.822  0.624
Epoch 45, CIFAR-10 Batch 5:  Losses:     0.6      1.1	Accuracies: 0.830  0.632
Epoch 46, CIFAR-10 Batch 1:  Losses:     0.7      1.1	Accuracies: 0.819  0.622
Epoch 46, CIFAR-10 Batch 2:  Losses:     0.7      1.1	Accuracies: 0.809  0.624
Epoch 46, CIFAR-10 Batch 3:  Losses:     0.6      1.1	Accuracies: 0.827  0.629
Epoch 46, CIFAR-10 Batch 4:  Losses:     0.6      1.1	Accuracies: 0.825  0.623
Epoch 46, CIFAR-10 Batch 5:  Losses:     0.6      1.1	Accuracies: 0.828  0.628
Epoch 47, CIFAR-10 Batch 1:  Losses:     0.6      1.1	Accuracies: 0.812  0.625
Epoch 47, CIFAR-10 Batch 2:  Losses:     0.6      1.1	Accuracies: 0.822  0.630
Epoch 47, CIFAR-10 Batch 3:  Losses:     0.6      1.1	Accuracies: 0.832  0.634
Epoch 47, CIFAR-10 Batch 4:  Losses:     0.6      1.1	Accuracies: 0.829  0.624
Epoch 47, CIFAR-10 Batch 5:  Losses:     0.6      1.1	Accuracies: 0.838  0.631
Epoch 48, CIFAR-10 Batch 1:  Losses:     0.6      1.0	Accuracies: 0.828  0.633
Epoch 48, CIFAR-10 Batch 2:  Losses:     0.6      1.0	Accuracies: 0.823  0.631
Epoch 48, CIFAR-10 Batch 3:  Losses:     0.6      1.0	Accuracies: 0.840  0.635
Epoch 48, CIFAR-10 Batch 4:  Losses:     0.6      1.1	Accuracies: 0.832  0.624
Epoch 48, CIFAR-10 Batch 5:  Losses:     0.6      1.1	Accuracies: 0.839  0.632
Epoch 49, CIFAR-10 Batch 1:  Losses:     0.6      1.0	Accuracies: 0.835  0.632
Epoch 49, CIFAR-10 Batch 2:  Losses:     0.6      1.0	Accuracies: 0.845  0.636
Epoch 49, CIFAR-10 Batch 3:  Losses:     0.6      1.0	Accuracies: 0.845  0.635
Epoch 49, CIFAR-10 Batch 4:  Losses:     0.6      1.0	Accuracies: 0.835  0.633
Epoch 49, CIFAR-10 Batch 5:  Losses:     0.6      1.1	Accuracies: 0.851  0.630
Epoch 50, CIFAR-10 Batch 1:  Losses:     0.6      1.0	Accuracies: 0.843  0.635
Epoch 50, CIFAR-10 Batch 2:  Losses:     0.6      1.0	Accuracies: 0.838  0.641
Epoch 50, CIFAR-10 Batch 3:  Losses:     0.6      1.0	Accuracies: 0.845  0.635
Epoch 50, CIFAR-10 Batch 4:  Losses:     0.6      1.0	Accuracies: 0.844  0.625
Epoch 50, CIFAR-10 Batch 5:  Losses:     0.6      1.0	Accuracies: 0.849  0.633
Epoch 51, CIFAR-10 Batch 1:  Losses:     0.6      1.0	Accuracies: 0.848  0.632
Epoch 51, CIFAR-10 Batch 2:  Losses:     0.6      1.0	Accuracies: 0.840  0.639
Epoch 51, CIFAR-10 Batch 3:  Losses:     0.5      1.0	Accuracies: 0.866  0.638
Epoch 51, CIFAR-10 Batch 4:  Losses:     0.6      1.0	Accuracies: 0.859  0.631
Epoch 51, CIFAR-10 Batch 5:  Losses:     0.6      1.0	Accuracies: 0.864  0.633
Epoch 52, CIFAR-10 Batch 1:  Losses:     0.6      1.0	Accuracies: 0.858  0.636
Epoch 52, CIFAR-10 Batch 2:  Losses:     0.6      1.0	Accuracies: 0.842  0.637
Epoch 52, CIFAR-10 Batch 3:  Losses:     0.5      1.0	Accuracies: 0.860  0.636
Epoch 52, CIFAR-10 Batch 4:  Losses:     0.5      1.1	Accuracies: 0.855  0.629
Epoch 52, CIFAR-10 Batch 5:  Losses:     0.5      1.0	Accuracies: 0.865  0.636
Epoch 53, CIFAR-10 Batch 1:  Losses:     0.6      1.0	Accuracies: 0.858  0.633
Epoch 53, CIFAR-10 Batch 2:  Losses:     0.5      1.0	Accuracies: 0.849  0.640
Epoch 53, CIFAR-10 Batch 3:  Losses:     0.5      1.0	Accuracies: 0.870  0.639
Epoch 53, CIFAR-10 Batch 4:  Losses:     0.5      1.0	Accuracies: 0.865  0.639
Epoch 53, CIFAR-10 Batch 5:  Losses:     0.5      1.0	Accuracies: 0.870  0.641
Epoch 54, CIFAR-10 Batch 1:  Losses:     0.5      1.0	Accuracies: 0.865  0.632
Epoch 54, CIFAR-10 Batch 2:  Losses:     0.5      1.0	Accuracies: 0.866  0.639
Epoch 54, CIFAR-10 Batch 3:  Losses:     0.5      1.0	Accuracies: 0.871  0.635
Epoch 54, CIFAR-10 Batch 4:  Losses:     0.5      1.0	Accuracies: 0.871  0.640
Epoch 54, CIFAR-10 Batch 5:  Losses:     0.5      1.0	Accuracies: 0.879  0.639
Epoch 55, CIFAR-10 Batch 1:  Losses:     0.5      1.0	Accuracies: 0.869  0.630
Epoch 55, CIFAR-10 Batch 2:  Losses:     0.5      1.0	Accuracies: 0.870  0.644
Epoch 55, CIFAR-10 Batch 3:  Losses:     0.5      1.0	Accuracies: 0.873  0.635
Epoch 55, CIFAR-10 Batch 4:  Losses:     0.5      1.0	Accuracies: 0.880  0.636
Epoch 55, CIFAR-10 Batch 5:  Losses:     0.5      1.0	Accuracies: 0.875  0.644
Epoch 56, CIFAR-10 Batch 1:  Losses:     0.5      1.0	Accuracies: 0.868  0.636
Epoch 56, CIFAR-10 Batch 2:  Losses:     0.5      1.0	Accuracies: 0.861  0.642
Epoch 56, CIFAR-10 Batch 3:  Losses:     0.5      1.0	Accuracies: 0.875  0.637
Epoch 56, CIFAR-10 Batch 4:  Losses:     0.5      1.0	Accuracies: 0.869  0.633
Epoch 56, CIFAR-10 Batch 5:  Losses:     0.5      1.0	Accuracies: 0.880  0.647
Epoch 57, CIFAR-10 Batch 1:  Losses:     0.5      1.0	Accuracies: 0.875  0.641
Epoch 57, CIFAR-10 Batch 2:  Losses:     0.5      1.0	Accuracies: 0.873  0.643
Epoch 57, CIFAR-10 Batch 3:  Losses:     0.5      1.0	Accuracies: 0.881  0.642
Epoch 57, CIFAR-10 Batch 4:  Losses:     0.5      1.0	Accuracies: 0.881  0.639
Epoch 57, CIFAR-10 Batch 5:  Losses:     0.5      1.0	Accuracies: 0.884  0.639
Epoch 58, CIFAR-10 Batch 1:  Losses:     0.5      1.0	Accuracies: 0.880  0.643
Epoch 58, CIFAR-10 Batch 2:  Losses:     0.5      1.0	Accuracies: 0.879  0.649
Epoch 58, CIFAR-10 Batch 3:  Losses:     0.5      1.0	Accuracies: 0.889  0.648
Epoch 58, CIFAR-10 Batch 4:  Losses:     0.5      1.0	Accuracies: 0.892  0.645
Epoch 58, CIFAR-10 Batch 5:  Losses:     0.5      1.0	Accuracies: 0.890  0.648
Epoch 59, CIFAR-10 Batch 1:  Losses:     0.5      1.0	Accuracies: 0.885  0.642
Epoch 59, CIFAR-10 Batch 2:  Losses:     0.5      1.0	Accuracies: 0.889  0.641
Epoch 59, CIFAR-10 Batch 3:  Losses:     0.5      1.0	Accuracies: 0.890  0.646
Epoch 59, CIFAR-10 Batch 4:  Losses:     0.5      1.0	Accuracies: 0.892  0.644
Epoch 59, CIFAR-10 Batch 5:  Losses:     0.5      1.0	Accuracies: 0.894  0.643
Epoch 60, CIFAR-10 Batch 1:  Losses:     0.5      1.0	Accuracies: 0.882  0.640
Epoch 60, CIFAR-10 Batch 2:  Losses:     0.5      1.0	Accuracies: 0.891  0.646
Epoch 60, CIFAR-10 Batch 3:  Losses:     0.5      1.0	Accuracies: 0.889  0.645
Epoch 60, CIFAR-10 Batch 4:  Losses:     0.4      1.0	Accuracies: 0.900  0.644
Epoch 60, CIFAR-10 Batch 5:  Losses:     0.4      1.0	Accuracies: 0.902  0.647
Epoch 61, CIFAR-10 Batch 1:  Losses:     0.5      1.0	Accuracies: 0.892  0.643
Epoch 61, CIFAR-10 Batch 2:  Losses:     0.5      1.0	Accuracies: 0.894  0.649
Epoch 61, CIFAR-10 Batch 3:  Losses:     0.4      1.0	Accuracies: 0.892  0.647
Epoch 61, CIFAR-10 Batch 4:  Losses:     0.4      1.0	Accuracies: 0.902  0.650
Epoch 61, CIFAR-10 Batch 5:  Losses:     0.4      1.0	Accuracies: 0.903  0.646
Epoch 62, CIFAR-10 Batch 1:  Losses:     0.4      1.0	Accuracies: 0.903  0.644
Epoch 62, CIFAR-10 Batch 2:  Losses:     0.4      1.0	Accuracies: 0.906  0.651
Epoch 62, CIFAR-10 Batch 3:  Losses:     0.4      1.0	Accuracies: 0.892  0.647
Epoch 62, CIFAR-10 Batch 4:  Losses:     0.4      1.0	Accuracies: 0.906  0.645
Epoch 62, CIFAR-10 Batch 5:  Losses:     0.4      1.0	Accuracies: 0.908  0.651
Epoch 63, CIFAR-10 Batch 1:  Losses:     0.4      1.0	Accuracies: 0.900  0.643
Epoch 63, CIFAR-10 Batch 2:  Losses:     0.4      1.0	Accuracies: 0.906  0.650
Epoch 63, CIFAR-10 Batch 3:  Losses:     0.4      1.0	Accuracies: 0.900  0.649
Epoch 63, CIFAR-10 Batch 4:  Losses:     0.4      1.0	Accuracies: 0.911  0.648
Epoch 63, CIFAR-10 Batch 5:  Losses:     0.4      1.0	Accuracies: 0.918  0.650
Epoch 64, CIFAR-10 Batch 1:  Losses:     0.4      1.0	Accuracies: 0.907  0.644
Epoch 64, CIFAR-10 Batch 2:  Losses:     0.4      1.0	Accuracies: 0.907  0.648
Epoch 64, CIFAR-10 Batch 3:  Losses:     0.4      1.0	Accuracies: 0.906  0.650
Epoch 64, CIFAR-10 Batch 4:  Losses:     0.4      1.0	Accuracies: 0.902  0.645
Epoch 64, CIFAR-10 Batch 5:  Losses:     0.4      1.0	Accuracies: 0.916  0.651
Epoch 65, CIFAR-10 Batch 1:  Losses:     0.4      1.0	Accuracies: 0.910  0.642
Epoch 65, CIFAR-10 Batch 2:  Losses:     0.4      1.0	Accuracies: 0.906  0.650
Epoch 65, CIFAR-10 Batch 3:  Losses:     0.4      1.0	Accuracies: 0.910  0.649
Epoch 65, CIFAR-10 Batch 4:  Losses:     0.4      1.0	Accuracies: 0.903  0.643
Epoch 65, CIFAR-10 Batch 5:  Losses:     0.4      1.0	Accuracies: 0.920  0.649
Epoch 66, CIFAR-10 Batch 1:  Losses:     0.4      1.0	Accuracies: 0.918  0.651
Epoch 66, CIFAR-10 Batch 2:  Losses:     0.4      1.0	Accuracies: 0.907  0.653
Epoch 66, CIFAR-10 Batch 3:  Losses:     0.4      1.0	Accuracies: 0.907  0.653
Epoch 66, CIFAR-10 Batch 4:  Losses:     0.4      1.0	Accuracies: 0.915  0.649
Epoch 66, CIFAR-10 Batch 5:  Losses:     0.4      1.0	Accuracies: 0.922  0.647
Epoch 67, CIFAR-10 Batch 1:  Losses:     0.4      1.0	Accuracies: 0.921  0.649
Epoch 67, CIFAR-10 Batch 2:  Losses:     0.4      1.0	Accuracies: 0.922  0.652
Epoch 67, CIFAR-10 Batch 3:  Losses:     0.4      1.0	Accuracies: 0.916  0.652
Epoch 67, CIFAR-10 Batch 4:  Losses:     0.4      1.0	Accuracies: 0.923  0.647
Epoch 67, CIFAR-10 Batch 5:  Losses:     0.4      1.0	Accuracies: 0.918  0.645
Epoch 68, CIFAR-10 Batch 1:  Losses:     0.4      1.0	Accuracies: 0.926  0.649
Epoch 68, CIFAR-10 Batch 2:  Losses:     0.4      1.0	Accuracies: 0.921  0.652
Epoch 68, CIFAR-10 Batch 3:  Losses:     0.4      1.0	Accuracies: 0.912  0.655
Epoch 68, CIFAR-10 Batch 4:  Losses:     0.4      1.0	Accuracies: 0.907  0.639
Epoch 68, CIFAR-10 Batch 5:  Losses:     0.4      1.0	Accuracies: 0.931  0.646
Epoch 69, CIFAR-10 Batch 1:  Losses:     0.4      1.0	Accuracies: 0.918  0.641
Epoch 69, CIFAR-10 Batch 2:  Losses:     0.4      1.0	Accuracies: 0.925  0.656
Epoch 69, CIFAR-10 Batch 3:  Losses:     0.4      1.0	Accuracies: 0.911  0.651
Epoch 69, CIFAR-10 Batch 4:  Losses:     0.4      1.0	Accuracies: 0.922  0.654
Epoch 69, CIFAR-10 Batch 5:  Losses:     0.4      1.0	Accuracies: 0.936  0.651
Epoch 70, CIFAR-10 Batch 1:  Losses:     0.4      1.0	Accuracies: 0.931  0.655
Epoch 70, CIFAR-10 Batch 2:  Losses:     0.4      1.0	Accuracies: 0.920  0.655
Epoch 70, CIFAR-10 Batch 3:  Losses:     0.4      1.0	Accuracies: 0.916  0.659
Epoch 70, CIFAR-10 Batch 4:  Losses:     0.4      1.0	Accuracies: 0.928  0.655
Epoch 70, CIFAR-10 Batch 5:  Losses:     0.3      1.0	Accuracies: 0.934  0.652
Epoch 71, CIFAR-10 Batch 1:  Losses:     0.4      1.0	Accuracies: 0.927  0.647
Epoch 71, CIFAR-10 Batch 2:  Losses:     0.4      1.0	Accuracies: 0.933  0.657
Epoch 71, CIFAR-10 Batch 3:  Losses:     0.4      1.0	Accuracies: 0.921  0.661
Epoch 71, CIFAR-10 Batch 4:  Losses:     0.4      1.0	Accuracies: 0.926  0.655
Epoch 71, CIFAR-10 Batch 5:  Losses:     0.3      1.0	Accuracies: 0.938  0.659
Epoch 72, CIFAR-10 Batch 1:  Losses:     0.4      1.0	Accuracies: 0.932  0.654
Epoch 72, CIFAR-10 Batch 2:  Losses:     0.3      1.0	Accuracies: 0.929  0.653
Epoch 72, CIFAR-10 Batch 3:  Losses:     0.3      1.0	Accuracies: 0.928  0.654
Epoch 72, CIFAR-10 Batch 4:  Losses:     0.3      1.0	Accuracies: 0.939  0.656
Epoch 72, CIFAR-10 Batch 5:  Losses:     0.3      1.0	Accuracies: 0.943  0.656
Epoch 73, CIFAR-10 Batch 1:  Losses:     0.3      1.0	Accuracies: 0.938  0.659
Epoch 73, CIFAR-10 Batch 2:  Losses:     0.3      1.0	Accuracies: 0.936  0.660
Epoch 73, CIFAR-10 Batch 3:  Losses:     0.3      1.0	Accuracies: 0.928  0.661
Epoch 73, CIFAR-10 Batch 4:  Losses:     0.3      1.0	Accuracies: 0.943  0.657
Epoch 73, CIFAR-10 Batch 5:  Losses:     0.3      1.0	Accuracies: 0.946  0.656
Epoch 74, CIFAR-10 Batch 1:  Losses:     0.3      1.0	Accuracies: 0.941  0.658
Epoch 74, CIFAR-10 Batch 2:  Losses:     0.3      1.0	Accuracies: 0.938  0.655
Epoch 74, CIFAR-10 Batch 3:  Losses:     0.3      1.0	Accuracies: 0.932  0.657
Epoch 74, CIFAR-10 Batch 4:  Losses:     0.3      1.0	Accuracies: 0.942  0.656
Epoch 74, CIFAR-10 Batch 5:  Losses:     0.3      1.0	Accuracies: 0.950  0.655
Epoch 75, CIFAR-10 Batch 1:  Losses:     0.3      1.0	Accuracies: 0.943  0.657
Epoch 75, CIFAR-10 Batch 2:  Losses:     0.3      1.0	Accuracies: 0.944  0.663
Epoch 75, CIFAR-10 Batch 3:  Losses:     0.3      1.0	Accuracies: 0.932  0.659
Epoch 75, CIFAR-10 Batch 4:  Losses:     0.3      1.0	Accuracies: 0.950  0.660
Epoch 75, CIFAR-10 Batch 5:  Losses:     0.3      1.0	Accuracies: 0.949  0.657
Epoch 76, CIFAR-10 Batch 1:  Losses:     0.3      1.0	Accuracies: 0.942  0.651
Epoch 76, CIFAR-10 Batch 2:  Losses:     0.3      1.0	Accuracies: 0.948  0.659
Epoch 76, CIFAR-10 Batch 3:  Losses:     0.3      1.0	Accuracies: 0.937  0.661
Epoch 76, CIFAR-10 Batch 4:  Losses:     0.3      1.0	Accuracies: 0.943  0.654
Epoch 76, CIFAR-10 Batch 5:  Losses:     0.3      1.0	Accuracies: 0.947  0.652
Epoch 77, CIFAR-10 Batch 1:  Losses:     0.3      1.0	Accuracies: 0.948  0.657
Epoch 77, CIFAR-10 Batch 2:  Losses:     0.3      1.0	Accuracies: 0.939  0.659
Epoch 77, CIFAR-10 Batch 3:  Losses:     0.3      1.0	Accuracies: 0.936  0.658
Epoch 77, CIFAR-10 Batch 4:  Losses:     0.3      1.0	Accuracies: 0.942  0.652
Epoch 77, CIFAR-10 Batch 5:  Losses:     0.3      1.0	Accuracies: 0.954  0.651
Epoch 78, CIFAR-10 Batch 1:  Losses:     0.3      1.0	Accuracies: 0.943  0.647
Epoch 78, CIFAR-10 Batch 2:  Losses:     0.3      1.0	Accuracies: 0.944  0.661
Epoch 78, CIFAR-10 Batch 3:  Losses:     0.3      1.0	Accuracies: 0.937  0.655
Epoch 78, CIFAR-10 Batch 4:  Losses:     0.3      1.0	Accuracies: 0.949  0.659
Epoch 78, CIFAR-10 Batch 5:  Losses:     0.3      1.0	Accuracies: 0.949  0.661
Epoch 79, CIFAR-10 Batch 1:  Losses:     0.3      1.0	Accuracies: 0.947  0.656
Epoch 79, CIFAR-10 Batch 2:  Losses:     0.3      1.0	Accuracies: 0.953  0.664
Epoch 79, CIFAR-10 Batch 3:  Losses:     0.3      1.0	Accuracies: 0.946  0.663
Epoch 79, CIFAR-10 Batch 4:  Losses:     0.3      1.0	Accuracies: 0.955  0.661
Epoch 79, CIFAR-10 Batch 5:  Losses:     0.3      1.0	Accuracies: 0.952  0.662
Epoch 80, CIFAR-10 Batch 1:  Losses:     0.3      1.0	Accuracies: 0.953  0.652
Epoch 80, CIFAR-10 Batch 2:  Losses:     0.3      1.0	Accuracies: 0.952  0.662
Epoch 80, CIFAR-10 Batch 3:  Losses:     0.3      1.0	Accuracies: 0.946  0.659
Epoch 80, CIFAR-10 Batch 4:  Losses:     0.3      1.0	Accuracies: 0.948  0.651
Epoch 80, CIFAR-10 Batch 5:  Losses:     0.3      1.0	Accuracies: 0.954  0.660
Epoch 81, CIFAR-10 Batch 1:  Losses:     0.3      1.0	Accuracies: 0.955  0.657
Epoch 81, CIFAR-10 Batch 2:  Losses:     0.3      1.0	Accuracies: 0.954  0.663
Epoch 81, CIFAR-10 Batch 3:  Losses:     0.3      1.0	Accuracies: 0.954  0.663
Epoch 81, CIFAR-10 Batch 4:  Losses:     0.3      1.0	Accuracies: 0.949  0.658
Epoch 81, CIFAR-10 Batch 5:  Losses:     0.3      1.0	Accuracies: 0.960  0.659
Epoch 82, CIFAR-10 Batch 1:  Losses:     0.3      1.0	Accuracies: 0.960  0.649
Epoch 82, CIFAR-10 Batch 2:  Losses:     0.3      1.0	Accuracies: 0.960  0.665
Epoch 82, CIFAR-10 Batch 3:  Losses:     0.3      1.0	Accuracies: 0.950  0.663
Epoch 82, CIFAR-10 Batch 4:  Losses:     0.3      1.0	Accuracies: 0.955  0.655
Epoch 82, CIFAR-10 Batch 5:  Losses:     0.3      1.0	Accuracies: 0.958  0.663
Epoch 83, CIFAR-10 Batch 1:  Losses:     0.3      1.0	Accuracies: 0.954  0.653
Epoch 83, CIFAR-10 Batch 2:  Losses:     0.3      1.0	Accuracies: 0.962  0.662
Epoch 83, CIFAR-10 Batch 3:  Losses:     0.3      1.0	Accuracies: 0.955  0.663
Epoch 83, CIFAR-10 Batch 4:  Losses:     0.3      1.0	Accuracies: 0.954  0.657
Epoch 83, CIFAR-10 Batch 5:  Losses:     0.3      1.0	Accuracies: 0.958  0.656
Epoch 84, CIFAR-10 Batch 1:  Losses:     0.3      1.0	Accuracies: 0.960  0.659
Epoch 84, CIFAR-10 Batch 2:  Losses:     0.3      1.0	Accuracies: 0.962  0.660
Epoch 84, CIFAR-10 Batch 3:  Losses:     0.3      1.0	Accuracies: 0.952  0.661
Epoch 84, CIFAR-10 Batch 4:  Losses:     0.3      1.0	Accuracies: 0.962  0.656
Epoch 84, CIFAR-10 Batch 5:  Losses:     0.2      1.0	Accuracies: 0.964  0.662
Epoch 85, CIFAR-10 Batch 1:  Losses:     0.3      1.0	Accuracies: 0.960  0.656
Epoch 85, CIFAR-10 Batch 2:  Losses:     0.3      1.0	Accuracies: 0.963  0.662
Epoch 85, CIFAR-10 Batch 3:  Losses:     0.3      1.0	Accuracies: 0.957  0.663
Epoch 85, CIFAR-10 Batch 4:  Losses:     0.3      1.0	Accuracies: 0.960  0.656
Epoch 85, CIFAR-10 Batch 5:  Losses:     0.3      1.0	Accuracies: 0.959  0.657
Epoch 86, CIFAR-10 Batch 1:  Losses:     0.3      1.0	Accuracies: 0.963  0.649
Epoch 86, CIFAR-10 Batch 2:  Losses:     0.2      1.0	Accuracies: 0.973  0.663
Epoch 86, CIFAR-10 Batch 3:  Losses:     0.3      1.0	Accuracies: 0.955  0.663
Epoch 86, CIFAR-10 Batch 4:  Losses:     0.3      1.0	Accuracies: 0.957  0.658
Epoch 86, CIFAR-10 Batch 5:  Losses:     0.3      1.0	Accuracies: 0.968  0.657
Epoch 87, CIFAR-10 Batch 1:  Losses:     0.2      1.0	Accuracies: 0.968  0.659
Epoch 87, CIFAR-10 Batch 2:  Losses:     0.2      1.0	Accuracies: 0.969  0.660
Epoch 87, CIFAR-10 Batch 3:  Losses:     0.3      1.0	Accuracies: 0.963  0.667
Epoch 87, CIFAR-10 Batch 4:  Losses:     0.2      1.0	Accuracies: 0.968  0.664
Epoch 87, CIFAR-10 Batch 5:  Losses:     0.3      1.0	Accuracies: 0.955  0.656
Epoch 88, CIFAR-10 Batch 1:  Losses:     0.3      1.0	Accuracies: 0.958  0.642
Epoch 88, CIFAR-10 Batch 2:  Losses:     0.3      1.0	Accuracies: 0.963  0.649
Epoch 88, CIFAR-10 Batch 3:  Losses:     0.3      1.0	Accuracies: 0.953  0.660
Epoch 88, CIFAR-10 Batch 4:  Losses:     0.3      1.0	Accuracies: 0.968  0.663
Epoch 88, CIFAR-10 Batch 5:  Losses:     0.3      1.0	Accuracies: 0.964  0.660
Epoch 89, CIFAR-10 Batch 1:  Losses:     0.3      1.0	Accuracies: 0.967  0.658
Epoch 89, CIFAR-10 Batch 2:  Losses:     0.3      1.0	Accuracies: 0.954  0.648
Epoch 89, CIFAR-10 Batch 3:  Losses:     0.3      1.1	Accuracies: 0.943  0.641
Epoch 89, CIFAR-10 Batch 4:  Losses:     0.3      1.1	Accuracies: 0.946  0.635
Epoch 89, CIFAR-10 Batch 5:  Losses:     0.3      1.0	Accuracies: 0.942  0.636
Epoch 90, CIFAR-10 Batch 1:  Losses:     0.3      1.0	Accuracies: 0.946  0.646
Epoch 90, CIFAR-10 Batch 2:  Losses:     0.3      1.0	Accuracies: 0.965  0.654
Epoch 90, CIFAR-10 Batch 3:  Losses:     0.2      1.0	Accuracies: 0.964  0.662
Epoch 90, CIFAR-10 Batch 4:  Losses:     0.2      1.0	Accuracies: 0.967  0.658
Epoch 90, CIFAR-10 Batch 5:  Losses:     0.2      1.0	Accuracies: 0.963  0.660
Epoch 91, CIFAR-10 Batch 1:  Losses:     0.2      1.0	Accuracies: 0.962  0.650
Epoch 91, CIFAR-10 Batch 2:  Losses:     0.2      1.0	Accuracies: 0.959  0.648
Epoch 91, CIFAR-10 Batch 3:  Losses:     0.3      1.0	Accuracies: 0.946  0.651
Epoch 91, CIFAR-10 Batch 4:  Losses:     0.3      1.0	Accuracies: 0.959  0.646
Epoch 91, CIFAR-10 Batch 5:  Losses:     0.3      1.0	Accuracies: 0.954  0.646
Epoch 92, CIFAR-10 Batch 1:  Losses:     0.3      1.0	Accuracies: 0.968  0.652
Epoch 92, CIFAR-10 Batch 2:  Losses:     0.2      1.0	Accuracies: 0.964  0.659
Epoch 92, CIFAR-10 Batch 3:  Losses:     0.2      1.0	Accuracies: 0.968  0.666
Epoch 92, CIFAR-10 Batch 4:  Losses:     0.2      1.0	Accuracies: 0.965  0.653
Epoch 92, CIFAR-10 Batch 5:  Losses:     0.2      1.0	Accuracies: 0.960  0.659
Epoch 93, CIFAR-10 Batch 1:  Losses:     0.2      1.0	Accuracies: 0.972  0.655
Epoch 93, CIFAR-10 Batch 2:  Losses:     0.3      1.0	Accuracies: 0.964  0.645
Epoch 93, CIFAR-10 Batch 3:  Losses:     0.3      1.0	Accuracies: 0.960  0.651
Epoch 93, CIFAR-10 Batch 4:  Losses:     0.3      1.0	Accuracies: 0.965  0.645
Epoch 93, CIFAR-10 Batch 5:  Losses:     0.3      1.0	Accuracies: 0.958  0.644
Epoch 94, CIFAR-10 Batch 1:  Losses:     0.2      1.0	Accuracies: 0.974  0.658
Epoch 94, CIFAR-10 Batch 2:  Losses:     0.2      1.0	Accuracies: 0.975  0.666
Epoch 94, CIFAR-10 Batch 3:  Losses:     0.2      1.0	Accuracies: 0.963  0.663
Epoch 94, CIFAR-10 Batch 4:  Losses:     0.2      1.0	Accuracies: 0.970  0.644
Epoch 94, CIFAR-10 Batch 5:  Losses:     0.2      1.0	Accuracies: 0.968  0.655
Epoch 95, CIFAR-10 Batch 1:  Losses:     0.2      1.0	Accuracies: 0.975  0.661
Epoch 95, CIFAR-10 Batch 2:  Losses:     0.2      1.0	Accuracies: 0.973  0.659
Epoch 95, CIFAR-10 Batch 3:  Losses:     0.2      1.0	Accuracies: 0.967  0.653
Epoch 95, CIFAR-10 Batch 4:  Losses:     0.2      1.0	Accuracies: 0.973  0.646
Epoch 95, CIFAR-10 Batch 5:  Losses:     0.2      1.0	Accuracies: 0.967  0.659
Epoch 96, CIFAR-10 Batch 1:  Losses:     0.2      1.0	Accuracies: 0.979  0.663
Epoch 96, CIFAR-10 Batch 2:  Losses:     0.2      1.0	Accuracies: 0.975  0.661
Epoch 96, CIFAR-10 Batch 3:  Losses:     0.2      1.0	Accuracies: 0.963  0.656
Epoch 96, CIFAR-10 Batch 4:  Losses:     0.2      1.0	Accuracies: 0.979  0.653
Epoch 96, CIFAR-10 Batch 5:  Losses:     0.2      1.0	Accuracies: 0.975  0.665
Epoch 97, CIFAR-10 Batch 1:  Losses:     0.2      1.0	Accuracies: 0.979  0.656
Epoch 97, CIFAR-10 Batch 2:  Losses:     0.2      1.0	Accuracies: 0.969  0.651
Epoch 97, CIFAR-10 Batch 3:  Losses:     0.2      1.0	Accuracies: 0.964  0.657
Epoch 97, CIFAR-10 Batch 4:  Losses:     0.2      1.0	Accuracies: 0.972  0.657
Epoch 97, CIFAR-10 Batch 5:  Losses:     0.2      1.0	Accuracies: 0.968  0.662
Epoch 98, CIFAR-10 Batch 1:  Losses:     0.2      1.0	Accuracies: 0.979  0.653
Epoch 98, CIFAR-10 Batch 2:  Losses:     0.2      1.0	Accuracies: 0.975  0.660
Epoch 98, CIFAR-10 Batch 3:  Losses:     0.2      1.0	Accuracies: 0.972  0.663
Epoch 98, CIFAR-10 Batch 4:  Losses:     0.2      1.0	Accuracies: 0.976  0.660
Epoch 98, CIFAR-10 Batch 5:  Losses:     0.2      1.0	Accuracies: 0.965  0.660
Epoch 99, CIFAR-10 Batch 1:  Losses:     0.2      1.0	Accuracies: 0.978  0.650
Epoch 99, CIFAR-10 Batch 2:  Losses:     0.2      1.0	Accuracies: 0.975  0.661
Epoch 99, CIFAR-10 Batch 3:  Losses:     0.2      1.0	Accuracies: 0.974  0.668
Epoch 99, CIFAR-10 Batch 4:  Losses:     0.2      1.0	Accuracies: 0.975  0.665
Epoch 99, CIFAR-10 Batch 5:  Losses:     0.2      1.0	Accuracies: 0.973  0.661
Epoch 100, CIFAR-10 Batch 1:  Losses:     0.2      1.0	Accuracies: 0.979  0.653
Epoch 100, CIFAR-10 Batch 2:  Losses:     0.2      1.0	Accuracies: 0.980  0.666
Epoch 100, CIFAR-10 Batch 3:  Losses:     0.2      1.0	Accuracies: 0.976  0.661
Epoch 100, CIFAR-10 Batch 4:  Losses:     0.2      1.0	Accuracies: 0.974  0.656
Epoch 100, CIFAR-10 Batch 5:  Losses:     0.2      1.0	Accuracies: 0.975  0.663

Checkpoint

The model has been saved to disk.
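
For reference, here is a minimal, self-contained sketch of how a TensorFlow 1.x checkpoint like this one is typically written with tf.train.Saver. The toy variable and the './checkpoint_demo' path are placeholders for illustration only; this is not the notebook's actual training cell.

import tensorflow as tf

# A toy variable so there is something to save; the real notebook saves the full CNN graph.
w = tf.Variable(tf.zeros([3, 3]), name='w')

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())

    # tf.train.Saver writes the graph definition (.meta) plus the variable
    # values (.index / .data files) under the given path prefix.
    saver = tf.train.Saver()
    saver.save(sess, './checkpoint_demo')

The test cell below restores the real model the same way, using tf.train.import_meta_graph on save_model_path + '.meta' followed by loader.restore.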

Test Model

Test your model against the test dataset. This will be your final accuracy. It should be greater than 50%; if it isn't, keep tweaking the model architecture and parameters.


In [130]:
%matplotlib inline
%config InlineBackend.figure_format = 'retina'

import tensorflow as tf
import pickle
import helper
import random

# Set batch size if not already set
try:
    if batch_size:
        pass
except NameError:
    batch_size = 64

save_model_path = './image_classification'
n_samples = 4
top_n_predictions = 3

def test_model():
    """
    Test the saved model against the test dataset
    """

    with open('preprocess_test.p', mode='rb') as f:
        test_features, test_labels = pickle.load(f)
    loaded_graph = tf.Graph()

    with tf.Session(graph=loaded_graph) as sess:
        # Load model
        loader = tf.train.import_meta_graph(save_model_path + '.meta')
        loader.restore(sess, save_model_path)

        # Get Tensors from loaded model
        loaded_x = loaded_graph.get_tensor_by_name('x:0')
        loaded_y = loaded_graph.get_tensor_by_name('y:0')
        loaded_keep_prob = loaded_graph.get_tensor_by_name('keep_prob:0')
        loaded_logits = loaded_graph.get_tensor_by_name('logits:0')
        loaded_acc = loaded_graph.get_tensor_by_name('accuracy:0')
        
        # Get accuracy in batches for memory limitations
        test_batch_acc_total = 0
        test_batch_count = 0
        
        for test_feature_batch, test_label_batch in helper.batch_features_labels(test_features, test_labels, batch_size):
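            # Feeding keep_prob = 1.0 disables dropout while evaluating accuracy.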
            test_batch_acc_total += sess.run(
                loaded_acc,
                feed_dict={loaded_x: test_feature_batch, loaded_y: test_label_batch, loaded_keep_prob: 1.0})
            test_batch_count += 1

        print('Testing Accuracy: {}\n'.format(test_batch_acc_total/test_batch_count))

        # Print Random Samples
        random_test_features, random_test_labels = tuple(zip(*random.sample(list(zip(test_features, test_labels)), n_samples)))
        random_test_predictions = sess.run(
            tf.nn.top_k(tf.nn.softmax(loaded_logits), top_n_predictions),
            feed_dict={loaded_x: random_test_features, loaded_y: random_test_labels, loaded_keep_prob: 1.0})
        helper.display_image_predictions(random_test_features, random_test_labels, random_test_predictions)


test_model()


INFO:tensorflow:Restoring parameters from ./image_classification
Testing Accuracy: 0.6556825041770935

Why 50-80% Accuracy?

You might be wondering why you can't push the accuracy much higher. First things first: 50% isn't bad for a simple CNN, since pure guessing across ten classes would only get you 10% accuracy. However, you might notice that people report scores well above 80%. That's because we haven't yet taught you everything there is to know about neural networks; we still need to cover a few more techniques.
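
To see where the 10% figure comes from, here is a quick, illustrative sketch that scores a uniform random guesser against the test labels. It assumes preprocess_test.p (created by the preprocessing step earlier in the notebook) holds the one-hot encoded test labels; the result should land close to 1/10.

import pickle
import numpy as np

# Load the preprocessed one-hot test labels, as the test cell above does.
with open('preprocess_test.p', mode='rb') as f:
    _, test_labels = pickle.load(f)

# Guess one of the 10 classes uniformly at random for every test image.
np.random.seed(0)
random_guesses = np.random.randint(0, 10, size=len(test_labels))

# Compare against the true class indices; the result should be close to 0.10.
true_classes = np.argmax(test_labels, axis=1)
print('Random-guess accuracy: {:.3f}'.format(np.mean(random_guesses == true_classes)))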

Submitting This Project

When submitting this project, make sure to run all the cells before saving the notebook. Save the notebook file as "dlnd_image_classification.ipynb" and export it as an HTML file under "File" -> "Download as". Include the "helper.py" and "problem_unittests.py" files in your submission.

