Image Classification

In this project, you'll classify images from the CIFAR-10 dataset. The dataset consists of airplanes, dogs, cats, and other objects. You'll preprocess the images, then train a convolutional neural network on all the samples. The images need to be normalized and the labels need to be one-hot encoded. You'll apply what you've learned to build convolutional, max pooling, dropout, and fully connected layers. At the end, you'll see your neural network's predictions on sample images.

Get the Data

Run the following cell to download the Python version of the CIFAR-10 dataset.


In [1]:
"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
from urllib.request import urlretrieve
from os.path import isfile, isdir
from tqdm import tqdm
import problem_unittests as tests
import tarfile

cifar10_dataset_folder_path = 'cifar-10-batches-py'

# Use Floyd's cifar-10 dataset if present
floyd_cifar10_location = '/input/cifar-10/python.tar.gz'
if isfile(floyd_cifar10_location):
    tar_gz_path = floyd_cifar10_location
else:
    tar_gz_path = 'cifar-10-python.tar.gz'

class DLProgress(tqdm):
    last_block = 0

    def hook(self, block_num=1, block_size=1, total_size=None):
        self.total = total_size
        self.update((block_num - self.last_block) * block_size)
        self.last_block = block_num

if not isfile(tar_gz_path):
    with DLProgress(unit='B', unit_scale=True, miniters=1, desc='CIFAR-10 Dataset') as pbar:
        urlretrieve(
            'https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz',
            tar_gz_path,
            pbar.hook)

if not isdir(cifar10_dataset_folder_path):
    with tarfile.open(tar_gz_path) as tar:
        tar.extractall()
        tar.close()


tests.test_folder_path(cifar10_dataset_folder_path)


All files found!

Explore the Data

The dataset is broken into batches to prevent your machine from running out of memory. The CIFAR-10 dataset consists of 5 batches, named data_batch_1, data_batch_2, etc. Each batch contains labels and images, where each image belongs to one of the following classes:

  • airplane
  • automobile
  • bird
  • cat
  • deer
  • dog
  • frog
  • horse
  • ship
  • truck

Understanding a dataset is part of making predictions on the data. Play around with the code cell below by changing the batch_id and sample_id. The batch_id is the id for a batch (1-5). The sample_id is the id for an image and label pair in the batch.

Ask yourself "What are all possible labels?", "What is the range of values for the image data?", "Are the labels in order or random?". Answers to questions like these will help you preprocess the data and end up with better predictions.


In [2]:
%matplotlib inline
%config InlineBackend.figure_format = 'retina'

import helper
import numpy as np

# Explore the dataset
batch_id = 1
sample_id = 2
helper.display_stats(cifar10_dataset_folder_path, batch_id, sample_id)


Stats of batch 1:
Samples: 10000
Label Counts: {0: 1005, 1: 974, 2: 1032, 3: 1016, 4: 999, 5: 937, 6: 1030, 7: 1001, 8: 1025, 9: 981}
First 20 Labels: [6, 9, 9, 4, 1, 1, 2, 7, 8, 3, 4, 7, 7, 2, 9, 9, 9, 3, 2, 6]

Example of Image 2:
Image - Min Value: 20 Max Value: 255
Image - Shape: (32, 32, 3)
Label - Label Id: 9 Name: truck
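
If you'd rather answer questions like these by reading a batch file directly instead of going through helper, here is a minimal sketch, assuming the standard CIFAR-10 python batch layout (a pickled dict with 'data' and 'labels' keys):

import pickle
import numpy as np

with open(cifar10_dataset_folder_path + '/data_batch_1', mode='rb') as file:
    batch = pickle.load(file, encoding='latin1')

# 'data' is a (10000, 3072) uint8 array; reshape it into (N, 32, 32, 3) images
features = batch['data'].reshape((len(batch['data']), 3, 32, 32)).transpose(0, 2, 3, 1)
print('Pixel range:', features.min(), 'to', features.max())   # typically 0 to 255
print('Label values:', sorted(set(batch['labels'])))          # 0 through 9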

Implement Preprocess Functions

Normalize

In the cell below, implement the normalize function to take in image data, x, and return it as a normalized Numpy array. The values should be in the range of 0 to 1, inclusive. The return object should be the same shape as x.
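
Dividing by 255 works because CIFAR-10 pixel values are 8-bit integers in the 0-255 range. If you wanted a rescaling that doesn't assume that range, a minimal min-max sketch (illustrative only, not the required solution) looks like this:

import numpy as np

def normalize_minmax(x):
    # rescale to [0, 1] using the data's own minimum and maximum
    x = np.asarray(x, dtype=np.float32)
    return (x - x.min()) / (x.max() - x.min())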


In [3]:
def normalize(x):
    """
    Normalize a list of sample image data in the range of 0 to 1
    : x: List of image data.  The image shape is (32, 32, 3)
    : return: Numpy array of normalized data
    """
    # TODO: Implement Function
    # Pixel values are 8-bit (0-255), so dividing by 255 rescales them to [0, 1]
    return x / 255


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_normalize(normalize)


Tests Passed

One-hot encode

Just like the previous code cell, you'll be implementing a function for preprocessing. This time, you'll implement the one_hot_encode function. The input, x, is a list of labels. Implement the function to return the list of labels as a one-hot encoded Numpy array. The possible values for labels are 0 to 9. The one-hot encoding function should return the same encoding for each value between calls to one_hot_encode. Make sure to save the map of encodings outside the function.

Hint: Don't reinvent the wheel.
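
To see what the encoding should look like, here's a tiny NumPy illustration (np.eye(10) is the 10x10 identity matrix, and indexing it with labels picks out one-hot rows); the scikit-learn approach used below produces the same rows:

import numpy as np

labels = [0, 3, 9]                      # example label values
print(np.eye(10)[labels])               # each row has a single 1 in the column of its label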


In [4]:
from sklearn.preprocessing import LabelBinarizer

# Fit the binarizer once on all possible label values (0-9) so every call
# to one_hot_encode maps labels to the same encoding
encoder = LabelBinarizer()
encoder.fit(range(10))

def one_hot_encode(x):
    """
    One hot encode a list of sample labels. Return a one-hot encoded vector for each label.
    : x: List of sample Labels
    : return: Numpy array of one-hot encoded labels
    """
    # TODO: Implement Function
    return encoder.transform(x)


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_one_hot_encode(one_hot_encode)


Tests Passed

Randomize Data

As you saw from exploring the data above, the order of the samples is randomized. It doesn't hurt to randomize it again, but you don't need to for this dataset.

Preprocess all the data and save it

Running the code cell below will preprocess all the CIFAR-10 data and save it to file. The code below also uses 10% of the training data for validation.


In [5]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
# Preprocess Training, Validation, and Testing Data
helper.preprocess_and_save_data(cifar10_dataset_folder_path, normalize, one_hot_encode)

Check Point

This is your first checkpoint. If you ever decide to come back to this notebook or have to restart the notebook, you can start from here. The preprocessed data has been saved to disk.


In [6]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import pickle
import problem_unittests as tests
import helper

# Load the Preprocessed Validation data
valid_features, valid_labels = pickle.load(open('preprocess_validation.p', mode='rb'))

Build the network

For the neural network, you'll build each layer into a function. Most of the code you've seen has been outside of functions. To test your code more thoroughly, we require that you put each layer in a function. This allows us to give you better feedback and test for simple mistakes using our unittests before you submit your project.

Note: If you're finding it hard to dedicate enough time for this course each week, we've provided a small shortcut to this part of the project. In the next couple of problems, you'll have the option to use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages to build each layer, except the layers you build in the "Convolutional and Max Pooling Layer" section. TF Layers is similar to Keras's and TFLearn's abstraction of layers, so it's easy to pick up.

However, if you would like to get the most out of this course, try to solve all the problems without using anything from the TF Layers packages. You can still use classes from other packages that happen to have the same name as ones you find in TF Layers! For example, instead of using the TF Layers version of the conv2d class, tf.layers.conv2d, you would want to use the TF Neural Network version of conv2d, tf.nn.conv2d.

Let's begin!

Input

The neural network needs to read the image data, one-hot encoded labels, and dropout keep probability. Implement the following functions:

  • Implement neural_net_image_input
    • Return a TF Placeholder
    • Set the shape using image_shape with batch size set to None.
    • Name the TensorFlow placeholder "x" using the TensorFlow name parameter in the TF Placeholder.
  • Implement neural_net_label_input
    • Return a TF Placeholder
    • Set the shape using n_classes with batch size set to None.
    • Name the TensorFlow placeholder "y" using the TensorFlow name parameter in the TF Placeholder.
  • Implement neural_net_keep_prob_input
    • Return a TF Placeholder for dropout keep probability.
    • Name the TensorFlow placeholder "keep_prob" using the TensorFlow name parameter in the TF Placeholder.

These names will be used at the end of the project to load your saved model.

Note: None for shapes in TensorFlow allows for a dynamic size; see the quick check below.
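
A quick, self-contained check of the dynamic batch dimension (built in a throwaway graph so it doesn't interfere with the model you build below; example_x is just an illustrative name):

import tensorflow as tf

with tf.Graph().as_default():
    example_x = tf.placeholder(tf.float32, shape=[None, 32, 32, 3], name='example_x')
    print(example_x.get_shape())   # (?, 32, 32, 3) -- the batch size stays unspecified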


In [7]:
import tensorflow as tf

def neural_net_image_input(image_shape):
    """
    Return a Tensor for a batch of image input
    : image_shape: Shape of the images
    : return: Tensor for image input.
    """
    # TODO: Implement Function
    x = tf.placeholder(tf.float32, shape=[None] + list(image_shape), name='x')
    return x


def neural_net_label_input(n_classes):
    """
    Return a Tensor for a batch of label input
    : n_classes: Number of classes
    : return: Tensor for label input.
    """
    # TODO: Implement Function
    y = tf.placeholder(tf.float32, shape=(None, n_classes), name='y')
    return y


def neural_net_keep_prob_input():
    """
    Return a Tensor for keep probability
    : return: Tensor for keep probability.
    """
    # TODO: Implement Function
    keep_prob = tf.placeholder(tf.float32, name='keep_prob')
    return keep_prob


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tf.reset_default_graph()
tests.test_nn_image_inputs(neural_net_image_input)
tests.test_nn_label_inputs(neural_net_label_input)
tests.test_nn_keep_prob_inputs(neural_net_keep_prob_input)


Image Input Tests Passed.
Label Input Tests Passed.
Keep Prob Tests Passed.

Convolution and Max Pooling Layer

Convolution layers have a lot of success with images. For this code cell, you should implement the function conv2d_maxpool to apply convolution then max pooling:

  • Create the weight and bias using conv_ksize, conv_num_outputs and the shape of x_tensor.
  • Apply a convolution to x_tensor using weight and conv_strides.
    • We recommend you use same padding, but you're welcome to use any padding.
  • Add bias
  • Add a nonlinear activation to the convolution.
  • Apply Max Pooling using pool_ksize and pool_strides.
    • We recommend you use same padding, but you're welcome to use any padding.

Note: You can't use TensorFlow Layers or TensorFlow Layers (contrib) for this layer, but you can still use TensorFlow's Neural Network package. You may still use the shortcut option for all the other layers.
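
As a reminder of the shapes the low-level tf.nn ops expect (an illustrative sketch with made-up sizes, not the required solution): filter weights are 4-D [filter_height, filter_width, in_channels, out_channels], and both strides and ksize are 4-element lists of the form [1, height, width, 1].

import tensorflow as tf

with tf.Graph().as_default():
    example_input = tf.placeholder(tf.float32, [None, 32, 32, 3])
    # filter weights: [filter_height, filter_width, in_channels, out_channels]
    example_weights = tf.Variable(tf.truncated_normal([3, 3, 3, 16], stddev=0.05))
    example_bias = tf.Variable(tf.zeros([16]))
    conv = tf.nn.conv2d(example_input, example_weights,
                        strides=[1, 1, 1, 1], padding='SAME') + example_bias
    pooled = tf.nn.max_pool(tf.nn.relu(conv),
                            ksize=[1, 2, 2, 1], strides=[1, 2, 2, 1], padding='SAME')
    print(pooled.get_shape())   # (?, 16, 16, 16)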


In [8]:
def conv2d_maxpool(x_tensor, conv_num_outputs, conv_ksize, conv_strides, pool_ksize, pool_strides):
    """
    Apply convolution then max pooling to x_tensor
    :param x_tensor: TensorFlow Tensor
    :param conv_num_outputs: Number of outputs for the convolutional layer
    :param conv_ksize: kernel size 2-D Tuple for the convolutional layer
    :param conv_strides: Stride 2-D Tuple for convolution
    :param pool_ksize: kernel size 2-D Tuple for pool
    :param pool_strides: Stride 2-D Tuple for pool
    : return: A tensor that represents convolution and max pooling of x_tensor
    """
    # TODO: Implement Function
    # Filter weights shape: [filter_height, filter_width, in_channels, out_channels]
    weights = tf.Variable(tf.truncated_normal(list(conv_ksize) + [x_tensor.shape.as_list()[-1]] + [conv_num_outputs]))
    bias = tf.Variable(tf.zeros((conv_num_outputs)))
    # tf.nn.conv2d and tf.nn.max_pool expect 4-D strides/ksize of the form [1, height, width, 1]
    c_padding = 'SAME'
    c_strides = (1,) + conv_strides + (1,)
    p_padding = 'SAME'
    p_strides = (1,) + pool_strides + (1,)
    p_ksize = (1,) + pool_ksize + (1,)
    conv = tf.nn.conv2d(x_tensor, weights, c_strides, c_padding) + bias
    conv = tf.nn.relu(conv)
    conv = tf.nn.max_pool(conv, p_ksize, p_strides, p_padding)
    return conv 


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_con_pool(conv2d_maxpool)


Tests Passed

Flatten Layer

Implement the flatten function to change the dimension of x_tensor from a 4-D tensor to a 2-D tensor. The output should be the shape (Batch Size, Flattened Image Size). Shortcut option: you can use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages for this layer. For more of a challenge, only use other TensorFlow packages.
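
If you take the challenge route and avoid tf.contrib here, one possible sketch flattens with tf.reshape, assuming the non-batch dimensions are known when the graph is built (flatten_reshape is just an illustrative name):

import numpy as np
import tensorflow as tf

def flatten_reshape(x_tensor):
    # multiply the non-batch dimensions together, e.g. [None, 8, 8, 16] -> 1024
    flat_dim = int(np.prod(x_tensor.get_shape().as_list()[1:]))
    return tf.reshape(x_tensor, [-1, flat_dim])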


In [9]:
def flatten(x_tensor):
    """
    Flatten x_tensor to (Batch Size, Flattened Image Size)
    : x_tensor: A tensor of size (Batch Size, ...), where ... are the image dimensions.
    : return: A tensor of size (Batch Size, Flattened Image Size).
    """
    # TODO: Implement Function
    return tf.contrib.layers.flatten(x_tensor)

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_flatten(flatten)


Tests Passed

Fully-Connected Layer

Implement the fully_conn function to apply a fully connected layer to x_tensor with the shape (Batch Size, num_outputs). Shortcut option: you can use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages for this layer. For more of a challenge, only use other TensorFlow packages.
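
For the challenge route, a sketch of the same layer built from lower-level ops (fully_conn_nn and the stddev value are illustrative choices, not the required solution):

import tensorflow as tf

def fully_conn_nn(x_tensor, num_outputs):
    n_inputs = int(x_tensor.get_shape().as_list()[-1])
    weights = tf.Variable(tf.truncated_normal([n_inputs, num_outputs], stddev=0.05))
    bias = tf.Variable(tf.zeros([num_outputs]))
    return tf.nn.relu(tf.matmul(x_tensor, weights) + bias)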


In [10]:
def fully_conn(x_tensor, num_outputs):
    """
    Apply a fully connected layer to x_tensor using weight and bias
    : x_tensor: A 2-D tensor where the first dimension is batch size.
    : num_outputs: The number of output that the new tensor should be.
    : return: A 2-D tensor where the second dimension is num_outputs.
    """
    # TODO: Implement Function
    # tf.contrib.layers.fully_connected applies a ReLU activation by default,
    # so no extra activation is needed here
    layer = tf.contrib.layers.fully_connected(x_tensor, num_outputs=num_outputs)
    return layer


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_fully_conn(fully_conn)


Tests Passed

Output Layer

Implement the output function to apply a fully connected layer to x_tensor with the shape (Batch Size, num_outputs). Shortcut option: you can use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages for this layer. For more of a challenge, only use other TensorFlow packages.

Note: Activation, softmax, or cross entropy should not be applied to this.
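
The corresponding low-level sketch for the output layer differs from the fully connected sketch above only in leaving out the activation, so the raw logits come back unchanged (output_nn is again just an illustrative name):

import tensorflow as tf

def output_nn(x_tensor, num_outputs):
    n_inputs = int(x_tensor.get_shape().as_list()[-1])
    weights = tf.Variable(tf.truncated_normal([n_inputs, num_outputs], stddev=0.05))
    bias = tf.Variable(tf.zeros([num_outputs]))
    return tf.matmul(x_tensor, weights) + bias   # logits only, no activation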


In [11]:
def output(x_tensor, num_outputs):
    """
    Apply a output layer to x_tensor using weight and bias
    : x_tensor: A 2-D tensor where the first dimension is batch size.
    : num_outputs: The number of output that the new tensor should be.
    : return: A 2-D tensor where the second dimension is num_outputs.
    """
    # TODO: Implement Function
    # activation_fn=None so the raw logits are returned
    return tf.contrib.layers.fully_connected(x_tensor, num_outputs=num_outputs, activation_fn=None)


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_output(output)


Tests Passed

Create Convolutional Model

Implement the function conv_net to create a convolutional neural network model. The function takes in a batch of images, x, and outputs logits. Use the layers you created above to create this model:

  • Apply 1, 2, or 3 Convolution and Max Pool layers
  • Apply a Flatten Layer
  • Apply 1, 2, or 3 Fully Connected Layers
  • Apply an Output Layer
  • Return the output
  • Apply TensorFlow's Dropout to one or more layers in the model using keep_prob.
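
For intuition about the sizes involved: with the parameter choices used in the cell below (16 filters, a stride-2 convolution, and a 2x2 max pool with stride 2, all with same padding), the spatial size shrinks from 32x32 to 16x16 to 8x8, so the flatten layer sees 8 * 8 * 16 = 1024 features per image. A quick check of that arithmetic:

import math

height = width = 32
conv_stride, pool_stride = 2, 2
after_conv = math.ceil(height / conv_stride)      # SAME padding: 16
after_pool = math.ceil(after_conv / pool_stride)  # SAME padding: 8
print(after_pool * after_pool * 16)               # 1024 flattened features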

In [12]:
def conv_net(x, keep_prob):
    """
    Create a convolutional neural network model
    : x: Placeholder tensor that holds image data.
    : keep_prob: Placeholder tensor that hold dropout keep probability.
    : return: Tensor that represents logits
    """
    # TODO: Apply 1, 2, or 3 Convolution and Max Pool layers
    #    Play around with different number of outputs, kernel size and stride
    # Function Definition from Above:
    #    conv2d_maxpool(x_tensor, conv_num_outputs, conv_ksize, conv_strides, pool_ksize, pool_strides)
    conv_1_num_outputs = 16
    conv_1_conv_ksize = (4, 4)
    conv_1_conv_strides = (2, 2)
    conv_1_pool_ksize = (2, 2)
    conv_1_pool_strides = (2, 2)
    conv_1 = conv2d_maxpool(x, conv_1_num_outputs, conv_1_conv_ksize, conv_1_conv_strides, conv_1_pool_ksize, conv_1_pool_strides)
    conv_1 = tf.nn.dropout(conv_1, keep_prob)
    

    # TODO: Apply a Flatten Layer
    # Function Definition from Above:
    #   flatten(x_tensor)
    conv_1 = flatten(conv_1)
    

    # TODO: Apply 1, 2, or 3 Fully Connected Layers
    #    Play around with different number of outputs
    # Function Definition from Above:
    #   fully_conn(x_tensor, num_outputs)
    fconn_1_num_outputs = 16
    fconn_2_num_outputs = 10
    fconn_1 = fully_conn(conv_1, fconn_1_num_outputs)
    fconn_2 = fully_conn(fconn_1, fconn_2_num_outputs)
    
    
    # TODO: Apply an Output Layer
    #    Set this to the number of classes
    # Function Definition from Above:
    #   output(x_tensor, num_outputs)
    num_outputs = 10
    output_layer = output(fconn_2, num_outputs)
    
    
    # TODO: return output
    return output_layer


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""

##############################
## Build the Neural Network ##
##############################

# Remove previous weights, bias, inputs, etc.
tf.reset_default_graph()

# Inputs
x = neural_net_image_input((32, 32, 3))
y = neural_net_label_input(10)
keep_prob = neural_net_keep_prob_input()

# Model
logits = conv_net(x, keep_prob)

# Name logits Tensor, so that it can be loaded from disk after training
logits = tf.identity(logits, name='logits')

# Loss and Optimizer
cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=y))
optimizer = tf.train.AdamOptimizer().minimize(cost)

# Accuracy
correct_pred = tf.equal(tf.argmax(logits, 1), tf.argmax(y, 1))
accuracy = tf.reduce_mean(tf.cast(correct_pred, tf.float32), name='accuracy')

tests.test_conv_net(conv_net)


Neural Network Built!

Train the Neural Network

Single Optimization

Implement the function train_neural_network to do a single optimization step. The optimization should run optimizer in session with a feed_dict containing the following:

  • x for image input
  • y for labels
  • keep_prob for keep probability for dropout

This function will be called for each batch, so tf.global_variables_initializer() has already been called.

Note: Nothing needs to be returned. This function is only optimizing the neural network.


In [13]:
def train_neural_network(session, optimizer, keep_probability, feature_batch, label_batch):
    """
    Optimize the session on a batch of images and labels
    : session: Current TensorFlow session
    : optimizer: TensorFlow optimizer function
    : keep_probability: keep probability
    : feature_batch: Batch of Numpy image data
    : label_batch: Batch of Numpy label data
    """
    # TODO: Implement Function
    session.run(optimizer, feed_dict={x: feature_batch, y: label_batch, keep_prob: keep_probability})


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_train_nn(train_neural_network)


Tests Passed

Show Stats

Implement the function print_stats to print loss and validation accuracy. Use the global variables valid_features and valid_labels to calculate validation accuracy. Use a keep probability of 1.0 to calculate the loss and validation accuracy.


In [14]:
def print_stats(session, feature_batch, label_batch, cost, accuracy):
    """
    Print information about loss and validation accuracy
    : session: Current TensorFlow session
    : feature_batch: Batch of Numpy image data
    : label_batch: Batch of Numpy label data
    : cost: TensorFlow cost function
    : accuracy: TensorFlow accuracy function
    """
    # Loss is measured on the current training batch, accuracy on the full validation set;
    # keep_prob is 1.0 so dropout is disabled for both measurements
    loss = session.run(cost, feed_dict={x: feature_batch, y: label_batch, keep_prob: 1.0})
    acc = session.run(accuracy, feed_dict={x: valid_features, y: valid_labels, keep_prob: 1.0})
    print("Loss: {}".format(loss))
    print("Accuracy: {}".format(acc))

Hyperparameters

Tune the following parameters:

  • Set epochs to the number of iterations until the network stops learning or starts overfitting
  • Set batch_size to the highest number that your machine has memory for. Most people set it to a common size such as:
    • 64
    • 128
    • 256
    • ...
  • Set keep_probability to the probability of keeping a node using dropout (see the small illustration after this list)
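
A tiny illustration of what keep_probability does at training time (run in a throwaway graph so it doesn't disturb the model): tf.nn.dropout zeroes roughly a (1 - keep_prob) fraction of the activations and scales the survivors by 1/keep_prob so the expected sum is unchanged.

import tensorflow as tf

with tf.Graph().as_default(), tf.Session() as demo_sess:
    dropped = tf.nn.dropout(tf.ones([1, 10]), keep_prob=0.5)
    print(demo_sess.run(dropped))   # roughly half zeros, the survivors become 2.0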

In [15]:
# TODO: Tune Parameters
epochs = 100
batch_size = 64
keep_probability = 0.5

Train on a Single CIFAR-10 Batch

Instead of training the neural network on all the CIFAR-10 batches of data, let's use a single batch. This should save time while you iterate on the model to get a better accuracy. Once the final validation accuracy is 50% or greater, run the model on all the data in the next section.


In [16]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
print('Checking the Training on a Single Batch...')
with tf.Session() as sess:
    # Initializing the variables
    sess.run(tf.global_variables_initializer())
    
    # Training cycle
    for epoch in range(epochs):
        batch_i = 1
        for batch_features, batch_labels in helper.load_preprocess_training_batch(batch_i, batch_size):
            train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)
        print('Epoch {:>2}, CIFAR-10 Batch {}:  '.format(epoch + 1, batch_i), end='')
        print_stats(sess, batch_features, batch_labels, cost, accuracy)


Checking the Training on a Single Batch...
Epoch  1, CIFAR-10 Batch 1:  Loss: 2.2456588745117188
Accuracy: 0.14839999377727509
Epoch  2, CIFAR-10 Batch 1:  Loss: 2.2228574752807617
Accuracy: 0.18379999697208405
Epoch  3, CIFAR-10 Batch 1:  Loss: 2.203314781188965
Accuracy: 0.20360000431537628
Epoch  4, CIFAR-10 Batch 1:  Loss: 2.102692127227783
Accuracy: 0.2587999999523163
Epoch  5, CIFAR-10 Batch 1:  Loss: 2.0347514152526855
Accuracy: 0.3131999969482422
Epoch  6, CIFAR-10 Batch 1:  Loss: 1.9529826641082764
Accuracy: 0.3278000056743622
Epoch  7, CIFAR-10 Batch 1:  Loss: 1.8932406902313232
Accuracy: 0.34380000829696655
Epoch  8, CIFAR-10 Batch 1:  Loss: 1.8463184833526611
Accuracy: 0.3612000048160553
Epoch  9, CIFAR-10 Batch 1:  Loss: 1.800697684288025
Accuracy: 0.3691999912261963
Epoch 10, CIFAR-10 Batch 1:  Loss: 1.773847222328186
Accuracy: 0.3718000054359436
Epoch 11, CIFAR-10 Batch 1:  Loss: 1.7259718179702759
Accuracy: 0.38260000944137573
Epoch 12, CIFAR-10 Batch 1:  Loss: 1.6842559576034546
Accuracy: 0.39340001344680786
Epoch 13, CIFAR-10 Batch 1:  Loss: 1.6501190662384033
Accuracy: 0.41040000319480896
Epoch 14, CIFAR-10 Batch 1:  Loss: 1.6026151180267334
Accuracy: 0.4146000146865845
Epoch 15, CIFAR-10 Batch 1:  Loss: 1.6132590770721436
Accuracy: 0.41600000858306885
Epoch 16, CIFAR-10 Batch 1:  Loss: 1.5777428150177002
Accuracy: 0.4334000051021576
Epoch 17, CIFAR-10 Batch 1:  Loss: 1.5860086679458618
Accuracy: 0.4406000077724457
Epoch 18, CIFAR-10 Batch 1:  Loss: 1.5815776586532593
Accuracy: 0.44699999690055847
Epoch 19, CIFAR-10 Batch 1:  Loss: 1.5521409511566162
Accuracy: 0.45739999413490295
Epoch 20, CIFAR-10 Batch 1:  Loss: 1.504465103149414
Accuracy: 0.4674000144004822
Epoch 21, CIFAR-10 Batch 1:  Loss: 1.4788421392440796
Accuracy: 0.4652000069618225
Epoch 22, CIFAR-10 Batch 1:  Loss: 1.4677622318267822
Accuracy: 0.477400004863739
Epoch 23, CIFAR-10 Batch 1:  Loss: 1.427333116531372
Accuracy: 0.48080000281333923
Epoch 24, CIFAR-10 Batch 1:  Loss: 1.4165451526641846
Accuracy: 0.48500001430511475
Epoch 25, CIFAR-10 Batch 1:  Loss: 1.4053030014038086
Accuracy: 0.48660001158714294
Epoch 26, CIFAR-10 Batch 1:  Loss: 1.3817192316055298
Accuracy: 0.4885999858379364
Epoch 27, CIFAR-10 Batch 1:  Loss: 1.3644040822982788
Accuracy: 0.4880000054836273
Epoch 28, CIFAR-10 Batch 1:  Loss: 1.3520208597183228
Accuracy: 0.4959999918937683
Epoch 29, CIFAR-10 Batch 1:  Loss: 1.3035637140274048
Accuracy: 0.49480000138282776
Epoch 30, CIFAR-10 Batch 1:  Loss: 1.286607027053833
Accuracy: 0.4997999966144562
Epoch 31, CIFAR-10 Batch 1:  Loss: 1.2819592952728271
Accuracy: 0.5019999742507935
Epoch 32, CIFAR-10 Batch 1:  Loss: 1.252350091934204
Accuracy: 0.5095999836921692
Epoch 33, CIFAR-10 Batch 1:  Loss: 1.2068716287612915
Accuracy: 0.5094000101089478
Epoch 34, CIFAR-10 Batch 1:  Loss: 1.1971105337142944
Accuracy: 0.5072000026702881
Epoch 35, CIFAR-10 Batch 1:  Loss: 1.201977252960205
Accuracy: 0.5088000297546387
Epoch 36, CIFAR-10 Batch 1:  Loss: 1.15644109249115
Accuracy: 0.5123999714851379
Epoch 37, CIFAR-10 Batch 1:  Loss: 1.1680353879928589
Accuracy: 0.5149999856948853
Epoch 38, CIFAR-10 Batch 1:  Loss: 1.1613332033157349
Accuracy: 0.5174000263214111
Epoch 39, CIFAR-10 Batch 1:  Loss: 1.1382999420166016
Accuracy: 0.5149999856948853
Epoch 40, CIFAR-10 Batch 1:  Loss: 1.1504480838775635
Accuracy: 0.5167999863624573
Epoch 41, CIFAR-10 Batch 1:  Loss: 1.1532580852508545
Accuracy: 0.5216000080108643
Epoch 42, CIFAR-10 Batch 1:  Loss: 1.0832242965698242
Accuracy: 0.519599974155426
Epoch 43, CIFAR-10 Batch 1:  Loss: 1.121562123298645
Accuracy: 0.5180000066757202
Epoch 44, CIFAR-10 Batch 1:  Loss: 1.1349412202835083
Accuracy: 0.520799994468689
Epoch 45, CIFAR-10 Batch 1:  Loss: 1.0877070426940918
Accuracy: 0.5217999815940857
Epoch 46, CIFAR-10 Batch 1:  Loss: 1.0952332019805908
Accuracy: 0.5288000106811523
Epoch 47, CIFAR-10 Batch 1:  Loss: 1.063412070274353
Accuracy: 0.5271999835968018
Epoch 48, CIFAR-10 Batch 1:  Loss: 1.0887346267700195
Accuracy: 0.5234000086784363
Epoch 49, CIFAR-10 Batch 1:  Loss: 1.0554275512695312
Accuracy: 0.5228000283241272
Epoch 50, CIFAR-10 Batch 1:  Loss: 1.089920997619629
Accuracy: 0.5288000106811523
Epoch 51, CIFAR-10 Batch 1:  Loss: 1.0340304374694824
Accuracy: 0.5302000045776367
Epoch 52, CIFAR-10 Batch 1:  Loss: 1.073946237564087
Accuracy: 0.5238000154495239
Epoch 53, CIFAR-10 Batch 1:  Loss: 1.060490369796753
Accuracy: 0.5278000235557556
Epoch 54, CIFAR-10 Batch 1:  Loss: 1.080582618713379
Accuracy: 0.5202000141143799
Epoch 55, CIFAR-10 Batch 1:  Loss: 1.0120888948440552
Accuracy: 0.5210000276565552
Epoch 56, CIFAR-10 Batch 1:  Loss: 1.0186234712600708
Accuracy: 0.5252000093460083
Epoch 57, CIFAR-10 Batch 1:  Loss: 1.0261948108673096
Accuracy: 0.5278000235557556
Epoch 58, CIFAR-10 Batch 1:  Loss: 1.0321109294891357
Accuracy: 0.5325999855995178
Epoch 59, CIFAR-10 Batch 1:  Loss: 1.04452645778656
Accuracy: 0.52920001745224
Epoch 60, CIFAR-10 Batch 1:  Loss: 1.0084903240203857
Accuracy: 0.5296000242233276
Epoch 61, CIFAR-10 Batch 1:  Loss: 1.038388729095459
Accuracy: 0.5325999855995178
Epoch 62, CIFAR-10 Batch 1:  Loss: 0.9916473627090454
Accuracy: 0.5343999862670898
Epoch 63, CIFAR-10 Batch 1:  Loss: 1.0487167835235596
Accuracy: 0.5293999910354614
Epoch 64, CIFAR-10 Batch 1:  Loss: 1.0535812377929688
Accuracy: 0.5281999707221985
Epoch 65, CIFAR-10 Batch 1:  Loss: 1.0563992261886597
Accuracy: 0.5320000052452087
Epoch 66, CIFAR-10 Batch 1:  Loss: 1.011755108833313
Accuracy: 0.5260000228881836
Epoch 67, CIFAR-10 Batch 1:  Loss: 1.0272871255874634
Accuracy: 0.534600019454956
Epoch 68, CIFAR-10 Batch 1:  Loss: 1.0191396474838257
Accuracy: 0.5351999998092651
Epoch 69, CIFAR-10 Batch 1:  Loss: 1.0428380966186523
Accuracy: 0.5347999930381775
Epoch 70, CIFAR-10 Batch 1:  Loss: 1.014078974723816
Accuracy: 0.5311999917030334
Epoch 71, CIFAR-10 Batch 1:  Loss: 1.0525434017181396
Accuracy: 0.5303999781608582
Epoch 72, CIFAR-10 Batch 1:  Loss: 1.0044472217559814
Accuracy: 0.5317999720573425
Epoch 73, CIFAR-10 Batch 1:  Loss: 0.9829424619674683
Accuracy: 0.5320000052452087
Epoch 74, CIFAR-10 Batch 1:  Loss: 1.0065399408340454
Accuracy: 0.5317999720573425
Epoch 75, CIFAR-10 Batch 1:  Loss: 1.045783281326294
Accuracy: 0.5353999733924866
Epoch 76, CIFAR-10 Batch 1:  Loss: 0.9968324899673462
Accuracy: 0.5378000140190125
Epoch 77, CIFAR-10 Batch 1:  Loss: 1.0221030712127686
Accuracy: 0.5361999869346619
Epoch 78, CIFAR-10 Batch 1:  Loss: 1.0076982975006104
Accuracy: 0.5338000059127808
Epoch 79, CIFAR-10 Batch 1:  Loss: 0.977861225605011
Accuracy: 0.5374000072479248
Epoch 80, CIFAR-10 Batch 1:  Loss: 0.9824552536010742
Accuracy: 0.5365999937057495
Epoch 81, CIFAR-10 Batch 1:  Loss: 1.0313940048217773
Accuracy: 0.5368000268936157
Epoch 82, CIFAR-10 Batch 1:  Loss: 1.0019489526748657
Accuracy: 0.5368000268936157
Epoch 83, CIFAR-10 Batch 1:  Loss: 0.9957841634750366
Accuracy: 0.532800018787384
Epoch 84, CIFAR-10 Batch 1:  Loss: 1.0009589195251465
Accuracy: 0.5351999998092651
Epoch 85, CIFAR-10 Batch 1:  Loss: 1.0025355815887451
Accuracy: 0.5396000146865845
Epoch 86, CIFAR-10 Batch 1:  Loss: 0.9698912501335144
Accuracy: 0.5375999808311462
Epoch 87, CIFAR-10 Batch 1:  Loss: 0.9765342473983765
Accuracy: 0.5379999876022339
Epoch 88, CIFAR-10 Batch 1:  Loss: 1.0041935443878174
Accuracy: 0.5404000282287598
Epoch 89, CIFAR-10 Batch 1:  Loss: 0.9839698672294617
Accuracy: 0.5371999740600586
Epoch 90, CIFAR-10 Batch 1:  Loss: 0.9696253538131714
Accuracy: 0.5351999998092651
Epoch 91, CIFAR-10 Batch 1:  Loss: 0.9668289422988892
Accuracy: 0.5392000079154968
Epoch 92, CIFAR-10 Batch 1:  Loss: 0.9703388214111328
Accuracy: 0.5351999998092651
Epoch 93, CIFAR-10 Batch 1:  Loss: 0.9596959948539734
Accuracy: 0.5406000018119812
Epoch 94, CIFAR-10 Batch 1:  Loss: 0.9814890027046204
Accuracy: 0.5368000268936157
Epoch 95, CIFAR-10 Batch 1:  Loss: 0.9572793841362
Accuracy: 0.5388000011444092
Epoch 96, CIFAR-10 Batch 1:  Loss: 0.9639244079589844
Accuracy: 0.5401999950408936
Epoch 97, CIFAR-10 Batch 1:  Loss: 0.9518283605575562
Accuracy: 0.5383999943733215
Epoch 98, CIFAR-10 Batch 1:  Loss: 0.9782615900039673
Accuracy: 0.5357999801635742
Epoch 99, CIFAR-10 Batch 1:  Loss: 0.9753526449203491
Accuracy: 0.5472000241279602
Epoch 100, CIFAR-10 Batch 1:  Loss: 0.9815242886543274
Accuracy: 0.5396000146865845

Fully Train the Model

Now that you got a good accuracy with a single CIFAR-10 batch, try it with all five batches.


In [17]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
save_model_path = './image_classification'

print('Training...')
with tf.Session() as sess:
    # Initializing the variables
    sess.run(tf.global_variables_initializer())
    
    # Training cycle
    for epoch in range(epochs):
        # Loop over all batches
        n_batches = 5
        for batch_i in range(1, n_batches + 1):
            for batch_features, batch_labels in helper.load_preprocess_training_batch(batch_i, batch_size):
                train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)
            print('Epoch {:>2}, CIFAR-10 Batch {}:  '.format(epoch + 1, batch_i), end='')
            print_stats(sess, batch_features, batch_labels, cost, accuracy)
            
    # Save Model
    saver = tf.train.Saver()
    save_path = saver.save(sess, save_model_path)


Training...
Epoch  1, CIFAR-10 Batch 1:  Loss: 2.289694309234619
Accuracy: 0.1501999944448471
Epoch  1, CIFAR-10 Batch 2:  Loss: 2.188994884490967
Accuracy: 0.1754000037908554
Epoch  1, CIFAR-10 Batch 3:  Loss: 2.076352119445801
Accuracy: 0.18619999289512634
Epoch  1, CIFAR-10 Batch 4:  Loss: 2.013221263885498
Accuracy: 0.19300000369548798
Epoch  1, CIFAR-10 Batch 5:  Loss: 2.11391019821167
Accuracy: 0.20020000636577606
Epoch  2, CIFAR-10 Batch 1:  Loss: 2.182705879211426
Accuracy: 0.24879999458789825
Epoch  2, CIFAR-10 Batch 2:  Loss: 2.033438205718994
Accuracy: 0.2524000108242035
Epoch  2, CIFAR-10 Batch 3:  Loss: 1.7980804443359375
Accuracy: 0.2635999917984009
Epoch  2, CIFAR-10 Batch 4:  Loss: 1.7772022485733032
Accuracy: 0.301800012588501
Epoch  2, CIFAR-10 Batch 5:  Loss: 1.7278356552124023
Accuracy: 0.31119999289512634
Epoch  3, CIFAR-10 Batch 1:  Loss: 2.0143818855285645
Accuracy: 0.32019999623298645
Epoch  3, CIFAR-10 Batch 2:  Loss: 1.8045246601104736
Accuracy: 0.3222000002861023
Epoch  3, CIFAR-10 Batch 3:  Loss: 1.6512197256088257
Accuracy: 0.3264000117778778
Epoch  3, CIFAR-10 Batch 4:  Loss: 1.7045538425445557
Accuracy: 0.3393999934196472
Epoch  3, CIFAR-10 Batch 5:  Loss: 1.6201496124267578
Accuracy: 0.335999995470047
Epoch  4, CIFAR-10 Batch 1:  Loss: 1.9391419887542725
Accuracy: 0.3587999939918518
Epoch  4, CIFAR-10 Batch 2:  Loss: 1.6838080883026123
Accuracy: 0.37619999051094055
Epoch  4, CIFAR-10 Batch 3:  Loss: 1.5394046306610107
Accuracy: 0.41519999504089355
Epoch  4, CIFAR-10 Batch 4:  Loss: 1.5669333934783936
Accuracy: 0.42399999499320984
Epoch  4, CIFAR-10 Batch 5:  Loss: 1.4757850170135498
Accuracy: 0.4293999969959259
Epoch  5, CIFAR-10 Batch 1:  Loss: 1.8049408197402954
Accuracy: 0.43880000710487366
Epoch  5, CIFAR-10 Batch 2:  Loss: 1.4893085956573486
Accuracy: 0.45239999890327454
Epoch  5, CIFAR-10 Batch 3:  Loss: 1.317798137664795
Accuracy: 0.4496000111103058
Epoch  5, CIFAR-10 Batch 4:  Loss: 1.4863989353179932
Accuracy: 0.4636000096797943
Epoch  5, CIFAR-10 Batch 5:  Loss: 1.389000415802002
Accuracy: 0.46239998936653137
Epoch  6, CIFAR-10 Batch 1:  Loss: 1.7684777975082397
Accuracy: 0.4765999913215637
Epoch  6, CIFAR-10 Batch 2:  Loss: 1.3891499042510986
Accuracy: 0.4724000096321106
Epoch  6, CIFAR-10 Batch 3:  Loss: 1.2260068655014038
Accuracy: 0.46160000562667847
Epoch  6, CIFAR-10 Batch 4:  Loss: 1.4114367961883545
Accuracy: 0.483599990606308
Epoch  6, CIFAR-10 Batch 5:  Loss: 1.3426716327667236
Accuracy: 0.477400004863739
Epoch  7, CIFAR-10 Batch 1:  Loss: 1.7295799255371094
Accuracy: 0.4830000102519989
Epoch  7, CIFAR-10 Batch 2:  Loss: 1.3349982500076294
Accuracy: 0.49320000410079956
Epoch  7, CIFAR-10 Batch 3:  Loss: 1.1981823444366455
Accuracy: 0.483599990606308
Epoch  7, CIFAR-10 Batch 4:  Loss: 1.4050829410552979
Accuracy: 0.49219998717308044
Epoch  7, CIFAR-10 Batch 5:  Loss: 1.3448997735977173
Accuracy: 0.49559998512268066
Epoch  8, CIFAR-10 Batch 1:  Loss: 1.6731736660003662
Accuracy: 0.49559998512268066
Epoch  8, CIFAR-10 Batch 2:  Loss: 1.2832739353179932
Accuracy: 0.501800000667572
Epoch  8, CIFAR-10 Batch 3:  Loss: 1.1718207597732544
Accuracy: 0.4950000047683716
Epoch  8, CIFAR-10 Batch 4:  Loss: 1.3875067234039307
Accuracy: 0.5049999952316284
Epoch  8, CIFAR-10 Batch 5:  Loss: 1.3087221384048462
Accuracy: 0.5045999884605408
Epoch  9, CIFAR-10 Batch 1:  Loss: 1.6494659185409546
Accuracy: 0.5049999952316284
Epoch  9, CIFAR-10 Batch 2:  Loss: 1.2185858488082886
Accuracy: 0.5130000114440918
Epoch  9, CIFAR-10 Batch 3:  Loss: 1.1324760913848877
Accuracy: 0.5001999735832214
Epoch  9, CIFAR-10 Batch 4:  Loss: 1.3718808889389038
Accuracy: 0.5135999917984009
Epoch  9, CIFAR-10 Batch 5:  Loss: 1.3067731857299805
Accuracy: 0.5073999762535095
Epoch 10, CIFAR-10 Batch 1:  Loss: 1.639012336730957
Accuracy: 0.5005999803543091
Epoch 10, CIFAR-10 Batch 2:  Loss: 1.2312122583389282
Accuracy: 0.5152000188827515
Epoch 10, CIFAR-10 Batch 3:  Loss: 1.0965873003005981
Accuracy: 0.5073999762535095
Epoch 10, CIFAR-10 Batch 4:  Loss: 1.3839694261550903
Accuracy: 0.5185999870300293
Epoch 10, CIFAR-10 Batch 5:  Loss: 1.310805082321167
Accuracy: 0.5184000134468079
Epoch 11, CIFAR-10 Batch 1:  Loss: 1.58930504322052
Accuracy: 0.522599995136261
Epoch 11, CIFAR-10 Batch 2:  Loss: 1.2087900638580322
Accuracy: 0.520799994468689
Epoch 11, CIFAR-10 Batch 3:  Loss: 1.0571773052215576
Accuracy: 0.5194000005722046
Epoch 11, CIFAR-10 Batch 4:  Loss: 1.368760108947754
Accuracy: 0.5266000032424927
Epoch 11, CIFAR-10 Batch 5:  Loss: 1.3091150522232056
Accuracy: 0.5249999761581421
Epoch 12, CIFAR-10 Batch 1:  Loss: 1.6133372783660889
Accuracy: 0.5230000019073486
Epoch 12, CIFAR-10 Batch 2:  Loss: 1.1852424144744873
Accuracy: 0.526199996471405
Epoch 12, CIFAR-10 Batch 3:  Loss: 1.062178373336792
Accuracy: 0.5303999781608582
Epoch 12, CIFAR-10 Batch 4:  Loss: 1.3904449939727783
Accuracy: 0.5264000296592712
Epoch 12, CIFAR-10 Batch 5:  Loss: 1.276389479637146
Accuracy: 0.5266000032424927
Epoch 13, CIFAR-10 Batch 1:  Loss: 1.571345567703247
Accuracy: 0.5324000120162964
Epoch 13, CIFAR-10 Batch 2:  Loss: 1.2096132040023804
Accuracy: 0.5303999781608582
Epoch 13, CIFAR-10 Batch 3:  Loss: 1.0556056499481201
Accuracy: 0.5317999720573425
Epoch 13, CIFAR-10 Batch 4:  Loss: 1.3744007349014282
Accuracy: 0.5347999930381775
Epoch 13, CIFAR-10 Batch 5:  Loss: 1.2782667875289917
Accuracy: 0.5307999849319458
Epoch 14, CIFAR-10 Batch 1:  Loss: 1.5853941440582275
Accuracy: 0.5299999713897705
Epoch 14, CIFAR-10 Batch 2:  Loss: 1.1840918064117432
Accuracy: 0.5267999768257141
Epoch 14, CIFAR-10 Batch 3:  Loss: 1.024600625038147
Accuracy: 0.5357999801635742
Epoch 14, CIFAR-10 Batch 4:  Loss: 1.355432152748108
Accuracy: 0.5357999801635742
Epoch 14, CIFAR-10 Batch 5:  Loss: 1.2858715057373047
Accuracy: 0.5347999930381775
Epoch 15, CIFAR-10 Batch 1:  Loss: 1.557241439819336
Accuracy: 0.5339999794960022
Epoch 15, CIFAR-10 Batch 2:  Loss: 1.163277506828308
Accuracy: 0.5365999937057495
Epoch 15, CIFAR-10 Batch 3:  Loss: 1.0287187099456787
Accuracy: 0.5375999808311462
Epoch 15, CIFAR-10 Batch 4:  Loss: 1.3685358762741089
Accuracy: 0.5364000201225281
Epoch 15, CIFAR-10 Batch 5:  Loss: 1.292654275894165
Accuracy: 0.5356000065803528
Epoch 16, CIFAR-10 Batch 1:  Loss: 1.5397714376449585
Accuracy: 0.5353999733924866
Epoch 16, CIFAR-10 Batch 2:  Loss: 1.160125970840454
Accuracy: 0.5440000295639038
Epoch 16, CIFAR-10 Batch 3:  Loss: 1.0007879734039307
Accuracy: 0.5386000275611877
Epoch 16, CIFAR-10 Batch 4:  Loss: 1.3607996702194214
Accuracy: 0.5404000282287598
Epoch 16, CIFAR-10 Batch 5:  Loss: 1.3055522441864014
Accuracy: 0.5407999753952026
Epoch 17, CIFAR-10 Batch 1:  Loss: 1.5236061811447144
Accuracy: 0.5378000140190125
Epoch 17, CIFAR-10 Batch 2:  Loss: 1.185625433921814
Accuracy: 0.5410000085830688
Epoch 17, CIFAR-10 Batch 3:  Loss: 1.0219465494155884
Accuracy: 0.5428000092506409
Epoch 17, CIFAR-10 Batch 4:  Loss: 1.374309778213501
Accuracy: 0.5496000051498413
Epoch 17, CIFAR-10 Batch 5:  Loss: 1.2998708486557007
Accuracy: 0.5493999719619751
Epoch 18, CIFAR-10 Batch 1:  Loss: 1.538604974746704
Accuracy: 0.5422000288963318
Epoch 18, CIFAR-10 Batch 2:  Loss: 1.1985177993774414
Accuracy: 0.5432000160217285
Epoch 18, CIFAR-10 Batch 3:  Loss: 0.9925235509872437
Accuracy: 0.5404000282287598
Epoch 18, CIFAR-10 Batch 4:  Loss: 1.386903166770935
Accuracy: 0.5411999821662903
Epoch 18, CIFAR-10 Batch 5:  Loss: 1.264367699623108
Accuracy: 0.5465999841690063
Epoch 19, CIFAR-10 Batch 1:  Loss: 1.5285420417785645
Accuracy: 0.5454000234603882
Epoch 19, CIFAR-10 Batch 2:  Loss: 1.1983193159103394
Accuracy: 0.5472000241279602
Epoch 19, CIFAR-10 Batch 3:  Loss: 0.9956730604171753
Accuracy: 0.5447999835014343
Epoch 19, CIFAR-10 Batch 4:  Loss: 1.3502728939056396
Accuracy: 0.5468000173568726
Epoch 19, CIFAR-10 Batch 5:  Loss: 1.2957499027252197
Accuracy: 0.5437999963760376
Epoch 20, CIFAR-10 Batch 1:  Loss: 1.5040233135223389
Accuracy: 0.5393999814987183
Epoch 20, CIFAR-10 Batch 2:  Loss: 1.190004587173462
Accuracy: 0.5464000105857849
Epoch 20, CIFAR-10 Batch 3:  Loss: 0.9781597256660461
Accuracy: 0.5465999841690063
Epoch 20, CIFAR-10 Batch 4:  Loss: 1.345773458480835
Accuracy: 0.5493999719619751
Epoch 20, CIFAR-10 Batch 5:  Loss: 1.2445555925369263
Accuracy: 0.5442000031471252
Epoch 21, CIFAR-10 Batch 1:  Loss: 1.4876501560211182
Accuracy: 0.5447999835014343
Epoch 21, CIFAR-10 Batch 2:  Loss: 1.175615906715393
Accuracy: 0.5523999929428101
Epoch 21, CIFAR-10 Batch 3:  Loss: 0.9786592721939087
Accuracy: 0.5472000241279602
Epoch 21, CIFAR-10 Batch 4:  Loss: 1.3396875858306885
Accuracy: 0.5529999732971191
Epoch 21, CIFAR-10 Batch 5:  Loss: 1.2617127895355225
Accuracy: 0.5436000227928162
Epoch 22, CIFAR-10 Batch 1:  Loss: 1.455749750137329
Accuracy: 0.5429999828338623
Epoch 22, CIFAR-10 Batch 2:  Loss: 1.1859169006347656
Accuracy: 0.5468000173568726
Epoch 22, CIFAR-10 Batch 3:  Loss: 0.9814190864562988
Accuracy: 0.546999990940094
Epoch 22, CIFAR-10 Batch 4:  Loss: 1.348110318183899
Accuracy: 0.5540000200271606
Epoch 22, CIFAR-10 Batch 5:  Loss: 1.2434375286102295
Accuracy: 0.5564000010490417
Epoch 23, CIFAR-10 Batch 1:  Loss: 1.4702229499816895
Accuracy: 0.54339998960495
Epoch 23, CIFAR-10 Batch 2:  Loss: 1.18134605884552
Accuracy: 0.550599992275238
Epoch 23, CIFAR-10 Batch 3:  Loss: 0.984291672706604
Accuracy: 0.5501999855041504
Epoch 23, CIFAR-10 Batch 4:  Loss: 1.3507661819458008
Accuracy: 0.5529999732971191
Epoch 23, CIFAR-10 Batch 5:  Loss: 1.2606604099273682
Accuracy: 0.550599992275238
Epoch 24, CIFAR-10 Batch 1:  Loss: 1.4559987783432007
Accuracy: 0.5546000003814697
Epoch 24, CIFAR-10 Batch 2:  Loss: 1.1657764911651611
Accuracy: 0.5523999929428101
Epoch 24, CIFAR-10 Batch 3:  Loss: 0.9755827784538269
Accuracy: 0.5532000064849854
Epoch 24, CIFAR-10 Batch 4:  Loss: 1.3542711734771729
Accuracy: 0.550599992275238
Epoch 24, CIFAR-10 Batch 5:  Loss: 1.252440094947815
Accuracy: 0.5547999739646912
Epoch 25, CIFAR-10 Batch 1:  Loss: 1.4653303623199463
Accuracy: 0.5537999868392944
Epoch 25, CIFAR-10 Batch 2:  Loss: 1.192011833190918
Accuracy: 0.5559999942779541
Epoch 25, CIFAR-10 Batch 3:  Loss: 0.954932689666748
Accuracy: 0.5479999780654907
Epoch 25, CIFAR-10 Batch 4:  Loss: 1.3247604370117188
Accuracy: 0.5594000220298767
Epoch 25, CIFAR-10 Batch 5:  Loss: 1.256906509399414
Accuracy: 0.5529999732971191
Epoch 26, CIFAR-10 Batch 1:  Loss: 1.4316710233688354
Accuracy: 0.5501999855041504
Epoch 26, CIFAR-10 Batch 2:  Loss: 1.1859978437423706
Accuracy: 0.548799991607666
Epoch 26, CIFAR-10 Batch 3:  Loss: 0.9716545939445496
Accuracy: 0.5541999936103821
Epoch 26, CIFAR-10 Batch 4:  Loss: 1.324777364730835
Accuracy: 0.5622000098228455
Epoch 26, CIFAR-10 Batch 5:  Loss: 1.2659410238265991
Accuracy: 0.557200014591217
Epoch 27, CIFAR-10 Batch 1:  Loss: 1.4426225423812866
Accuracy: 0.5551999807357788
Epoch 27, CIFAR-10 Batch 2:  Loss: 1.1646335124969482
Accuracy: 0.5558000206947327
Epoch 27, CIFAR-10 Batch 3:  Loss: 0.9546213150024414
Accuracy: 0.5562000274658203
Epoch 27, CIFAR-10 Batch 4:  Loss: 1.3371195793151855
Accuracy: 0.5627999901771545
Epoch 27, CIFAR-10 Batch 5:  Loss: 1.2665328979492188
Accuracy: 0.5526000261306763
Epoch 28, CIFAR-10 Batch 1:  Loss: 1.4394363164901733
Accuracy: 0.557200014591217
Epoch 28, CIFAR-10 Batch 2:  Loss: 1.1764177083969116
Accuracy: 0.5540000200271606
Epoch 28, CIFAR-10 Batch 3:  Loss: 0.9711286425590515
Accuracy: 0.5576000213623047
Epoch 28, CIFAR-10 Batch 4:  Loss: 1.3251689672470093
Accuracy: 0.5640000104904175
Epoch 28, CIFAR-10 Batch 5:  Loss: 1.2453467845916748
Accuracy: 0.5583999752998352
Epoch 29, CIFAR-10 Batch 1:  Loss: 1.4097025394439697
Accuracy: 0.5605999827384949
Epoch 29, CIFAR-10 Batch 2:  Loss: 1.1813398599624634
Accuracy: 0.5562000274658203
Epoch 29, CIFAR-10 Batch 3:  Loss: 0.9666585922241211
Accuracy: 0.5522000193595886
Epoch 29, CIFAR-10 Batch 4:  Loss: 1.3309863805770874
Accuracy: 0.5619999766349792
Epoch 29, CIFAR-10 Batch 5:  Loss: 1.2696785926818848
Accuracy: 0.5627999901771545
Epoch 30, CIFAR-10 Batch 1:  Loss: 1.3993594646453857
Accuracy: 0.5612000226974487
Epoch 30, CIFAR-10 Batch 2:  Loss: 1.2098021507263184
Accuracy: 0.5605999827384949
Epoch 30, CIFAR-10 Batch 3:  Loss: 0.9556706547737122
Accuracy: 0.5631999969482422
Epoch 30, CIFAR-10 Batch 4:  Loss: 1.3159751892089844
Accuracy: 0.569599986076355
Epoch 30, CIFAR-10 Batch 5:  Loss: 1.231102705001831
Accuracy: 0.5609999895095825
Epoch 31, CIFAR-10 Batch 1:  Loss: 1.3963284492492676
Accuracy: 0.5645999908447266
Epoch 31, CIFAR-10 Batch 2:  Loss: 1.2079576253890991
Accuracy: 0.5636000037193298
Epoch 31, CIFAR-10 Batch 3:  Loss: 0.9661312103271484
Accuracy: 0.5644000172615051
Epoch 31, CIFAR-10 Batch 4:  Loss: 1.3218380212783813
Accuracy: 0.5699999928474426
Epoch 31, CIFAR-10 Batch 5:  Loss: 1.2237200736999512
Accuracy: 0.5640000104904175
Epoch 32, CIFAR-10 Batch 1:  Loss: 1.3498948812484741
Accuracy: 0.5651999711990356
Epoch 32, CIFAR-10 Batch 2:  Loss: 1.2709853649139404
Accuracy: 0.5622000098228455
Epoch 32, CIFAR-10 Batch 3:  Loss: 0.9507255554199219
Accuracy: 0.5641999840736389
Epoch 32, CIFAR-10 Batch 4:  Loss: 1.2989035844802856
Accuracy: 0.5726000070571899
Epoch 32, CIFAR-10 Batch 5:  Loss: 1.2111210823059082
Accuracy: 0.5676000118255615
Epoch 33, CIFAR-10 Batch 1:  Loss: 1.3797551393508911
Accuracy: 0.5612000226974487
Epoch 33, CIFAR-10 Batch 2:  Loss: 1.2270694971084595
Accuracy: 0.5663999915122986
Epoch 33, CIFAR-10 Batch 3:  Loss: 0.9473758935928345
Accuracy: 0.5644000172615051
Epoch 33, CIFAR-10 Batch 4:  Loss: 1.298370122909546
Accuracy: 0.567799985408783
Epoch 33, CIFAR-10 Batch 5:  Loss: 1.207257628440857
Accuracy: 0.5612000226974487
Epoch 34, CIFAR-10 Batch 1:  Loss: 1.3812787532806396
Accuracy: 0.5663999915122986
Epoch 34, CIFAR-10 Batch 2:  Loss: 1.2431766986846924
Accuracy: 0.5622000098228455
Epoch 34, CIFAR-10 Batch 3:  Loss: 0.924656867980957
Accuracy: 0.567799985408783
Epoch 34, CIFAR-10 Batch 4:  Loss: 1.2727382183074951
Accuracy: 0.5702000260353088
Epoch 34, CIFAR-10 Batch 5:  Loss: 1.2032194137573242
Accuracy: 0.5631999969482422
Epoch 35, CIFAR-10 Batch 1:  Loss: 1.3390979766845703
Accuracy: 0.5680000185966492
Epoch 35, CIFAR-10 Batch 2:  Loss: 1.1994892358779907
Accuracy: 0.5631999969482422
Epoch 35, CIFAR-10 Batch 3:  Loss: 0.9454838037490845
Accuracy: 0.5676000118255615
Epoch 35, CIFAR-10 Batch 4:  Loss: 1.260249376296997
Accuracy: 0.5684000253677368
Epoch 35, CIFAR-10 Batch 5:  Loss: 1.1813695430755615
Accuracy: 0.5644000172615051
Epoch 36, CIFAR-10 Batch 1:  Loss: 1.3461495637893677
Accuracy: 0.5622000098228455
Epoch 36, CIFAR-10 Batch 2:  Loss: 1.2547309398651123
Accuracy: 0.5669999718666077
Epoch 36, CIFAR-10 Batch 3:  Loss: 0.9291908144950867
Accuracy: 0.5699999928474426
Epoch 36, CIFAR-10 Batch 4:  Loss: 1.2406902313232422
Accuracy: 0.5730000138282776
Epoch 36, CIFAR-10 Batch 5:  Loss: 1.1941006183624268
Accuracy: 0.5618000030517578
Epoch 37, CIFAR-10 Batch 1:  Loss: 1.3540441989898682
Accuracy: 0.5680000185966492
Epoch 37, CIFAR-10 Batch 2:  Loss: 1.2231560945510864
Accuracy: 0.569599986076355
Epoch 37, CIFAR-10 Batch 3:  Loss: 0.9223127365112305
Accuracy: 0.5687999725341797
Epoch 37, CIFAR-10 Batch 4:  Loss: 1.2487460374832153
Accuracy: 0.5720000267028809
Epoch 37, CIFAR-10 Batch 5:  Loss: 1.1853621006011963
Accuracy: 0.5655999779701233
Epoch 38, CIFAR-10 Batch 1:  Loss: 1.3321201801300049
Accuracy: 0.5672000050544739
Epoch 38, CIFAR-10 Batch 2:  Loss: 1.239479899406433
Accuracy: 0.5672000050544739
Epoch 38, CIFAR-10 Batch 3:  Loss: 0.9240590929985046
Accuracy: 0.5705999732017517
Epoch 38, CIFAR-10 Batch 4:  Loss: 1.2213798761367798
Accuracy: 0.5727999806404114
Epoch 38, CIFAR-10 Batch 5:  Loss: 1.1784662008285522
Accuracy: 0.5666000247001648
Epoch 39, CIFAR-10 Batch 1:  Loss: 1.3237441778182983
Accuracy: 0.569599986076355
Epoch 39, CIFAR-10 Batch 2:  Loss: 1.2443535327911377
Accuracy: 0.5676000118255615
Epoch 39, CIFAR-10 Batch 3:  Loss: 0.909323513507843
Accuracy: 0.5698000192642212
Epoch 39, CIFAR-10 Batch 4:  Loss: 1.1959213018417358
Accuracy: 0.578000009059906
Epoch 39, CIFAR-10 Batch 5:  Loss: 1.181995153427124
Accuracy: 0.5735999941825867
Epoch 40, CIFAR-10 Batch 1:  Loss: 1.318804383277893
Accuracy: 0.5722000002861023
Epoch 40, CIFAR-10 Batch 2:  Loss: 1.2255892753601074
Accuracy: 0.574999988079071
Epoch 40, CIFAR-10 Batch 3:  Loss: 0.9379520416259766
Accuracy: 0.5722000002861023
Epoch 40, CIFAR-10 Batch 4:  Loss: 1.205370306968689
Accuracy: 0.5784000158309937
Epoch 40, CIFAR-10 Batch 5:  Loss: 1.1687471866607666
Accuracy: 0.5723999738693237
Epoch 41, CIFAR-10 Batch 1:  Loss: 1.3197507858276367
Accuracy: 0.5794000029563904
Epoch 41, CIFAR-10 Batch 2:  Loss: 1.1863702535629272
Accuracy: 0.5735999941825867
Epoch 41, CIFAR-10 Batch 3:  Loss: 0.9208347201347351
Accuracy: 0.5723999738693237
Epoch 41, CIFAR-10 Batch 4:  Loss: 1.2058547735214233
Accuracy: 0.5712000131607056
Epoch 41, CIFAR-10 Batch 5:  Loss: 1.1723811626434326
Accuracy: 0.5699999928474426
Epoch 42, CIFAR-10 Batch 1:  Loss: 1.3081728219985962
Accuracy: 0.5766000151634216
Epoch 42, CIFAR-10 Batch 2:  Loss: 1.2079367637634277
Accuracy: 0.5748000144958496
Epoch 42, CIFAR-10 Batch 3:  Loss: 0.914594829082489
Accuracy: 0.5735999941825867
Epoch 42, CIFAR-10 Batch 4:  Loss: 1.1780818700790405
Accuracy: 0.571399986743927
Epoch 42, CIFAR-10 Batch 5:  Loss: 1.1615631580352783
Accuracy: 0.5727999806404114
Epoch 43, CIFAR-10 Batch 1:  Loss: 1.3284595012664795
Accuracy: 0.5756000280380249
Epoch 43, CIFAR-10 Batch 2:  Loss: 1.2206687927246094
Accuracy: 0.5709999799728394
Epoch 43, CIFAR-10 Batch 3:  Loss: 0.9229222536087036
Accuracy: 0.5748000144958496
Epoch 43, CIFAR-10 Batch 4:  Loss: 1.1786258220672607
Accuracy: 0.5777999758720398
Epoch 43, CIFAR-10 Batch 5:  Loss: 1.1735827922821045
Accuracy: 0.5722000002861023
Epoch 44, CIFAR-10 Batch 1:  Loss: 1.3114326000213623
Accuracy: 0.5784000158309937
Epoch 44, CIFAR-10 Batch 2:  Loss: 1.2117141485214233
Accuracy: 0.5763999819755554
Epoch 44, CIFAR-10 Batch 3:  Loss: 0.9068523645401001
Accuracy: 0.5771999955177307
Epoch 44, CIFAR-10 Batch 4:  Loss: 1.1455438137054443
Accuracy: 0.5784000158309937
Epoch 44, CIFAR-10 Batch 5:  Loss: 1.169466257095337
Accuracy: 0.5745999813079834
Epoch 45, CIFAR-10 Batch 1:  Loss: 1.301418662071228
Accuracy: 0.5789999961853027
Epoch 45, CIFAR-10 Batch 2:  Loss: 1.1655027866363525
Accuracy: 0.5716000199317932
Epoch 45, CIFAR-10 Batch 3:  Loss: 0.8974682092666626
Accuracy: 0.5788000226020813
Epoch 45, CIFAR-10 Batch 4:  Loss: 1.1240853071212769
Accuracy: 0.5788000226020813
Epoch 45, CIFAR-10 Batch 5:  Loss: 1.1892826557159424
Accuracy: 0.5771999955177307
Epoch 46, CIFAR-10 Batch 1:  Loss: 1.283557653427124
Accuracy: 0.5745999813079834
Epoch 46, CIFAR-10 Batch 2:  Loss: 1.1965367794036865
Accuracy: 0.5767999887466431
Epoch 46, CIFAR-10 Batch 3:  Loss: 0.8972219228744507
Accuracy: 0.5727999806404114
Epoch 46, CIFAR-10 Batch 4:  Loss: 1.150972604751587
Accuracy: 0.5784000158309937
Epoch 46, CIFAR-10 Batch 5:  Loss: 1.145589828491211
Accuracy: 0.5806000232696533
Epoch 47, CIFAR-10 Batch 1:  Loss: 1.2890647649765015
Accuracy: 0.5758000016212463
Epoch 47, CIFAR-10 Batch 2:  Loss: 1.2051160335540771
Accuracy: 0.5785999894142151
Epoch 47, CIFAR-10 Batch 3:  Loss: 0.9228968620300293
Accuracy: 0.5738000273704529
Epoch 47, CIFAR-10 Batch 4:  Loss: 1.1327531337738037
Accuracy: 0.579800009727478
Epoch 47, CIFAR-10 Batch 5:  Loss: 1.1428756713867188
Accuracy: 0.5820000171661377
Epoch 48, CIFAR-10 Batch 1:  Loss: 1.2830226421356201
Accuracy: 0.5785999894142151
Epoch 48, CIFAR-10 Batch 2:  Loss: 1.1573359966278076
Accuracy: 0.5812000036239624
Epoch 48, CIFAR-10 Batch 3:  Loss: 0.8801666498184204
Accuracy: 0.5766000151634216
Epoch 48, CIFAR-10 Batch 4:  Loss: 1.1254746913909912
Accuracy: 0.5838000178337097
Epoch 48, CIFAR-10 Batch 5:  Loss: 1.179443597793579
Accuracy: 0.5770000219345093
Epoch 49, CIFAR-10 Batch 1:  Loss: 1.2736930847167969
Accuracy: 0.5817999839782715
Epoch 49, CIFAR-10 Batch 2:  Loss: 1.1909840106964111
Accuracy: 0.5794000029563904
Epoch 49, CIFAR-10 Batch 3:  Loss: 0.8843674659729004
Accuracy: 0.5767999887466431
Epoch 49, CIFAR-10 Batch 4:  Loss: 1.1339216232299805
Accuracy: 0.5817999839782715
Epoch 49, CIFAR-10 Batch 5:  Loss: 1.1526000499725342
Accuracy: 0.5834000110626221
Epoch 50, CIFAR-10 Batch 1:  Loss: 1.2931421995162964
Accuracy: 0.5830000042915344
Epoch 50, CIFAR-10 Batch 2:  Loss: 1.1671442985534668
Accuracy: 0.5806000232696533
Epoch 50, CIFAR-10 Batch 3:  Loss: 0.8956335186958313
Accuracy: 0.5794000029563904
Epoch 50, CIFAR-10 Batch 4:  Loss: 1.1075713634490967
Accuracy: 0.5860000252723694
Epoch 50, CIFAR-10 Batch 5:  Loss: 1.1608600616455078
Accuracy: 0.5849999785423279
Epoch 51, CIFAR-10 Batch 1:  Loss: 1.2936923503875732
Accuracy: 0.5842000246047974
Epoch 51, CIFAR-10 Batch 2:  Loss: 1.1887719631195068
Accuracy: 0.5784000158309937
Epoch 51, CIFAR-10 Batch 3:  Loss: 0.8994476199150085
Accuracy: 0.5856000185012817
Epoch 51, CIFAR-10 Batch 4:  Loss: 1.1050289869308472
Accuracy: 0.5863999724388123
Epoch 51, CIFAR-10 Batch 5:  Loss: 1.1340734958648682
Accuracy: 0.5830000042915344
Epoch 52, CIFAR-10 Batch 1:  Loss: 1.317996621131897
Accuracy: 0.579800009727478
Epoch 52, CIFAR-10 Batch 2:  Loss: 1.1530681848526
Accuracy: 0.58160001039505
Epoch 52, CIFAR-10 Batch 3:  Loss: 0.8891971707344055
Accuracy: 0.5871999859809875
Epoch 52, CIFAR-10 Batch 4:  Loss: 1.1131904125213623
Accuracy: 0.5856000185012817
Epoch 52, CIFAR-10 Batch 5:  Loss: 1.1477655172348022
Accuracy: 0.5795999765396118
Epoch 53, CIFAR-10 Batch 1:  Loss: 1.3156611919403076
Accuracy: 0.5842000246047974
Epoch 53, CIFAR-10 Batch 2:  Loss: 1.1821526288986206
Accuracy: 0.5824000239372253
Epoch 53, CIFAR-10 Batch 3:  Loss: 0.8787368535995483
Accuracy: 0.5827999711036682
Epoch 53, CIFAR-10 Batch 4:  Loss: 1.128464937210083
Accuracy: 0.5874000191688538
Epoch 53, CIFAR-10 Batch 5:  Loss: 1.1266518831253052
Accuracy: 0.5809999704360962
Epoch 54, CIFAR-10 Batch 1:  Loss: 1.2582536935806274
Accuracy: 0.5856000185012817
Epoch 54, CIFAR-10 Batch 2:  Loss: 1.2020505666732788
Accuracy: 0.5853999853134155
Epoch 54, CIFAR-10 Batch 3:  Loss: 0.8998781442642212
Accuracy: 0.5875999927520752
Epoch 54, CIFAR-10 Batch 4:  Loss: 1.097459077835083
Accuracy: 0.5896000266075134
Epoch 54, CIFAR-10 Batch 5:  Loss: 1.168188214302063
Accuracy: 0.5834000110626221
Epoch 55, CIFAR-10 Batch 1:  Loss: 1.2922509908676147
Accuracy: 0.5870000123977661
Epoch 55, CIFAR-10 Batch 2:  Loss: 1.1627647876739502
Accuracy: 0.5817999839782715
Epoch 55, CIFAR-10 Batch 3:  Loss: 0.9106591939926147
Accuracy: 0.5820000171661377
Epoch 55, CIFAR-10 Batch 4:  Loss: 1.1048173904418945
Accuracy: 0.5861999988555908
Epoch 55, CIFAR-10 Batch 5:  Loss: 1.134143352508545
Accuracy: 0.5820000171661377
Epoch 56, CIFAR-10 Batch 1:  Loss: 1.2631312608718872
Accuracy: 0.5874000191688538
Epoch 56, CIFAR-10 Batch 2:  Loss: 1.1558765172958374
Accuracy: 0.5802000164985657
Epoch 56, CIFAR-10 Batch 3:  Loss: 0.9001428484916687
Accuracy: 0.5853999853134155
Epoch 56, CIFAR-10 Batch 4:  Loss: 1.1002365350723267
Accuracy: 0.5879999995231628
Epoch 56, CIFAR-10 Batch 5:  Loss: 1.1441389322280884
Accuracy: 0.5799999833106995
Epoch 57, CIFAR-10 Batch 1:  Loss: 1.2511593103408813
Accuracy: 0.5830000042915344
Epoch 57, CIFAR-10 Batch 2:  Loss: 1.1452850103378296
Accuracy: 0.5834000110626221
Epoch 57, CIFAR-10 Batch 3:  Loss: 0.8956989049911499
Accuracy: 0.5842000246047974
Epoch 57, CIFAR-10 Batch 4:  Loss: 1.1189491748809814
Accuracy: 0.5853999853134155
Epoch 57, CIFAR-10 Batch 5:  Loss: 1.1301971673965454
Accuracy: 0.5878000259399414
Epoch 58, CIFAR-10 Batch 1:  Loss: 1.275181770324707
Accuracy: 0.5839999914169312
Epoch 58, CIFAR-10 Batch 2:  Loss: 1.1668723821640015
Accuracy: 0.5821999907493591
Epoch 58, CIFAR-10 Batch 3:  Loss: 0.8944703936576843
Accuracy: 0.5831999778747559
Epoch 58, CIFAR-10 Batch 4:  Loss: 1.1040194034576416
Accuracy: 0.5892000198364258
Epoch 58, CIFAR-10 Batch 5:  Loss: 1.138244390487671
Accuracy: 0.5856000185012817
Epoch 59, CIFAR-10 Batch 1:  Loss: 1.2560052871704102
Accuracy: 0.5817999839782715
Epoch 59, CIFAR-10 Batch 2:  Loss: 1.156298279762268
Accuracy: 0.5852000117301941
Epoch 59, CIFAR-10 Batch 3:  Loss: 0.9064702987670898
Accuracy: 0.5906000137329102
Epoch 59, CIFAR-10 Batch 4:  Loss: 1.0769788026809692
Accuracy: 0.5849999785423279
Epoch 59, CIFAR-10 Batch 5:  Loss: 1.1613123416900635
Accuracy: 0.5845999717712402
Epoch 60, CIFAR-10 Batch 1:  Loss: 1.2490259408950806
Accuracy: 0.5817999839782715
Epoch 60, CIFAR-10 Batch 2:  Loss: 1.1320829391479492
Accuracy: 0.5856000185012817
Epoch 60, CIFAR-10 Batch 3:  Loss: 0.8946016430854797
Accuracy: 0.5866000056266785
Epoch 60, CIFAR-10 Batch 4:  Loss: 1.0815536975860596
Accuracy: 0.5881999731063843
Epoch 60, CIFAR-10 Batch 5:  Loss: 1.1499783992767334
Accuracy: 0.5861999988555908
Epoch 61, CIFAR-10 Batch 1:  Loss: 1.227595567703247
Accuracy: 0.5799999833106995
Epoch 61, CIFAR-10 Batch 2:  Loss: 1.1348378658294678
Accuracy: 0.5899999737739563
Epoch 61, CIFAR-10 Batch 3:  Loss: 0.892085075378418
Accuracy: 0.5874000191688538
Epoch 61, CIFAR-10 Batch 4:  Loss: 1.1001514196395874
Accuracy: 0.5889999866485596
Epoch 61, CIFAR-10 Batch 5:  Loss: 1.1493703126907349
Accuracy: 0.5848000049591064
Epoch 62, CIFAR-10 Batch 1:  Loss: 1.2775287628173828
Accuracy: 0.5834000110626221
Epoch 62, CIFAR-10 Batch 2:  Loss: 1.138907790184021
Accuracy: 0.5849999785423279
Epoch 62, CIFAR-10 Batch 3:  Loss: 0.8961292505264282
Accuracy: 0.5911999940872192
Epoch 62, CIFAR-10 Batch 4:  Loss: 1.0782195329666138
Accuracy: 0.5916000008583069
Epoch 62, CIFAR-10 Batch 5:  Loss: 1.1561216115951538
Accuracy: 0.5879999995231628
Epoch 63, CIFAR-10 Batch 1:  Loss: 1.2337573766708374
Accuracy: 0.5898000001907349
Epoch 63, CIFAR-10 Batch 2:  Loss: 1.1543058156967163
Accuracy: 0.5853999853134155
Epoch 63, CIFAR-10 Batch 3:  Loss: 0.8861093521118164
Accuracy: 0.5960000157356262
Epoch 63, CIFAR-10 Batch 4:  Loss: 1.1118013858795166
Accuracy: 0.5870000123977661
Epoch 63, CIFAR-10 Batch 5:  Loss: 1.1403605937957764
Accuracy: 0.5885999798774719
Epoch 64, CIFAR-10 Batch 1:  Loss: 1.2377307415008545
Accuracy: 0.5924000144004822
Epoch 64, CIFAR-10 Batch 2:  Loss: 1.1560369729995728
Accuracy: 0.5889999866485596
Epoch 64, CIFAR-10 Batch 3:  Loss: 0.9011387825012207
Accuracy: 0.5916000008583069
Epoch 64, CIFAR-10 Batch 4:  Loss: 1.0664538145065308
Accuracy: 0.5917999744415283
Epoch 64, CIFAR-10 Batch 5:  Loss: 1.147503137588501
Accuracy: 0.5838000178337097
Epoch 65, CIFAR-10 Batch 1:  Loss: 1.199785590171814
Accuracy: 0.5906000137329102
Epoch 65, CIFAR-10 Batch 2:  Loss: 1.1926242113113403
Accuracy: 0.5878000259399414
Epoch 65, CIFAR-10 Batch 3:  Loss: 0.8956875801086426
Accuracy: 0.59579998254776
Epoch 65, CIFAR-10 Batch 4:  Loss: 1.0719597339630127
Accuracy: 0.597000002861023
Epoch 65, CIFAR-10 Batch 5:  Loss: 1.1351608037948608
Accuracy: 0.5879999995231628
Epoch 66, CIFAR-10 Batch 1:  Loss: 1.2092726230621338
Accuracy: 0.592199981212616
Epoch 66, CIFAR-10 Batch 2:  Loss: 1.1259225606918335
Accuracy: 0.5920000076293945
Epoch 66, CIFAR-10 Batch 3:  Loss: 0.8805299997329712
Accuracy: 0.5906000137329102
Epoch 66, CIFAR-10 Batch 4:  Loss: 1.0462778806686401
Accuracy: 0.5960000157356262
Epoch 66, CIFAR-10 Batch 5:  Loss: 1.1328799724578857
Accuracy: 0.5896000266075134
Epoch 67, CIFAR-10 Batch 1:  Loss: 1.2223224639892578
Accuracy: 0.5950000286102295
Epoch 67, CIFAR-10 Batch 2:  Loss: 1.1555683612823486
Accuracy: 0.5956000089645386
Epoch 67, CIFAR-10 Batch 3:  Loss: 0.900708794593811
Accuracy: 0.5947999954223633
Epoch 67, CIFAR-10 Batch 4:  Loss: 1.0638208389282227
Accuracy: 0.5968000292778015
Epoch 67, CIFAR-10 Batch 5:  Loss: 1.114680290222168
Accuracy: 0.5965999960899353
Epoch 68, CIFAR-10 Batch 1:  Loss: 1.2030622959136963
Accuracy: 0.5932000279426575
Epoch 68, CIFAR-10 Batch 2:  Loss: 1.1620426177978516
Accuracy: 0.5870000123977661
Epoch 68, CIFAR-10 Batch 3:  Loss: 0.8940452337265015
Accuracy: 0.5942000150680542
Epoch 68, CIFAR-10 Batch 4:  Loss: 1.0512845516204834
Accuracy: 0.5964000225067139
Epoch 68, CIFAR-10 Batch 5:  Loss: 1.1094213724136353
Accuracy: 0.5943999886512756
Epoch 69, CIFAR-10 Batch 1:  Loss: 1.1914221048355103
Accuracy: 0.5920000076293945
Epoch 69, CIFAR-10 Batch 2:  Loss: 1.138350248336792
Accuracy: 0.592199981212616
Epoch 69, CIFAR-10 Batch 3:  Loss: 0.8907293081283569
Accuracy: 0.5956000089645386
Epoch 69, CIFAR-10 Batch 4:  Loss: 1.0258373022079468
Accuracy: 0.5956000089645386
Epoch 69, CIFAR-10 Batch 5:  Loss: 1.1121591329574585
Accuracy: 0.5910000205039978
Epoch 70, CIFAR-10 Batch 1:  Loss: 1.2133737802505493
Accuracy: 0.5885999798774719
Epoch 70, CIFAR-10 Batch 2:  Loss: 1.118607759475708
Accuracy: 0.5920000076293945
Epoch 70, CIFAR-10 Batch 3:  Loss: 0.8894081115722656
Accuracy: 0.5965999960899353
Epoch 70, CIFAR-10 Batch 4:  Loss: 1.0400207042694092
Accuracy: 0.59579998254776
Epoch 70, CIFAR-10 Batch 5:  Loss: 1.135514497756958
Accuracy: 0.5888000130653381
Epoch 71, CIFAR-10 Batch 1:  Loss: 1.231856346130371
Accuracy: 0.5946000218391418
Epoch 71, CIFAR-10 Batch 2:  Loss: 1.191916823387146
Accuracy: 0.5971999764442444
Epoch 71, CIFAR-10 Batch 3:  Loss: 0.8832408785820007
Accuracy: 0.5965999960899353
Epoch 71, CIFAR-10 Batch 4:  Loss: 1.0473988056182861
Accuracy: 0.5924000144004822
Epoch 71, CIFAR-10 Batch 5:  Loss: 1.0881214141845703
Accuracy: 0.5942000150680542
Epoch 72, CIFAR-10 Batch 1:  Loss: 1.2454761266708374
Accuracy: 0.5982000231742859
Epoch 72, CIFAR-10 Batch 2:  Loss: 1.1500980854034424
Accuracy: 0.5842000246047974
Epoch 72, CIFAR-10 Batch 3:  Loss: 0.885031521320343
Accuracy: 0.5979999899864197
Epoch 72, CIFAR-10 Batch 4:  Loss: 1.0403368473052979
Accuracy: 0.5978000164031982
Epoch 72, CIFAR-10 Batch 5:  Loss: 1.1167452335357666
Accuracy: 0.5906000137329102
Epoch 73, CIFAR-10 Batch 1:  Loss: 1.2130111455917358
Accuracy: 0.597000002861023
Epoch 73, CIFAR-10 Batch 2:  Loss: 1.1597375869750977
Accuracy: 0.593999981880188
Epoch 73, CIFAR-10 Batch 3:  Loss: 0.9038475155830383
Accuracy: 0.597599983215332
Epoch 73, CIFAR-10 Batch 4:  Loss: 1.0438258647918701
Accuracy: 0.6018000245094299
Epoch 73, CIFAR-10 Batch 5:  Loss: 1.116700530052185
Accuracy: 0.5932000279426575
Epoch 74, CIFAR-10 Batch 1:  Loss: 1.2350714206695557
Accuracy: 0.5943999886512756
Epoch 74, CIFAR-10 Batch 2:  Loss: 1.1456658840179443
Accuracy: 0.5884000062942505
Epoch 74, CIFAR-10 Batch 3:  Loss: 0.8832907676696777
Accuracy: 0.5982000231742859
Epoch 74, CIFAR-10 Batch 4:  Loss: 1.0573394298553467
Accuracy: 0.6001999974250793
Epoch 74, CIFAR-10 Batch 5:  Loss: 1.1000436544418335
Accuracy: 0.5860000252723694
Epoch 75, CIFAR-10 Batch 1:  Loss: 1.2217814922332764
Accuracy: 0.5916000008583069
Epoch 75, CIFAR-10 Batch 2:  Loss: 1.130324125289917
Accuracy: 0.5971999764442444
Epoch 75, CIFAR-10 Batch 3:  Loss: 0.8730513453483582
Accuracy: 0.6010000109672546
Epoch 75, CIFAR-10 Batch 4:  Loss: 1.0504423379898071
Accuracy: 0.5946000218391418
Epoch 75, CIFAR-10 Batch 5:  Loss: 1.1236560344696045
Accuracy: 0.5924000144004822
Epoch 76, CIFAR-10 Batch 1:  Loss: 1.2147471904754639
Accuracy: 0.592199981212616
Epoch 76, CIFAR-10 Batch 2:  Loss: 1.149590253829956
Accuracy: 0.5914000272750854
Epoch 76, CIFAR-10 Batch 3:  Loss: 0.8693532943725586
Accuracy: 0.5964000225067139
Epoch 76, CIFAR-10 Batch 4:  Loss: 1.04861581325531
Accuracy: 0.5971999764442444
Epoch 76, CIFAR-10 Batch 5:  Loss: 1.1271871328353882
Accuracy: 0.5928000211715698
Epoch 77, CIFAR-10 Batch 1:  Loss: 1.1969618797302246
Accuracy: 0.5907999873161316
Epoch 77, CIFAR-10 Batch 2:  Loss: 1.1461340188980103
Accuracy: 0.6000000238418579
Epoch 77, CIFAR-10 Batch 3:  Loss: 0.8881592750549316
Accuracy: 0.5971999764442444
Epoch 77, CIFAR-10 Batch 4:  Loss: 1.0667070150375366
Accuracy: 0.6018000245094299
Epoch 77, CIFAR-10 Batch 5:  Loss: 1.115246057510376
Accuracy: 0.5978000164031982
Epoch 78, CIFAR-10 Batch 1:  Loss: 1.2080217599868774
Accuracy: 0.5953999757766724
Epoch 78, CIFAR-10 Batch 2:  Loss: 1.1399543285369873
Accuracy: 0.5947999954223633
Epoch 78, CIFAR-10 Batch 3:  Loss: 0.873201847076416
Accuracy: 0.6003999710083008
Epoch 78, CIFAR-10 Batch 4:  Loss: 1.0313259363174438
Accuracy: 0.6018000245094299
Epoch 78, CIFAR-10 Batch 5:  Loss: 1.0950850248336792
Accuracy: 0.5952000021934509
Epoch 79, CIFAR-10 Batch 1:  Loss: 1.1933199167251587
Accuracy: 0.5956000089645386
Epoch 79, CIFAR-10 Batch 2:  Loss: 1.1355739831924438
Accuracy: 0.5932000279426575
Epoch 79, CIFAR-10 Batch 3:  Loss: 0.9036613702774048
Accuracy: 0.5947999954223633
Epoch 79, CIFAR-10 Batch 4:  Loss: 1.0334843397140503
Accuracy: 0.6032000184059143
Epoch 79, CIFAR-10 Batch 5:  Loss: 1.1158102750778198
Accuracy: 0.5929999947547913
Epoch 80, CIFAR-10 Batch 1:  Loss: 1.1954528093338013
Accuracy: 0.5965999960899353
Epoch 80, CIFAR-10 Batch 2:  Loss: 1.0797532796859741
Accuracy: 0.5971999764442444
Epoch 80, CIFAR-10 Batch 3:  Loss: 0.8931825757026672
Accuracy: 0.5992000102996826
Epoch 80, CIFAR-10 Batch 4:  Loss: 1.0367425680160522
Accuracy: 0.5961999893188477
Epoch 80, CIFAR-10 Batch 5:  Loss: 1.0904836654663086
Accuracy: 0.5947999954223633
Epoch 81, CIFAR-10 Batch 1:  Loss: 1.186044692993164
Accuracy: 0.5968000292778015
Epoch 81, CIFAR-10 Batch 2:  Loss: 1.1402157545089722
Accuracy: 0.6046000123023987
Epoch 81, CIFAR-10 Batch 3:  Loss: 0.9030699729919434
Accuracy: 0.598800003528595
Epoch 81, CIFAR-10 Batch 4:  Loss: 1.0324968099594116
Accuracy: 0.6014000177383423
Epoch 81, CIFAR-10 Batch 5:  Loss: 1.087310791015625
Accuracy: 0.5965999960899353
Epoch 82, CIFAR-10 Batch 1:  Loss: 1.2038624286651611
Accuracy: 0.5978000164031982
Epoch 82, CIFAR-10 Batch 2:  Loss: 1.1418027877807617
Accuracy: 0.598800003528595
Epoch 82, CIFAR-10 Batch 3:  Loss: 0.896294116973877
Accuracy: 0.598800003528595
Epoch 82, CIFAR-10 Batch 4:  Loss: 1.0405843257904053
Accuracy: 0.6019999980926514
Epoch 82, CIFAR-10 Batch 5:  Loss: 1.0800985097885132
Accuracy: 0.597000002861023
Epoch 83, CIFAR-10 Batch 1:  Loss: 1.2218376398086548
Accuracy: 0.6019999980926514
Epoch 83, CIFAR-10 Batch 2:  Loss: 1.158456802368164
Accuracy: 0.6007999777793884
Epoch 83, CIFAR-10 Batch 3:  Loss: 0.9073217511177063
Accuracy: 0.5979999899864197
Epoch 83, CIFAR-10 Batch 4:  Loss: 1.040824294090271
Accuracy: 0.6043999791145325
Epoch 83, CIFAR-10 Batch 5:  Loss: 1.0947048664093018
Accuracy: 0.5925999879837036
Epoch 84, CIFAR-10 Batch 1:  Loss: 1.238887071609497
Accuracy: 0.5974000096321106
Epoch 84, CIFAR-10 Batch 2:  Loss: 1.1304738521575928
Accuracy: 0.5899999737739563
Epoch 84, CIFAR-10 Batch 3:  Loss: 0.9112070798873901
Accuracy: 0.6032000184059143
Epoch 84, CIFAR-10 Batch 4:  Loss: 1.056429147720337
Accuracy: 0.6021999716758728
Epoch 84, CIFAR-10 Batch 5:  Loss: 1.075434923171997
Accuracy: 0.597000002861023
Epoch 85, CIFAR-10 Batch 1:  Loss: 1.1999670267105103
Accuracy: 0.597599983215332
Epoch 85, CIFAR-10 Batch 2:  Loss: 1.1848392486572266
Accuracy: 0.5952000021934509
Epoch 85, CIFAR-10 Batch 3:  Loss: 0.8772889375686646
Accuracy: 0.6029999852180481
Epoch 85, CIFAR-10 Batch 4:  Loss: 1.0487244129180908
Accuracy: 0.6025999784469604
Epoch 85, CIFAR-10 Batch 5:  Loss: 1.1084661483764648
Accuracy: 0.5982000231742859
Epoch 86, CIFAR-10 Batch 1:  Loss: 1.2462772130966187
Accuracy: 0.6068000197410583
Epoch 86, CIFAR-10 Batch 2:  Loss: 1.1419613361358643
Accuracy: 0.597599983215332
Epoch 86, CIFAR-10 Batch 3:  Loss: 0.8823696374893188
Accuracy: 0.6037999987602234
Epoch 86, CIFAR-10 Batch 4:  Loss: 1.043556809425354
Accuracy: 0.6028000116348267
Epoch 86, CIFAR-10 Batch 5:  Loss: 1.0983941555023193
Accuracy: 0.6039999723434448
Epoch 87, CIFAR-10 Batch 1:  Loss: 1.209115982055664
Accuracy: 0.5952000021934509
Epoch 87, CIFAR-10 Batch 2:  Loss: 1.1557810306549072
Accuracy: 0.597000002861023
Epoch 87, CIFAR-10 Batch 3:  Loss: 0.9126768112182617
Accuracy: 0.5996000170707703
Epoch 87, CIFAR-10 Batch 4:  Loss: 1.0358537435531616
Accuracy: 0.6033999919891357
Epoch 87, CIFAR-10 Batch 5:  Loss: 1.0752707719802856
Accuracy: 0.5985999703407288
Epoch 88, CIFAR-10 Batch 1:  Loss: 1.2148430347442627
Accuracy: 0.6010000109672546
Epoch 88, CIFAR-10 Batch 2:  Loss: 1.1274837255477905
Accuracy: 0.6011999845504761
Epoch 88, CIFAR-10 Batch 3:  Loss: 0.8993080854415894
Accuracy: 0.6032000184059143
Epoch 88, CIFAR-10 Batch 4:  Loss: 1.0428279638290405
Accuracy: 0.6043999791145325
Epoch 88, CIFAR-10 Batch 5:  Loss: 1.1113579273223877
Accuracy: 0.5997999906539917
Epoch 89, CIFAR-10 Batch 1:  Loss: 1.1769871711730957
Accuracy: 0.6046000123023987
Epoch 89, CIFAR-10 Batch 2:  Loss: 1.1208938360214233
Accuracy: 0.605400025844574
Epoch 89, CIFAR-10 Batch 3:  Loss: 0.8784118890762329
Accuracy: 0.6068000197410583
Epoch 89, CIFAR-10 Batch 4:  Loss: 1.0320669412612915
Accuracy: 0.605400025844574
Epoch 89, CIFAR-10 Batch 5:  Loss: 1.1166300773620605
Accuracy: 0.5924000144004822
Epoch 90, CIFAR-10 Batch 1:  Loss: 1.2118017673492432
Accuracy: 0.6014000177383423
Epoch 90, CIFAR-10 Batch 2:  Loss: 1.1075325012207031
Accuracy: 0.5942000150680542
Epoch 90, CIFAR-10 Batch 3:  Loss: 0.8857771158218384
Accuracy: 0.6047999858856201
Epoch 90, CIFAR-10 Batch 4:  Loss: 1.0602874755859375
Accuracy: 0.6015999913215637
Epoch 90, CIFAR-10 Batch 5:  Loss: 1.1073271036148071
Accuracy: 0.602400004863739
Epoch 91, CIFAR-10 Batch 1:  Loss: 1.177608847618103
Accuracy: 0.603600025177002
Epoch 91, CIFAR-10 Batch 2:  Loss: 1.1301766633987427
Accuracy: 0.6001999974250793
Epoch 91, CIFAR-10 Batch 3:  Loss: 0.8989594578742981
Accuracy: 0.605400025844574
Epoch 91, CIFAR-10 Batch 4:  Loss: 1.051249623298645
Accuracy: 0.6057999730110168
Epoch 91, CIFAR-10 Batch 5:  Loss: 1.0977349281311035
Accuracy: 0.6025999784469604
Epoch 92, CIFAR-10 Batch 1:  Loss: 1.2062702178955078
Accuracy: 0.6033999919891357
Epoch 92, CIFAR-10 Batch 2:  Loss: 1.1284927129745483
Accuracy: 0.6007999777793884
Epoch 92, CIFAR-10 Batch 3:  Loss: 0.8875082731246948
Accuracy: 0.6033999919891357
Epoch 92, CIFAR-10 Batch 4:  Loss: 1.0607483386993408
Accuracy: 0.6074000000953674
Epoch 92, CIFAR-10 Batch 5:  Loss: 1.1213315725326538
Accuracy: 0.6007999777793884
Epoch 93, CIFAR-10 Batch 1:  Loss: 1.206247329711914
Accuracy: 0.6028000116348267
Epoch 93, CIFAR-10 Batch 2:  Loss: 1.1601790189743042
Accuracy: 0.5932000279426575
Epoch 93, CIFAR-10 Batch 3:  Loss: 0.9006615877151489
Accuracy: 0.600600004196167
Epoch 93, CIFAR-10 Batch 4:  Loss: 1.0262424945831299
Accuracy: 0.6019999980926514
Epoch 93, CIFAR-10 Batch 5:  Loss: 1.1131587028503418
Accuracy: 0.5992000102996826
Epoch 94, CIFAR-10 Batch 1:  Loss: 1.1776139736175537
Accuracy: 0.6039999723434448
Epoch 94, CIFAR-10 Batch 2:  Loss: 1.1218878030776978
Accuracy: 0.6000000238418579
Epoch 94, CIFAR-10 Batch 3:  Loss: 0.9024869799613953
Accuracy: 0.6001999974250793
Epoch 94, CIFAR-10 Batch 4:  Loss: 1.0395492315292358
Accuracy: 0.6050000190734863
Epoch 94, CIFAR-10 Batch 5:  Loss: 1.0720915794372559
Accuracy: 0.6000000238418579
Epoch 95, CIFAR-10 Batch 1:  Loss: 1.2429569959640503
Accuracy: 0.6019999980926514
Epoch 95, CIFAR-10 Batch 2:  Loss: 1.1018000841140747
Accuracy: 0.598800003528595
Epoch 95, CIFAR-10 Batch 3:  Loss: 0.9012463688850403
Accuracy: 0.6033999919891357
Epoch 95, CIFAR-10 Batch 4:  Loss: 1.0436391830444336
Accuracy: 0.6039999723434448
Epoch 95, CIFAR-10 Batch 5:  Loss: 1.0846362113952637
Accuracy: 0.597599983215332
Epoch 96, CIFAR-10 Batch 1:  Loss: 1.197636365890503
Accuracy: 0.604200005531311
Epoch 96, CIFAR-10 Batch 2:  Loss: 1.1619292497634888
Accuracy: 0.6003999710083008
Epoch 96, CIFAR-10 Batch 3:  Loss: 0.9180946350097656
Accuracy: 0.6046000123023987
Epoch 96, CIFAR-10 Batch 4:  Loss: 1.0322587490081787
Accuracy: 0.604200005531311
Epoch 96, CIFAR-10 Batch 5:  Loss: 1.079482078552246
Accuracy: 0.6015999913215637
Epoch 97, CIFAR-10 Batch 1:  Loss: 1.2143518924713135
Accuracy: 0.6029999852180481
Epoch 97, CIFAR-10 Batch 2:  Loss: 1.1260817050933838
Accuracy: 0.5996000170707703
Epoch 97, CIFAR-10 Batch 3:  Loss: 0.9045670628547668
Accuracy: 0.6019999980926514
Epoch 97, CIFAR-10 Batch 4:  Loss: 1.039480447769165
Accuracy: 0.6037999987602234
Epoch 97, CIFAR-10 Batch 5:  Loss: 1.0369899272918701
Accuracy: 0.6039999723434448
Epoch 98, CIFAR-10 Batch 1:  Loss: 1.2321369647979736
Accuracy: 0.6015999913215637
Epoch 98, CIFAR-10 Batch 2:  Loss: 1.166778564453125
Accuracy: 0.6032000184059143
Epoch 98, CIFAR-10 Batch 3:  Loss: 0.8970649838447571
Accuracy: 0.6068000197410583
Epoch 98, CIFAR-10 Batch 4:  Loss: 1.0292885303497314
Accuracy: 0.6021999716758728
Epoch 98, CIFAR-10 Batch 5:  Loss: 1.051815390586853
Accuracy: 0.6051999926567078
Epoch 99, CIFAR-10 Batch 1:  Loss: 1.257728099822998
Accuracy: 0.6028000116348267
Epoch 99, CIFAR-10 Batch 2:  Loss: 1.1152814626693726
Accuracy: 0.6039999723434448
Epoch 99, CIFAR-10 Batch 3:  Loss: 0.8835471868515015
Accuracy: 0.6065999865531921
Epoch 99, CIFAR-10 Batch 4:  Loss: 1.0152164697647095
Accuracy: 0.6074000000953674
Epoch 99, CIFAR-10 Batch 5:  Loss: 1.0601476430892944
Accuracy: 0.6011999845504761
Epoch 100, CIFAR-10 Batch 1:  Loss: 1.205919861793518
Accuracy: 0.6051999926567078
Epoch 100, CIFAR-10 Batch 2:  Loss: 1.1604387760162354
Accuracy: 0.607200026512146
Epoch 100, CIFAR-10 Batch 3:  Loss: 0.8741569519042969
Accuracy: 0.6074000000953674
Epoch 100, CIFAR-10 Batch 4:  Loss: 1.0290213823318481
Accuracy: 0.603600025177002
Epoch 100, CIFAR-10 Batch 5:  Loss: 1.0671662092208862
Accuracy: 0.6083999872207642

Checkpoint

The model has been saved to disk.
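
For reference, a TensorFlow 1.x checkpoint of this kind is written with tf.train.Saver. The sketch below is only an illustration of the mechanics: it builds a throwaway graph and writes to a hypothetical path './example_checkpoint', whereas the notebook saves its real training session to save_model_path ('./image_classification').

import tensorflow as tf

# Illustrative sketch only: toy graph and hypothetical path './example_checkpoint'.
# The notebook's training cell saves its real session to './image_classification'.
graph = tf.Graph()
with graph.as_default():
    weights = tf.Variable(tf.zeros([4]), name='weights')
    saver = tf.train.Saver()

with tf.Session(graph=graph) as sess:
    sess.run(tf.global_variables_initializer())
    saver.save(sess, './example_checkpoint')  # writes the .meta, .index, and .data-* files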

Test Model

Test your model against the test dataset. This will be your final accuracy. You should achieve an accuracy greater than 50%; if you don't, keep tweaking the model architecture and parameters.


In [18]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
%matplotlib inline
%config InlineBackend.figure_format = 'retina'

import tensorflow as tf
import pickle
import helper
import random

# Set batch size if not already set
try:
    if batch_size:
        pass
except NameError:
    batch_size = 64

save_model_path = './image_classification'
n_samples = 4
top_n_predictions = 3

def test_model():
    """
    Test the saved model against the test dataset
    """

    test_features, test_labels = pickle.load(open('preprocess_test.p', mode='rb'))
    loaded_graph = tf.Graph()

    with tf.Session(graph=loaded_graph) as sess:
        # Load model
        loader = tf.train.import_meta_graph(save_model_path + '.meta')
        loader.restore(sess, save_model_path)

        # Get Tensors from loaded model
        loaded_x = loaded_graph.get_tensor_by_name('x:0')
        loaded_y = loaded_graph.get_tensor_by_name('y:0')
        loaded_keep_prob = loaded_graph.get_tensor_by_name('keep_prob:0')
        loaded_logits = loaded_graph.get_tensor_by_name('logits:0')
        loaded_acc = loaded_graph.get_tensor_by_name('accuracy:0')
        
        # Get accuracy in batches for memory limitations
        test_batch_acc_total = 0
        test_batch_count = 0
        
        for test_feature_batch, test_label_batch in helper.batch_features_labels(test_features, test_labels, batch_size):
            test_batch_acc_total += sess.run(
                loaded_acc,
                feed_dict={loaded_x: test_feature_batch, loaded_y: test_label_batch, loaded_keep_prob: 1.0})
            test_batch_count += 1

        print('Testing Accuracy: {}\n'.format(test_batch_acc_total/test_batch_count))

        # Print Random Samples
        random_test_features, random_test_labels = tuple(zip(*random.sample(list(zip(test_features, test_labels)), n_samples)))
        random_test_predictions = sess.run(
            tf.nn.top_k(tf.nn.softmax(loaded_logits), top_n_predictions),
            feed_dict={loaded_x: random_test_features, loaded_y: random_test_labels, loaded_keep_prob: 1.0})
        helper.display_image_predictions(random_test_features, random_test_labels, random_test_predictions)


test_model()


Testing Accuracy: 0.6063893312101911
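
The testing accuracy reported above is the mean of per-batch accuracies, computed in batches so the full test set never has to be fed to the network at once. Purely as an illustration (the actual helper.py implementation is not shown here and may differ), a batching generator like helper.batch_features_labels can be sketched as:

def batch_features_labels(features, labels, batch_size):
    """Yield (features, labels) slices of at most batch_size samples.
    Illustrative sketch; the project's helper.py may be implemented differently."""
    for start in range(0, len(features), batch_size):
        end = min(start + batch_size, len(features))
        yield features[start:end], labels[start:end]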

Why 50-80% Accuracy?

You might be wondering why you can't push the accuracy any higher. First things first, 50% isn't bad for a simple CNN: pure guessing across the ten classes would only get you 10%. However, you might notice people are getting scores well above 80%. That's because we haven't taught you everything there is to know about neural networks yet; we still need to cover a few more techniques.
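
As one example of those techniques, data augmentation often lifts CIFAR-10 accuracy noticeably by training on transformed copies of the images. The NumPy sketch below is not part of this project's code; it assumes preprocessed features shaped (N, 32, 32, 3) and simply doubles a batch with horizontal flips.

import numpy as np

def augment_with_flips(features, labels):
    # Return the original batch plus a left-right flipped copy of every image.
    # Illustrative sketch of one accuracy-boosting technique; not used in this project.
    flipped = features[:, :, ::-1, :]  # reverse the width axis of (N, 32, 32, 3) images
    return np.concatenate([features, flipped]), np.concatenate([labels, labels])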

Submitting This Project

When submitting this project, make sure to run all the cells before saving the notebook. Save the notebook file as "dlnd_image_classification.ipynb" and also export it as an HTML file under "File" -> "Download as". Include the "helper.py" and "problem_unittests.py" files in your submission.