Image Classification

In this project, you'll classify images from the CIFAR-10 dataset. The dataset consists of airplanes, dogs, cats, and other objects. You'll preprocess the images, then train a convolutional neural network on all the samples. The images need to be normalized and the labels need to be one-hot encoded. You'll apply what you've learned to build convolutional, max pooling, dropout, and fully connected layers. At the end, you'll see your neural network's predictions on sample images.

Get the Data

Run the following cell to download the CIFAR-10 dataset for Python.


In [1]:
"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
from urllib.request import urlretrieve
from os.path import isfile, isdir
from tqdm import tqdm
import problem_unittests as tests
import tarfile

cifar10_dataset_folder_path = 'cifar-10-batches-py'

# Use Floyd's cifar-10 dataset if present
floyd_cifar10_location = '/input/cifar-10/python.tar.gz'
if isfile(floyd_cifar10_location):
    tar_gz_path = floyd_cifar10_location
else:
    tar_gz_path = 'cifar-10-python.tar.gz'

class DLProgress(tqdm):
    last_block = 0

    def hook(self, block_num=1, block_size=1, total_size=None):
        self.total = total_size
        self.update((block_num - self.last_block) * block_size)
        self.last_block = block_num

if not isfile(tar_gz_path):
    with DLProgress(unit='B', unit_scale=True, miniters=1, desc='CIFAR-10 Dataset') as pbar:
        urlretrieve(
            'https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz',
            tar_gz_path,
            pbar.hook)

if not isdir(cifar10_dataset_folder_path):
    with tarfile.open(tar_gz_path) as tar:
        tar.extractall()


tests.test_folder_path(cifar10_dataset_folder_path)


CIFAR-10 Dataset: 171MB [01:21, 2.09MB/s]                              
All files found!

Explore the Data

The dataset is broken into batches to prevent your machine from running out of memory. The CIFAR-10 dataset consists of 5 batches, named data_batch_1, data_batch_2, etc. Each batch contains labels and images from the following classes:

  • airplane
  • automobile
  • bird
  • cat
  • deer
  • dog
  • frog
  • horse
  • ship
  • truck

Understanding a dataset is part of making predictions on the data. Play around with the code cell below by changing the batch_id and sample_id. The batch_id is the id for a batch (1-5). The sample_id is the id for an image and label pair in the batch.

Ask yourself "What are all possible labels?", "What is the range of values for the image data?", "Are the labels in order or random?". Answers to questions like these will help you preprocess the data and end up with better predictions.


In [2]:
%matplotlib inline
%config InlineBackend.figure_format = 'retina'

import helper
import numpy as np

# Explore the dataset
batch_id = 1
sample_id = 5
helper.display_stats(cifar10_dataset_folder_path, batch_id, sample_id)


Stats of batch 1:
Samples: 10000
Label Counts: {0: 1005, 1: 974, 2: 1032, 3: 1016, 4: 999, 5: 937, 6: 1030, 7: 1001, 8: 1025, 9: 981}
First 20 Labels: [6, 9, 9, 4, 1, 1, 2, 7, 8, 3, 4, 7, 7, 2, 9, 9, 9, 3, 2, 6]

Example of Image 5:
Image - Min Value: 0 Max Value: 252
Image - Shape: (32, 32, 3)
Label - Label Id: 1 Name: automobile
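
If you'd rather answer those questions directly instead of going through helper, here's a minimal sketch that loads a batch yourself (assuming the standard CIFAR-10 pickle format: a dict with a 10000x3072 uint8 'data' array and a 'labels' list):

import pickle

# Load one CIFAR-10 batch directly; Python 3 needs an explicit encoding.
with open(cifar10_dataset_folder_path + '/data_batch_1', mode='rb') as file:
    batch = pickle.load(file, encoding='latin1')

# Rows are channel-first (3x32x32); rearrange to (32, 32, 3) images.
features = batch['data'].reshape((len(batch['data']), 3, 32, 32)).transpose(0, 2, 3, 1)
labels = batch['labels']

print(np.unique(labels))               # all possible labels: 0-9
print(features.min(), features.max())  # pixel value range: 0-255
print(labels[:20])                     # labels come in random order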

Implement Preprocess Functions

Normalize

In the cell below, implement the normalize function to take in image data, x, and return it as a normalized Numpy array. The values should be in the range of 0 to 1, inclusive. The return object should be the same shape as x.


In [3]:
def normalize(x):
    """
    Normalize a list of sample image data to the range of 0 to 1
    : x: List of image data.  The image shape is (32, 32, 3)
    : return: Numpy array of normalized data
    """
    # TODO: Implement Function
    # Dividing by the per-image peak-to-peak range (x.ptp) would rescale each
    # image differently, so divide by the fixed maximum of 255 instead.
    return x / 255

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_normalize(normalize)


Tests Passed

One-hot encode

Just like the previous code cell, you'll be implementing a function for preprocessing. This time, you'll implement the one_hot_encode function. The input, x, is a list of labels. Implement the function to return the list of labels as a one-hot encoded Numpy array. The possible values for labels are 0 to 9. The one-hot encoding function should return the same encoding for each value on every call to one_hot_encode. Make sure to save the map of encodings outside the function.

Hint: Don't reinvent the wheel.


In [4]:
def one_hot_encode(x):
    """
    One hot encode a list of sample labels. Return a one-hot encoded vector for each label.
    : x: List of sample Labels
    : return: Numpy array of one-hot encoded labels
    """
    # TODO: Implement Function
    # Row i of the 10x10 identity matrix is the one-hot vector for label i.
    return np.eye(10)[x]


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_one_hot_encode(one_hot_encode)


Tests Passed
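
The "don't reinvent the wheel" hint points at existing encoders. An equivalent implementation using scikit-learn (a sketch, assuming sklearn is installed) fits a binarizer once outside the function so every call uses the same label-to-vector map:

from sklearn import preprocessing

# Fit once, outside the function, so the encoding map stays fixed across calls.
label_binarizer = preprocessing.LabelBinarizer()
label_binarizer.fit(range(10))

def one_hot_encode(x):
    return label_binarizer.transform(x)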

Randomize Data

As you saw from exploring the data above, the order of the samples is randomized. It doesn't hurt to randomize it again, but you don't need to for this dataset.
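
If you did want to reshuffle, a minimal sketch (assuming features and labels here are Numpy arrays) that keeps each image/label pair aligned by using one shared permutation:

shuffle_idx = np.random.permutation(len(features))  # one permutation for both arrays
features, labels = features[shuffle_idx], labels[shuffle_idx]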

Preprocess all the data and save it

Running the code cell below will preprocess all the CIFAR-10 data and save it to file. The code below also uses 10% of the training data for validation.


In [5]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
# Preprocess Training, Validation, and Testing Data
helper.preprocess_and_save_data(cifar10_dataset_folder_path, normalize, one_hot_encode)

Check Point

This is your first checkpoint. If you ever decide to come back to this notebook or have to restart the notebook, you can start from here. The preprocessed data has been saved to disk.


In [6]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import pickle
import problem_unittests as tests
import helper

# Load the Preprocessed Validation data
valid_features, valid_labels = pickle.load(open('preprocess_validation.p', mode='rb'))

Build the network

For the neural network, you'll build each layer into a function. Most of the code you've seen has been outside of functions. To test your code more thoroughly, we require that you put each layer in a function. This allows us to give you better feedback and test for simple mistakes using our unittests before you submit your project.

Note: If you're finding it hard to dedicate enough time for this course each week, we've provided a small shortcut to this part of the project. In the next couple of problems, you'll have the option to use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages to build each layer, except the layers you build in the "Convolutional and Max Pooling Layer" section. TF Layers is similar to Keras's and TFLearn's abstraction to layers, so it's easy to pick up.

However, if you would like to get the most out of this course, try to solve all the problems without using anything from the TF Layers packages. You can still use classes from other packages that happen to have the same name as ones you find in TF Layers! For example, instead of using the TF Layers version of the conv2d class, tf.layers.conv2d, you would want to use the TF Neural Network version of conv2d, tf.nn.conv2d.

Let's begin!

Input

The neural network needs to read the image data, one-hot encoded labels, and dropout keep probability. Implement the following functions:

  • Implement neural_net_image_input
    • Return a TF Placeholder
    • Set the shape using image_shape with batch size set to None.
    • Name the TensorFlow placeholder "x" using the TensorFlow name parameter in the TF Placeholder.
  • Implement neural_net_label_input
    • Return a TF Placeholder
    • Set the shape using n_classes with batch size set to None.
    • Name the TensorFlow placeholder "y" using the TensorFlow name parameter in the TF Placeholder.
  • Implement neural_net_keep_prob_input
    • Return a TF Placeholder for dropout keep probability.
    • Name the TensorFlow placeholder "keep_prob" using the TensorFlow name parameter in the TF Placeholder.

These names will be used at the end of the project to load your saved model.

Note: None for shapes in TensorFlow allows for a dynamic size.


In [7]:
import tensorflow as tf

def neural_net_image_input(image_shape):
    """
    Return a Tensor for a batch of image input
    : image_shape: Shape of the images
    : return: Tensor for image input.
    """
    # TODO: Implement Function
    return tf.placeholder(tf.float32, [None, image_shape[0], image_shape[1], image_shape[2]], name="x")


def neural_net_label_input(n_classes):
    """
    Return a Tensor for a batch of label input
    : n_classes: Number of classes
    : return: Tensor for label input.
    """
    # TODO: Implement Function
    return tf.placeholder(tf.float32, [None, n_classes], name="y")


def neural_net_keep_prob_input():
    """
    Return a Tensor for keep probability
    : return: Tensor for keep probability.
    """
    # TODO: Implement Function
    return tf.placeholder(tf.float32, name="keep_prob")


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tf.reset_default_graph()
tests.test_nn_image_inputs(neural_net_image_input)
tests.test_nn_label_inputs(neural_net_label_input)
tests.test_nn_keep_prob_inputs(neural_net_keep_prob_input)


Image Input Tests Passed.
Label Input Tests Passed.
Keep Prob Tests Passed.
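
Those "x", "y", and "keep_prob" names matter because the end of the project restores the trained graph from disk and looks these tensors up by name. A sketch of that later step (save_model_path is defined further down in the training section):

loaded_graph = tf.Graph()
with tf.Session(graph=loaded_graph) as sess:
    # Restore the saved graph, then fetch the placeholders by name.
    loader = tf.train.import_meta_graph(save_model_path + '.meta')
    loader.restore(sess, save_model_path)
    loaded_x = loaded_graph.get_tensor_by_name('x:0')
    loaded_y = loaded_graph.get_tensor_by_name('y:0')
    loaded_keep_prob = loaded_graph.get_tensor_by_name('keep_prob:0')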

Convolution and Max Pooling Layer

Convolution layers have a lot of success with images. For this code cell, you should implement the function conv2d_maxpool to apply convolution then max pooling:

  • Create the weight and bias using conv_ksize, conv_num_outputs and the shape of x_tensor.
  • Apply a convolution to x_tensor using weight and conv_strides.
    • We recommend you use same padding, but you're welcome to use any padding.
  • Add bias
  • Add a nonlinear activation to the convolution.
  • Apply Max Pooling using pool_ksize and pool_strides.
    • We recommend you use same padding, but you're welcome to use any padding.

Note: You can't use TensorFlow Layers or TensorFlow Layers (contrib) for this layer, but you can still use TensorFlow's Neural Network package. You may still use the shortcut option for all the other layers.


In [8]:
def conv2d_maxpool(x_tensor, conv_num_outputs, conv_ksize, conv_strides, pool_ksize, pool_strides):
    """
    Apply convolution then max pooling to x_tensor
    :param x_tensor: TensorFlow Tensor
    :param conv_num_outputs: Number of outputs for the convolutional layer
    :param conv_ksize: kernel size 2-D Tuple for the convolutional layer
    :param conv_strides: Stride 2-D Tuple for convolution
    :param pool_ksize: kernel size 2-D Tuple for pool
    :param pool_strides: Stride 2-D Tuple for pool
    : return: A tensor that represents convolution and max pooling of x_tensor
    """
    # TODO: Implement Function
    weight = tf.Variable(tf.truncated_normal([conv_ksize[0], conv_ksize[1], int(x_tensor.shape[3]), conv_num_outputs], mean=0.0, stddev=0.05))
    bias = tf.Variable(tf.zeros(conv_num_outputs))
    conv = tf.nn.conv2d(x_tensor, weight, strides=[1, conv_strides[0], conv_strides[1], 1], padding='SAME')
    conv = tf.nn.bias_add(conv, bias)
    conv = tf.nn.relu(conv)
    return tf.nn.max_pool(conv, ksize=[1, pool_ksize[0], pool_ksize[1], 1], strides=[1, pool_strides[0], pool_strides[1], 1], padding='SAME')
    


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_con_pool(conv2d_maxpool)


Tests Passed
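
A quick shape sanity check: with 'SAME' padding, each spatial dimension comes out as ceil(input / stride), so only the strides shrink the feature map. For example (a sketch run against the function above):

sanity_x = tf.placeholder(tf.float32, [None, 32, 32, 3])
sanity_out = conv2d_maxpool(sanity_x, 16, (3, 3), (1, 1), (2, 2), (2, 2))
print(sanity_out.get_shape())  # (?, 16, 16, 16): conv keeps 32x32, pool stride 2 halves it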

Flatten Layer

Implement the flatten function to change the dimension of x_tensor from a 4-D tensor to a 2-D tensor. The output should be the shape (Batch Size, Flattened Image Size). Shortcut option: you can use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages for this layer. For more of a challenge, only use other TensorFlow packages.


In [9]:
def flatten(x_tensor):
    """
    Flatten x_tensor to (Batch Size, Flattened Image Size)
    : x_tensor: A tensor of size (Batch Size, ...), where ... are the image dimensions.
    : return: A tensor of size (Batch Size, Flattened Image Size).
    """
    # TODO: Implement Function
    return tf.contrib.layers.flatten(x_tensor)


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_flatten(flatten)


Tests Passed
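
For the "more of a challenge" route, an equivalent flatten without tf.contrib (a sketch) just reshapes to the product of the non-batch dimensions:

def flatten(x_tensor):
    # Multiply out height x width x depth to get the flattened size.
    flat_size = int(np.prod(x_tensor.get_shape().as_list()[1:]))
    return tf.reshape(x_tensor, [-1, flat_size])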

Fully-Connected Layer

Implement the fully_conn function to apply a fully connected layer to x_tensor with the shape (Batch Size, num_outputs). Shortcut option: you can use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages for this layer. For more of a challenge, only use other TensorFlow packages.


In [10]:
def fully_conn(x_tensor, num_outputs):
    """
    Apply a fully connected layer to x_tensor using weight and bias
    : x_tensor: A 2-D tensor where the first dimension is batch size.
    : num_outputs: The number of outputs that the new tensor should have.
    : return: A 2-D tensor where the second dimension is num_outputs.
    """
    # TODO: Implement Function
    return tf.contrib.layers.fully_connected(x_tensor, num_outputs, activation_fn=tf.nn.relu)


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_fully_conn(fully_conn)


Tests Passed
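
And the non-shortcut version of the same layer (a sketch): explicit weight and bias variables, a matmul, and a ReLU, which is roughly what the contrib helper does for you:

def fully_conn(x_tensor, num_outputs):
    n_inputs = int(x_tensor.get_shape().as_list()[1])
    weights = tf.Variable(tf.truncated_normal([n_inputs, num_outputs], mean=0.0, stddev=0.05))
    bias = tf.Variable(tf.zeros([num_outputs]))
    return tf.nn.relu(tf.add(tf.matmul(x_tensor, weights), bias))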

Output Layer

Implement the output function to apply a fully connected layer to x_tensor with the shape (Batch Size, num_outputs). Shortcut option: you can use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages for this layer. For more of a challenge, only use other TensorFlow packages.

Note: Activation, softmax, or cross entropy should not be applied to this.


In [11]:
def output(x_tensor, num_outputs):
    """
    Apply a output layer to x_tensor using weight and bias
    : x_tensor: A 2-D tensor where the first dimension is batch size.
    : num_outputs: The number of outputs that the new tensor should have.
    : return: A 2-D tensor where the second dimension is num_outputs.
    """
    # TODO: Implement Function
    flat_img_size = int(x_tensor.get_shape().as_list()[1])
    weights = tf.Variable(tf.truncated_normal([flat_img_size, num_outputs], mean=0.0, stddev=0.05))
    bias = tf.Variable(tf.zeros([num_outputs]))
    return tf.add(tf.matmul(x_tensor, weights), bias)


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_output(output)


Tests Passed

Create Convolutional Model

Implement the function conv_net to create a convolutional neural network model. The function takes in a batch of images, x, and outputs logits. Use the layers you created above to create this model:

  • Apply 1, 2, or 3 Convolution and Max Pool layers
  • Apply a Flatten Layer
  • Apply 1, 2, or 3 Fully Connected Layers
  • Apply an Output Layer
  • Return the output
  • Apply TensorFlow's Dropout to one or more layers in the model using keep_prob.

In [12]:
def conv_net(x, keep_prob):
    """
    Create a convolutional neural network model
    : x: Placeholder tensor that holds image data.
    : keep_prob: Placeholder tensor that hold dropout keep probability.
    : return: Tensor that represents logits
    """
    # TODO: Apply 1, 2, or 3 Convolution and Max Pool layers
    #    Play around with different number of outputs, kernel size and stride
    # Function Definition from Above:
    #    conv2d_maxpool(x_tensor, conv_num_outputs, conv_ksize, conv_strides, pool_ksize, pool_strides)
    n_conv_outputs = [32,16,8,4,2]
    conv_ksize = (3,3)
    conv_strides = (1,1)
    pool_ksize = (3,3)
    pool_strides = (1,1)
    conv1 = conv2d_maxpool(x, n_conv_outputs[1], conv_ksize, conv_strides, pool_ksize, pool_strides)
    
    conv2_ksize = (2,2)
    conv2_strides = (1,1)
    pool2_ksize = (2,2)
    pool2_strides = (1,1)
    conv2 = conv2d_maxpool(conv1, n_conv_outputs[0], conv2_ksize, conv2_strides, pool2_ksize, pool2_strides)

    conv2 = conv2d_maxpool(conv2, n_conv_outputs[0], conv2_ksize, conv2_strides, pool2_ksize, pool2_strides)

    # TODO: Apply a Flatten Layer
    # Function Definition from Above:
    #   flatten(x_tensor)
    flat_layer = flatten(conv2)

    # TODO: Apply 1, 2, or 3 Fully Connected Layers
    #    Play around with different number of outputs
    # Function Definition from Above:
    #   fully_conn(x_tensor, num_outputs)
    fc1 = fully_conn(flat_layer, n_conv_outputs[0])
    #fc1 = tf.nn.dropout(fc1, keep_prob)
    fc1 = fully_conn(fc1, n_conv_outputs[1])
    fc1 = fully_conn(fc1, n_conv_outputs[2])
    # Only perform dropout at the end, after the fully connected layers
    fc1 = tf.nn.dropout(fc1, keep_prob)
    
    # TODO: Apply an Output Layer
    #    Set this to the number of classes
    # Function Definition from Above:
    #   output(x_tensor, num_outputs)
    
    out = output(fc1, 10)
    # TODO: return output
    return out


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""

##############################
## Build the Neural Network ##
##############################

# Remove previous weights, bias, inputs, etc..
tf.reset_default_graph()

# Inputs
x = neural_net_image_input((32, 32, 3))
y = neural_net_label_input(10)
keep_prob = neural_net_keep_prob_input()

# Model
logits = conv_net(x, keep_prob)

# Name logits Tensor, so that it can be loaded from disk after training
logits = tf.identity(logits, name='logits')

# Loss and Optimizer
cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=y))
optimizer = tf.train.AdamOptimizer().minimize(cost)

# Accuracy
correct_pred = tf.equal(tf.argmax(logits, 1), tf.argmax(y, 1))
accuracy = tf.reduce_mean(tf.cast(correct_pred, tf.float32), name='accuracy')

tests.test_conv_net(conv_net)


Neural Network Built!
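
One thing worth noticing about the architecture above: both pooling layers use stride (1, 1) with 'SAME' padding, so max pooling never actually reduces the spatial dimensions. Tracing the shapes (batch dimension omitted): the 32x32x3 input becomes 32x32x16 after the first conv/pool block, 32x32x32 after the next two, flattens to 32 * 32 * 32 = 32768 values, passes through fully connected layers of 32, 16, and 8 units, and ends in 10 logits.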

Train the Neural Network

Single Optimization

Implement the function train_neural_network to do a single optimization. The optimization should use optimizer to optimize in session with a feed_dict containing the following:

  • x for image input
  • y for labels
  • keep_prob for keep probability for dropout

This function will be called for each batch, so tf.global_variables_initializer() has already been called.

Note: Nothing needs to be returned. This function is only optimizing the neural network.


In [13]:
def train_neural_network(session, optimizer, keep_probability, feature_batch, label_batch):
    """
    Optimize the session on a batch of images and labels
    : session: Current TensorFlow session
    : optimizer: TensorFlow optimizer function
    : keep_probability: keep probability
    : feature_batch: Batch of Numpy image data
    : label_batch: Batch of Numpy label data
    """
    # TODO: Implement Function
    session.run(optimizer, feed_dict={x: feature_batch, y: label_batch, keep_prob:keep_probability})


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_train_nn(train_neural_network)


Tests Passed

Show Stats

Implement the function print_stats to print loss and validation accuracy. Use the global variables valid_features and valid_labels to calculate validation accuracy. Use a keep probability of 1.0 to calculate the loss and validation accuracy.


In [14]:
def print_stats(session, feature_batch, label_batch, cost, accuracy):
    """
    Print information about loss and validation accuracy
    : session: Current TensorFlow session
    : feature_batch: Batch of Numpy image data
    : label_batch: Batch of Numpy label data
    : cost: TensorFlow cost function
    : accuracy: TensorFlow accuracy function
    """
    # TODO: Implement Function
    # Report loss on the current training batch and accuracy on the validation
    # set, both with dropout disabled (keep_prob of 1.0).
    loss = session.run(cost, feed_dict={x: feature_batch, y: label_batch, keep_prob: 1.})
    print('Loss: {}'.format(loss))
    valid_accuracy = session.run(accuracy, feed_dict={x: valid_features, y: valid_labels, keep_prob: 1.})
    print('Accuracy: {}'.format(valid_accuracy))

Hyperparameters

Tune the following parameters:

  • Set epochs to the number of iterations until the network stops learning or starts overfitting
  • Set batch_size to the highest number that your machine has memory for. Most people set it to a common power of two:
    • 64
    • 128
    • 256
    • ...
  • Set keep_probability to the probability of keeping a node using dropout

In [15]:
# TODO: Tune Parameters
epochs = 100
batch_size = 512
keep_probability = 0.80

Train on a Single CIFAR-10 Batch

Instead of training the neural network on all the CIFAR-10 batches of data, let's use a single batch. This should save time while you iterate on the model to get a better accuracy. Once the final validation accuracy is 50% or greater, run the model on all the data in the next section.


In [16]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
print('Checking the Training on a Single Batch...')
with tf.Session() as sess:
    # Initializing the variables
    sess.run(tf.global_variables_initializer())
    
    # Training cycle
    for epoch in range(epochs):
        batch_i = 1
        for batch_features, batch_labels in helper.load_preprocess_training_batch(batch_i, batch_size):
            train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)
        print('Epoch {:>2}, CIFAR-10 Batch {}:  '.format(epoch + 1, batch_i), end='')
        print_stats(sess, batch_features, batch_labels, cost, accuracy)


Checking the Training on a Single Batch...
Epoch  1, CIFAR-10 Batch 1:  Loss: 2.2573797702789307
Accuracy: 0.11059999465942383
Epoch  2, CIFAR-10 Batch 1:  Loss: 2.175696849822998
Accuracy: 0.1915999948978424
Epoch  3, CIFAR-10 Batch 1:  Loss: 2.112135410308838
Accuracy: 0.2525999844074249
Epoch  4, CIFAR-10 Batch 1:  Loss: 2.0275766849517822
Accuracy: 0.30319997668266296
Epoch  5, CIFAR-10 Batch 1:  Loss: 1.9276554584503174
Accuracy: 0.31859996914863586
Epoch  6, CIFAR-10 Batch 1:  Loss: 1.893453598022461
Accuracy: 0.2911999821662903
Epoch  7, CIFAR-10 Batch 1:  Loss: 1.828675389289856
Accuracy: 0.340999960899353
Epoch  8, CIFAR-10 Batch 1:  Loss: 1.7998957633972168
Accuracy: 0.36139994859695435
Epoch  9, CIFAR-10 Batch 1:  Loss: 1.767749547958374
Accuracy: 0.3413999676704407
Epoch 10, CIFAR-10 Batch 1:  Loss: 1.742598056793213
Accuracy: 0.3803999722003937
Epoch 11, CIFAR-10 Batch 1:  Loss: 1.7035611867904663
Accuracy: 0.3840000033378601
Epoch 12, CIFAR-10 Batch 1:  Loss: 1.6982359886169434
Accuracy: 0.3837999701499939
Epoch 13, CIFAR-10 Batch 1:  Loss: 1.6703108549118042
Accuracy: 0.37999996542930603
Epoch 14, CIFAR-10 Batch 1:  Loss: 1.6389694213867188
Accuracy: 0.3959999680519104
Epoch 15, CIFAR-10 Batch 1:  Loss: 1.6402238607406616
Accuracy: 0.3933999836444855
Epoch 16, CIFAR-10 Batch 1:  Loss: 1.614281177520752
Accuracy: 0.40299999713897705
Epoch 17, CIFAR-10 Batch 1:  Loss: 1.5842562913894653
Accuracy: 0.40860000252723694
Epoch 18, CIFAR-10 Batch 1:  Loss: 1.5660920143127441
Accuracy: 0.4099999964237213
Epoch 19, CIFAR-10 Batch 1:  Loss: 1.5538076162338257
Accuracy: 0.4065999686717987
Epoch 20, CIFAR-10 Batch 1:  Loss: 1.5570260286331177
Accuracy: 0.4139999449253082
Epoch 21, CIFAR-10 Batch 1:  Loss: 1.5478293895721436
Accuracy: 0.41659995913505554
Epoch 22, CIFAR-10 Batch 1:  Loss: 1.5159437656402588
Accuracy: 0.4197999835014343
Epoch 23, CIFAR-10 Batch 1:  Loss: 1.4984056949615479
Accuracy: 0.42179998755455017
Epoch 24, CIFAR-10 Batch 1:  Loss: 1.514433741569519
Accuracy: 0.41839998960494995
Epoch 25, CIFAR-10 Batch 1:  Loss: 1.460356593132019
Accuracy: 0.4359999895095825
Epoch 26, CIFAR-10 Batch 1:  Loss: 1.4568618535995483
Accuracy: 0.4291999936103821
Epoch 27, CIFAR-10 Batch 1:  Loss: 1.4213581085205078
Accuracy: 0.4315999746322632
Epoch 28, CIFAR-10 Batch 1:  Loss: 1.4112993478775024
Accuracy: 0.4407999515533447
Epoch 29, CIFAR-10 Batch 1:  Loss: 1.4016731977462769
Accuracy: 0.4426000118255615
Epoch 30, CIFAR-10 Batch 1:  Loss: 1.391150712966919
Accuracy: 0.45419999957084656
Epoch 31, CIFAR-10 Batch 1:  Loss: 1.377537488937378
Accuracy: 0.4461999535560608
Epoch 32, CIFAR-10 Batch 1:  Loss: 1.3421659469604492
Accuracy: 0.45559996366500854
Epoch 33, CIFAR-10 Batch 1:  Loss: 1.4114216566085815
Accuracy: 0.4365999698638916
Epoch 34, CIFAR-10 Batch 1:  Loss: 1.348879098892212
Accuracy: 0.4541999399662018
Epoch 35, CIFAR-10 Batch 1:  Loss: 1.2930885553359985
Accuracy: 0.4811999499797821
Epoch 36, CIFAR-10 Batch 1:  Loss: 1.2710270881652832
Accuracy: 0.47579991817474365
Epoch 37, CIFAR-10 Batch 1:  Loss: 1.3153105974197388
Accuracy: 0.44999995827674866
Epoch 38, CIFAR-10 Batch 1:  Loss: 1.2796435356140137
Accuracy: 0.4665999710559845
Epoch 39, CIFAR-10 Batch 1:  Loss: 1.242238163948059
Accuracy: 0.48819994926452637
Epoch 40, CIFAR-10 Batch 1:  Loss: 1.198078989982605
Accuracy: 0.48799997568130493
Epoch 41, CIFAR-10 Batch 1:  Loss: 1.1928324699401855
Accuracy: 0.4851999282836914
Epoch 42, CIFAR-10 Batch 1:  Loss: 1.1729059219360352
Accuracy: 0.5019999742507935
Epoch 43, CIFAR-10 Batch 1:  Loss: 1.1627604961395264
Accuracy: 0.5055999159812927
Epoch 44, CIFAR-10 Batch 1:  Loss: 1.1598169803619385
Accuracy: 0.5095999836921692
Epoch 45, CIFAR-10 Batch 1:  Loss: 1.152099609375
Accuracy: 0.5051999688148499
Epoch 46, CIFAR-10 Batch 1:  Loss: 1.1330995559692383
Accuracy: 0.4925999641418457
Epoch 47, CIFAR-10 Batch 1:  Loss: 1.1236870288848877
Accuracy: 0.5017999410629272
Epoch 48, CIFAR-10 Batch 1:  Loss: 1.108673334121704
Accuracy: 0.49039995670318604
Epoch 49, CIFAR-10 Batch 1:  Loss: 1.1059304475784302
Accuracy: 0.49119994044303894
Epoch 50, CIFAR-10 Batch 1:  Loss: 1.062838077545166
Accuracy: 0.5029999017715454
Epoch 51, CIFAR-10 Batch 1:  Loss: 1.0614362955093384
Accuracy: 0.4989999532699585
Epoch 52, CIFAR-10 Batch 1:  Loss: 1.0501898527145386
Accuracy: 0.509399950504303
Epoch 53, CIFAR-10 Batch 1:  Loss: 1.0861728191375732
Accuracy: 0.49219998717308044
Epoch 54, CIFAR-10 Batch 1:  Loss: 1.0648319721221924
Accuracy: 0.5005999207496643
Epoch 55, CIFAR-10 Batch 1:  Loss: 1.0470914840698242
Accuracy: 0.5037999749183655
Epoch 56, CIFAR-10 Batch 1:  Loss: 1.0118399858474731
Accuracy: 0.5077999234199524
Epoch 57, CIFAR-10 Batch 1:  Loss: 1.0142220258712769
Accuracy: 0.5009999871253967
Epoch 58, CIFAR-10 Batch 1:  Loss: 0.9953745007514954
Accuracy: 0.50819993019104
Epoch 59, CIFAR-10 Batch 1:  Loss: 0.9815672039985657
Accuracy: 0.5113999247550964
Epoch 60, CIFAR-10 Batch 1:  Loss: 0.9923112392425537
Accuracy: 0.5091999769210815
Epoch 61, CIFAR-10 Batch 1:  Loss: 0.9572737216949463
Accuracy: 0.5181999206542969
Epoch 62, CIFAR-10 Batch 1:  Loss: 0.9231612682342529
Accuracy: 0.5209999680519104
Epoch 63, CIFAR-10 Batch 1:  Loss: 0.9836158156394958
Accuracy: 0.49679994583129883
Epoch 64, CIFAR-10 Batch 1:  Loss: 0.9179794788360596
Accuracy: 0.5185999274253845
Epoch 65, CIFAR-10 Batch 1:  Loss: 0.9580536484718323
Accuracy: 0.4927999973297119
Epoch 66, CIFAR-10 Batch 1:  Loss: 0.9499638080596924
Accuracy: 0.5017999410629272
Epoch 67, CIFAR-10 Batch 1:  Loss: 0.9076530933380127
Accuracy: 0.5025999546051025
Epoch 68, CIFAR-10 Batch 1:  Loss: 0.8697338104248047
Accuracy: 0.5109999179840088
Epoch 69, CIFAR-10 Batch 1:  Loss: 0.8985743522644043
Accuracy: 0.5
Epoch 70, CIFAR-10 Batch 1:  Loss: 0.8776019811630249
Accuracy: 0.5107999444007874
Epoch 71, CIFAR-10 Batch 1:  Loss: 0.8436453938484192
Accuracy: 0.5107999444007874
Epoch 72, CIFAR-10 Batch 1:  Loss: 0.8438991904258728
Accuracy: 0.5169999599456787
Epoch 73, CIFAR-10 Batch 1:  Loss: 0.9095757603645325
Accuracy: 0.49619999527931213
Epoch 74, CIFAR-10 Batch 1:  Loss: 0.8560107350349426
Accuracy: 0.5149999260902405
Epoch 75, CIFAR-10 Batch 1:  Loss: 0.8930160999298096
Accuracy: 0.504599928855896
Epoch 76, CIFAR-10 Batch 1:  Loss: 0.8628991842269897
Accuracy: 0.5113999843597412
Epoch 77, CIFAR-10 Batch 1:  Loss: 0.8351559638977051
Accuracy: 0.5199998617172241
Epoch 78, CIFAR-10 Batch 1:  Loss: 0.8286650776863098
Accuracy: 0.5039999485015869
Epoch 79, CIFAR-10 Batch 1:  Loss: 0.7916836738586426
Accuracy: 0.5103999376296997
Epoch 80, CIFAR-10 Batch 1:  Loss: 0.8006038069725037
Accuracy: 0.5001999735832214
Epoch 81, CIFAR-10 Batch 1:  Loss: 0.7679431438446045
Accuracy: 0.4999999701976776
Epoch 82, CIFAR-10 Batch 1:  Loss: 0.8150166273117065
Accuracy: 0.48899996280670166
Epoch 83, CIFAR-10 Batch 1:  Loss: 0.820084810256958
Accuracy: 0.49719998240470886
Epoch 84, CIFAR-10 Batch 1:  Loss: 0.8273974657058716
Accuracy: 0.521399974822998
Epoch 85, CIFAR-10 Batch 1:  Loss: 0.8039061427116394
Accuracy: 0.5151999592781067
Epoch 86, CIFAR-10 Batch 1:  Loss: 0.7572504878044128
Accuracy: 0.5175999402999878
Epoch 87, CIFAR-10 Batch 1:  Loss: 0.7461071014404297
Accuracy: 0.5057999491691589
Epoch 88, CIFAR-10 Batch 1:  Loss: 0.7465401887893677
Accuracy: 0.5041999220848083
Epoch 89, CIFAR-10 Batch 1:  Loss: 0.7402679324150085
Accuracy: 0.5163999795913696
Epoch 90, CIFAR-10 Batch 1:  Loss: 0.7893995046615601
Accuracy: 0.5061999559402466
Epoch 91, CIFAR-10 Batch 1:  Loss: 0.7440340518951416
Accuracy: 0.5067999362945557
Epoch 92, CIFAR-10 Batch 1:  Loss: 0.7502336502075195
Accuracy: 0.5073999166488647
Epoch 93, CIFAR-10 Batch 1:  Loss: 0.833055853843689
Accuracy: 0.49759992957115173
Epoch 94, CIFAR-10 Batch 1:  Loss: 0.7761378884315491
Accuracy: 0.50819993019104
Epoch 95, CIFAR-10 Batch 1:  Loss: 0.7699291110038757
Accuracy: 0.5201998949050903
Epoch 96, CIFAR-10 Batch 1:  Loss: 0.821605384349823
Accuracy: 0.5249999761581421
Epoch 97, CIFAR-10 Batch 1:  Loss: 0.756725549697876
Accuracy: 0.5229999423027039
Epoch 98, CIFAR-10 Batch 1:  Loss: 0.7202743291854858
Accuracy: 0.5215999484062195
Epoch 99, CIFAR-10 Batch 1:  Loss: 0.7416094541549683
Accuracy: 0.5151999592781067
Epoch 100, CIFAR-10 Batch 1:  Loss: 0.6955342888832092
Accuracy: 0.520799994468689

Fully Train the Model

Now that you've gotten good accuracy with a single CIFAR-10 batch, try it with all five batches.


In [17]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
save_model_path = './image_classification'

print('Training...')
with tf.Session() as sess:
    # Initializing the variables
    sess.run(tf.global_variables_initializer())
    
    # Training cycle
    for epoch in range(epochs):
        # Loop over all batches
        n_batches = 5
        for batch_i in range(1, n_batches + 1):
            for batch_features, batch_labels in helper.load_preprocess_training_batch(batch_i, batch_size):
                train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)
            print('Epoch {:>2}, CIFAR-10 Batch {}:  '.format(epoch + 1, batch_i), end='')
            print_stats(sess, batch_features, batch_labels, cost, accuracy)
            
    # Save Model
    saver = tf.train.Saver()
    save_path = saver.save(sess, save_model_path)


Training...
Epoch  1, CIFAR-10 Batch 1:  Loss: 2.2633249759674072
Accuracy: 0.15440000593662262
Epoch  1, CIFAR-10 Batch 2:  Loss: 2.181443929672241
Accuracy: 0.16920001804828644
Epoch  1, CIFAR-10 Batch 3:  Loss: 2.1602776050567627
Accuracy: 0.1939999759197235
Epoch  1, CIFAR-10 Batch 4:  Loss: 2.1555514335632324
Accuracy: 0.20559999346733093
Epoch  1, CIFAR-10 Batch 5:  Loss: 2.0931220054626465
Accuracy: 0.21660000085830688
Epoch  2, CIFAR-10 Batch 1:  Loss: 2.0225720405578613
Accuracy: 0.2401999831199646
Epoch  2, CIFAR-10 Batch 2:  Loss: 1.9395936727523804
Accuracy: 0.2627999782562256
Epoch  2, CIFAR-10 Batch 3:  Loss: 1.830711007118225
Accuracy: 0.28060001134872437
Epoch  2, CIFAR-10 Batch 4:  Loss: 1.8168290853500366
Accuracy: 0.27459999918937683
Epoch  2, CIFAR-10 Batch 5:  Loss: 1.8628175258636475
Accuracy: 0.2733999788761139
Epoch  3, CIFAR-10 Batch 1:  Loss: 1.8838478326797485
Accuracy: 0.2971999943256378
Epoch  3, CIFAR-10 Batch 2:  Loss: 1.8226817846298218
Accuracy: 0.3086000084877014
Epoch  3, CIFAR-10 Batch 3:  Loss: 1.6668353080749512
Accuracy: 0.3163999915122986
Epoch  3, CIFAR-10 Batch 4:  Loss: 1.6609470844268799
Accuracy: 0.32440000772476196
Epoch  3, CIFAR-10 Batch 5:  Loss: 1.8004132509231567
Accuracy: 0.32179996371269226
Epoch  4, CIFAR-10 Batch 1:  Loss: 1.8039581775665283
Accuracy: 0.3317999839782715
Epoch  4, CIFAR-10 Batch 2:  Loss: 1.7397719621658325
Accuracy: 0.32499998807907104
Epoch  4, CIFAR-10 Batch 3:  Loss: 1.609684705734253
Accuracy: 0.34379997849464417
Epoch  4, CIFAR-10 Batch 4:  Loss: 1.6082959175109863
Accuracy: 0.3410000205039978
Epoch  4, CIFAR-10 Batch 5:  Loss: 1.7036826610565186
Accuracy: 0.3361999988555908
Epoch  5, CIFAR-10 Batch 1:  Loss: 1.7697241306304932
Accuracy: 0.3351999819278717
Epoch  5, CIFAR-10 Batch 2:  Loss: 1.6846535205841064
Accuracy: 0.34299999475479126
Epoch  5, CIFAR-10 Batch 3:  Loss: 1.6071397066116333
Accuracy: 0.33719998598098755
Epoch  5, CIFAR-10 Batch 4:  Loss: 1.5937600135803223
Accuracy: 0.3513999879360199
Epoch  5, CIFAR-10 Batch 5:  Loss: 1.646228313446045
Accuracy: 0.3375999629497528
Epoch  6, CIFAR-10 Batch 1:  Loss: 1.6751502752304077
Accuracy: 0.3755999803543091
Epoch  6, CIFAR-10 Batch 2:  Loss: 1.6378659009933472
Accuracy: 0.36739999055862427
Epoch  6, CIFAR-10 Batch 3:  Loss: 1.5234496593475342
Accuracy: 0.36959999799728394
Epoch  6, CIFAR-10 Batch 4:  Loss: 1.544702172279358
Accuracy: 0.37699997425079346
Epoch  6, CIFAR-10 Batch 5:  Loss: 1.6141471862792969
Accuracy: 0.3779999911785126
Epoch  7, CIFAR-10 Batch 1:  Loss: 1.6386995315551758
Accuracy: 0.3887999653816223
Epoch  7, CIFAR-10 Batch 2:  Loss: 1.61037015914917
Accuracy: 0.3757999837398529
Epoch  7, CIFAR-10 Batch 3:  Loss: 1.495757818222046
Accuracy: 0.37839996814727783
Epoch  7, CIFAR-10 Batch 4:  Loss: 1.5115876197814941
Accuracy: 0.39239996671676636
Epoch  7, CIFAR-10 Batch 5:  Loss: 1.5920864343643188
Accuracy: 0.3853999972343445
Epoch  8, CIFAR-10 Batch 1:  Loss: 1.6249072551727295
Accuracy: 0.3935999572277069
Epoch  8, CIFAR-10 Batch 2:  Loss: 1.562855839729309
Accuracy: 0.39979997277259827
Epoch  8, CIFAR-10 Batch 3:  Loss: 1.454609990119934
Accuracy: 0.4057999849319458
Epoch  8, CIFAR-10 Batch 4:  Loss: 1.4832979440689087
Accuracy: 0.4039999842643738
Epoch  8, CIFAR-10 Batch 5:  Loss: 1.5581328868865967
Accuracy: 0.39219993352890015
Epoch  9, CIFAR-10 Batch 1:  Loss: 1.6160438060760498
Accuracy: 0.3999999463558197
Epoch  9, CIFAR-10 Batch 2:  Loss: 1.538158655166626
Accuracy: 0.41259995102882385
Epoch  9, CIFAR-10 Batch 3:  Loss: 1.4378854036331177
Accuracy: 0.4057999849319458
Epoch  9, CIFAR-10 Batch 4:  Loss: 1.4685537815093994
Accuracy: 0.4293999671936035
Epoch  9, CIFAR-10 Batch 5:  Loss: 1.5323907136917114
Accuracy: 0.4047999680042267
Epoch 10, CIFAR-10 Batch 1:  Loss: 1.580732822418213
Accuracy: 0.415800005197525
Epoch 10, CIFAR-10 Batch 2:  Loss: 1.53793466091156
Accuracy: 0.423399955034256
Epoch 10, CIFAR-10 Batch 3:  Loss: 1.399452567100525
Accuracy: 0.4325999617576599
Epoch 10, CIFAR-10 Batch 4:  Loss: 1.4414596557617188
Accuracy: 0.4365999698638916
Epoch 10, CIFAR-10 Batch 5:  Loss: 1.5101971626281738
Accuracy: 0.41679999232292175
Epoch 11, CIFAR-10 Batch 1:  Loss: 1.5777572393417358
Accuracy: 0.4311999976634979
Epoch 11, CIFAR-10 Batch 2:  Loss: 1.5068860054016113
Accuracy: 0.44300001859664917
Epoch 11, CIFAR-10 Batch 3:  Loss: 1.3828212022781372
Accuracy: 0.43219995498657227
Epoch 11, CIFAR-10 Batch 4:  Loss: 1.420131802558899
Accuracy: 0.44679996371269226
Epoch 11, CIFAR-10 Batch 5:  Loss: 1.4619646072387695
Accuracy: 0.4323999583721161
Epoch 12, CIFAR-10 Batch 1:  Loss: 1.5365490913391113
Accuracy: 0.44159996509552
Epoch 12, CIFAR-10 Batch 2:  Loss: 1.4998383522033691
Accuracy: 0.45339998602867126
Epoch 12, CIFAR-10 Batch 3:  Loss: 1.331249475479126
Accuracy: 0.4559999704360962
Epoch 12, CIFAR-10 Batch 4:  Loss: 1.38797926902771
Accuracy: 0.45979994535446167
Epoch 12, CIFAR-10 Batch 5:  Loss: 1.4324486255645752
Accuracy: 0.44020000100135803
Epoch 13, CIFAR-10 Batch 1:  Loss: 1.5034502744674683
Accuracy: 0.4578000009059906
Epoch 13, CIFAR-10 Batch 2:  Loss: 1.4681172370910645
Accuracy: 0.4631999731063843
Epoch 13, CIFAR-10 Batch 3:  Loss: 1.2970253229141235
Accuracy: 0.4695999324321747
Epoch 13, CIFAR-10 Batch 4:  Loss: 1.3561328649520874
Accuracy: 0.4657999575138092
Epoch 13, CIFAR-10 Batch 5:  Loss: 1.3986973762512207
Accuracy: 0.46459999680519104
Epoch 14, CIFAR-10 Batch 1:  Loss: 1.4701786041259766
Accuracy: 0.4617999792098999
Epoch 14, CIFAR-10 Batch 2:  Loss: 1.3997160196304321
Accuracy: 0.47360000014305115
Epoch 14, CIFAR-10 Batch 3:  Loss: 1.2705295085906982
Accuracy: 0.47999995946884155
Epoch 14, CIFAR-10 Batch 4:  Loss: 1.334731936454773
Accuracy: 0.47579994797706604
Epoch 14, CIFAR-10 Batch 5:  Loss: 1.3537280559539795
Accuracy: 0.47099995613098145
Epoch 15, CIFAR-10 Batch 1:  Loss: 1.4548348188400269
Accuracy: 0.47099992632865906
Epoch 15, CIFAR-10 Batch 2:  Loss: 1.3855232000350952
Accuracy: 0.48239991068840027
Epoch 15, CIFAR-10 Batch 3:  Loss: 1.2269172668457031
Accuracy: 0.4893999695777893
Epoch 15, CIFAR-10 Batch 4:  Loss: 1.3235220909118652
Accuracy: 0.4859999120235443
Epoch 15, CIFAR-10 Batch 5:  Loss: 1.3195377588272095
Accuracy: 0.494799941778183
Epoch 16, CIFAR-10 Batch 1:  Loss: 1.4039626121520996
Accuracy: 0.49559998512268066
Epoch 16, CIFAR-10 Batch 2:  Loss: 1.3410069942474365
Accuracy: 0.49639996886253357
Epoch 16, CIFAR-10 Batch 3:  Loss: 1.1961922645568848
Accuracy: 0.5005999803543091
Epoch 16, CIFAR-10 Batch 4:  Loss: 1.303510308265686
Accuracy: 0.4899999499320984
Epoch 16, CIFAR-10 Batch 5:  Loss: 1.305029273033142
Accuracy: 0.5033999681472778
Epoch 17, CIFAR-10 Batch 1:  Loss: 1.388779878616333
Accuracy: 0.4929999113082886
Epoch 17, CIFAR-10 Batch 2:  Loss: 1.3450772762298584
Accuracy: 0.4975999593734741
Epoch 17, CIFAR-10 Batch 3:  Loss: 1.1853891611099243
Accuracy: 0.5025999546051025
Epoch 17, CIFAR-10 Batch 4:  Loss: 1.2764694690704346
Accuracy: 0.4949999451637268
Epoch 17, CIFAR-10 Batch 5:  Loss: 1.2411192655563354
Accuracy: 0.5117999315261841
Epoch 18, CIFAR-10 Batch 1:  Loss: 1.3619805574417114
Accuracy: 0.5035999417304993
Epoch 18, CIFAR-10 Batch 2:  Loss: 1.3138693571090698
Accuracy: 0.5083999633789062
Epoch 18, CIFAR-10 Batch 3:  Loss: 1.161639928817749
Accuracy: 0.5127999186515808
Epoch 18, CIFAR-10 Batch 4:  Loss: 1.2113995552062988
Accuracy: 0.5151999592781067
Epoch 18, CIFAR-10 Batch 5:  Loss: 1.220860242843628
Accuracy: 0.5149999260902405
Epoch 19, CIFAR-10 Batch 1:  Loss: 1.3408929109573364
Accuracy: 0.5185999274253845
Epoch 19, CIFAR-10 Batch 2:  Loss: 1.3017395734786987
Accuracy: 0.5059999227523804
Epoch 19, CIFAR-10 Batch 3:  Loss: 1.148461937904358
Accuracy: 0.5245999097824097
Epoch 19, CIFAR-10 Batch 4:  Loss: 1.187922716140747
Accuracy: 0.5201998949050903
Epoch 19, CIFAR-10 Batch 5:  Loss: 1.240034580230713
Accuracy: 0.5181999802589417
Epoch 20, CIFAR-10 Batch 1:  Loss: 1.3344670534133911
Accuracy: 0.5135999321937561
Epoch 20, CIFAR-10 Batch 2:  Loss: 1.2851662635803223
Accuracy: 0.514799952507019
Epoch 20, CIFAR-10 Batch 3:  Loss: 1.1259186267852783
Accuracy: 0.5215999484062195
Epoch 20, CIFAR-10 Batch 4:  Loss: 1.158904790878296
Accuracy: 0.5313999652862549
Epoch 20, CIFAR-10 Batch 5:  Loss: 1.1862280368804932
Accuracy: 0.5245999097824097
Epoch 21, CIFAR-10 Batch 1:  Loss: 1.3139666318893433
Accuracy: 0.5212000012397766
Epoch 21, CIFAR-10 Batch 2:  Loss: 1.2944971323013306
Accuracy: 0.5039999485015869
Epoch 21, CIFAR-10 Batch 3:  Loss: 1.1116204261779785
Accuracy: 0.5231999158859253
Epoch 21, CIFAR-10 Batch 4:  Loss: 1.1455817222595215
Accuracy: 0.526199996471405
Epoch 21, CIFAR-10 Batch 5:  Loss: 1.1562800407409668
Accuracy: 0.5365999937057495
Epoch 22, CIFAR-10 Batch 1:  Loss: 1.2769991159439087
Accuracy: 0.5311999320983887
Epoch 22, CIFAR-10 Batch 2:  Loss: 1.2702252864837646
Accuracy: 0.49999991059303284
Epoch 22, CIFAR-10 Batch 3:  Loss: 1.0797555446624756
Accuracy: 0.5375999212265015
Epoch 22, CIFAR-10 Batch 4:  Loss: 1.1373727321624756
Accuracy: 0.5265999436378479
Epoch 22, CIFAR-10 Batch 5:  Loss: 1.1528886556625366
Accuracy: 0.539199948310852
Epoch 23, CIFAR-10 Batch 1:  Loss: 1.2864161729812622
Accuracy: 0.5315999388694763
Epoch 23, CIFAR-10 Batch 2:  Loss: 1.2551653385162354
Accuracy: 0.5145999193191528
Epoch 23, CIFAR-10 Batch 3:  Loss: 1.0725902318954468
Accuracy: 0.5371999740600586
Epoch 23, CIFAR-10 Batch 4:  Loss: 1.1126155853271484
Accuracy: 0.5351999402046204
Epoch 23, CIFAR-10 Batch 5:  Loss: 1.1659091711044312
Accuracy: 0.5297999382019043
Epoch 24, CIFAR-10 Batch 1:  Loss: 1.2646286487579346
Accuracy: 0.5335999131202698
Epoch 24, CIFAR-10 Batch 2:  Loss: 1.2152743339538574
Accuracy: 0.5261999368667603
Epoch 24, CIFAR-10 Batch 3:  Loss: 1.0590795278549194
Accuracy: 0.5445999503135681
Epoch 24, CIFAR-10 Batch 4:  Loss: 1.103790044784546
Accuracy: 0.5367999076843262
Epoch 24, CIFAR-10 Batch 5:  Loss: 1.1473066806793213
Accuracy: 0.5315999984741211
Epoch 25, CIFAR-10 Batch 1:  Loss: 1.2390321493148804
Accuracy: 0.5407999753952026
Epoch 25, CIFAR-10 Batch 2:  Loss: 1.1843528747558594
Accuracy: 0.5205999612808228
Epoch 25, CIFAR-10 Batch 3:  Loss: 1.045661449432373
Accuracy: 0.5521999001502991
Epoch 25, CIFAR-10 Batch 4:  Loss: 1.0769693851470947
Accuracy: 0.5489999651908875
Epoch 25, CIFAR-10 Batch 5:  Loss: 1.0926032066345215
Accuracy: 0.5491999387741089
Epoch 26, CIFAR-10 Batch 1:  Loss: 1.2331596612930298
Accuracy: 0.5409999489784241
Epoch 26, CIFAR-10 Batch 2:  Loss: 1.1690809726715088
Accuracy: 0.5493999123573303
Epoch 26, CIFAR-10 Batch 3:  Loss: 1.0608620643615723
Accuracy: 0.5435999631881714
Epoch 26, CIFAR-10 Batch 4:  Loss: 1.0575385093688965
Accuracy: 0.5509999394416809
Epoch 26, CIFAR-10 Batch 5:  Loss: 1.0737295150756836
Accuracy: 0.5525999665260315
Epoch 27, CIFAR-10 Batch 1:  Loss: 1.2023253440856934
Accuracy: 0.5513999462127686
Epoch 27, CIFAR-10 Batch 2:  Loss: 1.1249961853027344
Accuracy: 0.5461999177932739
Epoch 27, CIFAR-10 Batch 3:  Loss: 1.0381057262420654
Accuracy: 0.5481998920440674
Epoch 27, CIFAR-10 Batch 4:  Loss: 1.033244013786316
Accuracy: 0.5547999143600464
Epoch 27, CIFAR-10 Batch 5:  Loss: 1.059263825416565
Accuracy: 0.5579999685287476
Epoch 28, CIFAR-10 Batch 1:  Loss: 1.208022952079773
Accuracy: 0.549799919128418
Epoch 28, CIFAR-10 Batch 2:  Loss: 1.1387524604797363
Accuracy: 0.549799919128418
Epoch 28, CIFAR-10 Batch 3:  Loss: 1.0000916719436646
Accuracy: 0.5533999800682068
Epoch 28, CIFAR-10 Batch 4:  Loss: 1.0021194219589233
Accuracy: 0.5605999231338501
Epoch 28, CIFAR-10 Batch 5:  Loss: 1.0273146629333496
Accuracy: 0.5591999888420105
Epoch 29, CIFAR-10 Batch 1:  Loss: 1.1831772327423096
Accuracy: 0.5521999597549438
Epoch 29, CIFAR-10 Batch 2:  Loss: 1.0879467725753784
Accuracy: 0.5595999360084534
Epoch 29, CIFAR-10 Batch 3:  Loss: 1.0538554191589355
Accuracy: 0.5369999408721924
Epoch 29, CIFAR-10 Batch 4:  Loss: 1.0276662111282349
Accuracy: 0.5621999502182007
Epoch 29, CIFAR-10 Batch 5:  Loss: 1.0281212329864502
Accuracy: 0.5657998919487
Epoch 30, CIFAR-10 Batch 1:  Loss: 1.1746821403503418
Accuracy: 0.5541999340057373
Epoch 30, CIFAR-10 Batch 2:  Loss: 1.0772907733917236
Accuracy: 0.5609999299049377
Epoch 30, CIFAR-10 Batch 3:  Loss: 1.001535177230835
Accuracy: 0.5543999075889587
Epoch 30, CIFAR-10 Batch 4:  Loss: 1.0217013359069824
Accuracy: 0.561799943447113
Epoch 30, CIFAR-10 Batch 5:  Loss: 1.0310790538787842
Accuracy: 0.5571998953819275
Epoch 31, CIFAR-10 Batch 1:  Loss: 1.1395783424377441
Accuracy: 0.5627999305725098
Epoch 31, CIFAR-10 Batch 2:  Loss: 1.0819720029830933
Accuracy: 0.5605999231338501
Epoch 31, CIFAR-10 Batch 3:  Loss: 0.9922069311141968
Accuracy: 0.5539999008178711
Epoch 31, CIFAR-10 Batch 4:  Loss: 1.0118319988250732
Accuracy: 0.5629999041557312
Epoch 31, CIFAR-10 Batch 5:  Loss: 1.0242466926574707
Accuracy: 0.5605999231338501
Epoch 32, CIFAR-10 Batch 1:  Loss: 1.1533122062683105
Accuracy: 0.558199942111969
Epoch 32, CIFAR-10 Batch 2:  Loss: 1.0524804592132568
Accuracy: 0.5665999054908752
Epoch 32, CIFAR-10 Batch 3:  Loss: 0.980884850025177
Accuracy: 0.5589998960494995
Epoch 32, CIFAR-10 Batch 4:  Loss: 1.0001230239868164
Accuracy: 0.5609999299049377
Epoch 32, CIFAR-10 Batch 5:  Loss: 0.9707255363464355
Accuracy: 0.5721999406814575
Epoch 33, CIFAR-10 Batch 1:  Loss: 1.094424843788147
Accuracy: 0.5723999738693237
Epoch 33, CIFAR-10 Batch 2:  Loss: 1.021077036857605
Accuracy: 0.5729999542236328
Epoch 33, CIFAR-10 Batch 3:  Loss: 0.930464506149292
Accuracy: 0.5721999406814575
Epoch 33, CIFAR-10 Batch 4:  Loss: 0.9566107392311096
Accuracy: 0.5717999339103699
Epoch 33, CIFAR-10 Batch 5:  Loss: 0.9727429151535034
Accuracy: 0.5703998804092407
Epoch 34, CIFAR-10 Batch 1:  Loss: 1.0960946083068848
Accuracy: 0.5625998973846436
Epoch 34, CIFAR-10 Batch 2:  Loss: 1.02163565158844
Accuracy: 0.5765999555587769
Epoch 34, CIFAR-10 Batch 3:  Loss: 0.9356513023376465
Accuracy: 0.5775998830795288
Epoch 34, CIFAR-10 Batch 4:  Loss: 0.9578460454940796
Accuracy: 0.5699999332427979
Epoch 34, CIFAR-10 Batch 5:  Loss: 0.9702461957931519
Accuracy: 0.5669999122619629
Epoch 35, CIFAR-10 Batch 1:  Loss: 1.0801620483398438
Accuracy: 0.5663999319076538
Epoch 35, CIFAR-10 Batch 2:  Loss: 0.9826191663742065
Accuracy: 0.5805999636650085
Epoch 35, CIFAR-10 Batch 3:  Loss: 0.9196344614028931
Accuracy: 0.5707999467849731
Epoch 35, CIFAR-10 Batch 4:  Loss: 0.9697281122207642
Accuracy: 0.5661998987197876
Epoch 35, CIFAR-10 Batch 5:  Loss: 0.9603524804115295
Accuracy: 0.574999988079071
Epoch 36, CIFAR-10 Batch 1:  Loss: 1.0285736322402954
Accuracy: 0.5761999487876892
Epoch 36, CIFAR-10 Batch 2:  Loss: 0.9945148229598999
Accuracy: 0.5763999223709106
Epoch 36, CIFAR-10 Batch 3:  Loss: 0.8935932517051697
Accuracy: 0.5781998634338379
Epoch 36, CIFAR-10 Batch 4:  Loss: 0.9410237669944763
Accuracy: 0.5709999203681946
Epoch 36, CIFAR-10 Batch 5:  Loss: 0.9619677066802979
Accuracy: 0.5727999210357666
Epoch 37, CIFAR-10 Batch 1:  Loss: 1.0373934507369995
Accuracy: 0.5759998559951782
Epoch 37, CIFAR-10 Batch 2:  Loss: 0.9902430176734924
Accuracy: 0.5797999501228333
Epoch 37, CIFAR-10 Batch 3:  Loss: 0.9017811417579651
Accuracy: 0.5717999935150146
Epoch 37, CIFAR-10 Batch 4:  Loss: 0.9513282179832458
Accuracy: 0.5635998845100403
Epoch 37, CIFAR-10 Batch 5:  Loss: 1.0050239562988281
Accuracy: 0.5557999610900879
Epoch 38, CIFAR-10 Batch 1:  Loss: 1.0356336832046509
Accuracy: 0.5769999027252197
Epoch 38, CIFAR-10 Batch 2:  Loss: 0.9729406237602234
Accuracy: 0.5739998817443848
Epoch 38, CIFAR-10 Batch 3:  Loss: 0.8660759329795837
Accuracy: 0.5811999440193176
Epoch 38, CIFAR-10 Batch 4:  Loss: 0.9190993309020996
Accuracy: 0.5769999027252197
Epoch 38, CIFAR-10 Batch 5:  Loss: 0.9484561681747437
Accuracy: 0.5659999251365662
Epoch 39, CIFAR-10 Batch 1:  Loss: 0.9995683431625366
Accuracy: 0.5743999481201172
Epoch 39, CIFAR-10 Batch 2:  Loss: 0.9601972103118896
Accuracy: 0.5809999704360962
Epoch 39, CIFAR-10 Batch 3:  Loss: 0.8585291504859924
Accuracy: 0.5925999283790588
Epoch 39, CIFAR-10 Batch 4:  Loss: 0.8951642513275146
Accuracy: 0.5769999623298645
Epoch 39, CIFAR-10 Batch 5:  Loss: 0.9093440771102905
Accuracy: 0.5743999481201172
Epoch 40, CIFAR-10 Batch 1:  Loss: 0.9618939161300659
Accuracy: 0.5813999176025391
Epoch 40, CIFAR-10 Batch 2:  Loss: 0.9465867877006531
Accuracy: 0.5745999217033386
Epoch 40, CIFAR-10 Batch 3:  Loss: 0.8375347852706909
Accuracy: 0.5927999019622803
Epoch 40, CIFAR-10 Batch 4:  Loss: 0.8908629417419434
Accuracy: 0.5757999420166016
Epoch 40, CIFAR-10 Batch 5:  Loss: 0.9332102537155151
Accuracy: 0.5657999515533447
Epoch 41, CIFAR-10 Batch 1:  Loss: 1.0249629020690918
Accuracy: 0.5651999711990356
Epoch 41, CIFAR-10 Batch 2:  Loss: 0.96689373254776
Accuracy: 0.5725998878479004
Epoch 41, CIFAR-10 Batch 3:  Loss: 0.8688427209854126
Accuracy: 0.5833998918533325
Epoch 41, CIFAR-10 Batch 4:  Loss: 0.8780255317687988
Accuracy: 0.5817999243736267
Epoch 41, CIFAR-10 Batch 5:  Loss: 0.9096155166625977
Accuracy: 0.5723999738693237
Epoch 42, CIFAR-10 Batch 1:  Loss: 0.9939773082733154
Accuracy: 0.5709999203681946
Epoch 42, CIFAR-10 Batch 2:  Loss: 0.9314470887184143
Accuracy: 0.5835999250411987
Epoch 42, CIFAR-10 Batch 3:  Loss: 0.8482932448387146
Accuracy: 0.5865999460220337
Epoch 42, CIFAR-10 Batch 4:  Loss: 0.8721327185630798
Accuracy: 0.5851998925209045
Epoch 42, CIFAR-10 Batch 5:  Loss: 0.8986796140670776
Accuracy: 0.5827999711036682
Epoch 43, CIFAR-10 Batch 1:  Loss: 0.9742856025695801
Accuracy: 0.5815999507904053
Epoch 43, CIFAR-10 Batch 2:  Loss: 0.9164944887161255
Accuracy: 0.579599916934967
Epoch 43, CIFAR-10 Batch 3:  Loss: 0.8257826566696167
Accuracy: 0.5855998992919922
Epoch 43, CIFAR-10 Batch 4:  Loss: 0.8609651327133179
Accuracy: 0.5809999704360962
Epoch 43, CIFAR-10 Batch 5:  Loss: 0.8778703808784485
Accuracy: 0.5845999121665955
Epoch 44, CIFAR-10 Batch 1:  Loss: 0.9487231373786926
Accuracy: 0.5815999507904053
Epoch 44, CIFAR-10 Batch 2:  Loss: 0.9220314025878906
Accuracy: 0.5849999189376831
Epoch 44, CIFAR-10 Batch 3:  Loss: 0.8123381733894348
Accuracy: 0.5951999425888062
Epoch 44, CIFAR-10 Batch 4:  Loss: 0.8274081945419312
Accuracy: 0.5879999399185181
Epoch 44, CIFAR-10 Batch 5:  Loss: 0.8835380673408508
Accuracy: 0.5831999778747559
Epoch 45, CIFAR-10 Batch 1:  Loss: 0.9556437134742737
Accuracy: 0.578999936580658
Epoch 45, CIFAR-10 Batch 2:  Loss: 0.8912386298179626
Accuracy: 0.5965998768806458
Epoch 45, CIFAR-10 Batch 3:  Loss: 0.8045428991317749
Accuracy: 0.5897999405860901
Epoch 45, CIFAR-10 Batch 4:  Loss: 0.8324261903762817
Accuracy: 0.5905998945236206
Epoch 45, CIFAR-10 Batch 5:  Loss: 0.8677601218223572
Accuracy: 0.5837998986244202
Epoch 46, CIFAR-10 Batch 1:  Loss: 0.9555705189704895
Accuracy: 0.5749999284744263
Epoch 46, CIFAR-10 Batch 2:  Loss: 0.850996196269989
Accuracy: 0.5921999216079712
Epoch 46, CIFAR-10 Batch 3:  Loss: 0.7872698903083801
Accuracy: 0.5957999229431152
Epoch 46, CIFAR-10 Batch 4:  Loss: 0.8259141445159912
Accuracy: 0.5885999202728271
Epoch 46, CIFAR-10 Batch 5:  Loss: 0.8353768587112427
Accuracy: 0.591999888420105
Epoch 47, CIFAR-10 Batch 1:  Loss: 0.916469931602478
Accuracy: 0.5817999243736267
Epoch 47, CIFAR-10 Batch 2:  Loss: 0.8487566113471985
Accuracy: 0.5933999419212341
Epoch 47, CIFAR-10 Batch 3:  Loss: 0.7914491891860962
Accuracy: 0.5879999399185181
Epoch 47, CIFAR-10 Batch 4:  Loss: 0.8099326491355896
Accuracy: 0.5899998545646667
Epoch 47, CIFAR-10 Batch 5:  Loss: 0.831154465675354
Accuracy: 0.5899998545646667
Epoch 48, CIFAR-10 Batch 1:  Loss: 0.9351351261138916
Accuracy: 0.5861998796463013
Epoch 48, CIFAR-10 Batch 2:  Loss: 0.8456677794456482
Accuracy: 0.6025998592376709
Epoch 48, CIFAR-10 Batch 3:  Loss: 0.7932615280151367
Accuracy: 0.5879999399185181
Epoch 48, CIFAR-10 Batch 4:  Loss: 0.8186569213867188
Accuracy: 0.5869999527931213
Epoch 48, CIFAR-10 Batch 5:  Loss: 0.8248192667961121
Accuracy: 0.5929999351501465
Epoch 49, CIFAR-10 Batch 1:  Loss: 0.9178931713104248
Accuracy: 0.5837999582290649
Epoch 49, CIFAR-10 Batch 2:  Loss: 0.8433337211608887
Accuracy: 0.5951999425888062
Epoch 49, CIFAR-10 Batch 3:  Loss: 0.8038197755813599
Accuracy: 0.5923998951911926
Epoch 49, CIFAR-10 Batch 4:  Loss: 0.8331674337387085
Accuracy: 0.5745998620986938
Epoch 49, CIFAR-10 Batch 5:  Loss: 0.8285201191902161
Accuracy: 0.5857999324798584
Epoch 50, CIFAR-10 Batch 1:  Loss: 0.8986369967460632
Accuracy: 0.5899999141693115
Epoch 50, CIFAR-10 Batch 2:  Loss: 0.8533978462219238
Accuracy: 0.587399959564209
Epoch 50, CIFAR-10 Batch 3:  Loss: 0.8082410097122192
Accuracy: 0.5857999324798584
Epoch 50, CIFAR-10 Batch 4:  Loss: 0.8220840692520142
Accuracy: 0.5751999616622925
Epoch 50, CIFAR-10 Batch 5:  Loss: 0.8007887005805969
Accuracy: 0.5865999460220337
Epoch 51, CIFAR-10 Batch 1:  Loss: 0.9080939292907715
Accuracy: 0.5847998857498169
Epoch 51, CIFAR-10 Batch 2:  Loss: 0.8597026467323303
Accuracy: 0.5885999202728271
Epoch 51, CIFAR-10 Batch 3:  Loss: 0.7838227152824402
Accuracy: 0.5915999412536621
Epoch 51, CIFAR-10 Batch 4:  Loss: 0.7960761189460754
Accuracy: 0.5839999914169312
Epoch 51, CIFAR-10 Batch 5:  Loss: 0.7876166105270386
Accuracy: 0.5885999202728271
Epoch 52, CIFAR-10 Batch 1:  Loss: 0.8707050085067749
Accuracy: 0.5945999026298523
Epoch 52, CIFAR-10 Batch 2:  Loss: 0.8918219804763794
Accuracy: 0.5821998715400696
Epoch 52, CIFAR-10 Batch 3:  Loss: 0.7585785388946533
Accuracy: 0.600399911403656
Epoch 52, CIFAR-10 Batch 4:  Loss: 0.7862184643745422
Accuracy: 0.5839999318122864
Epoch 52, CIFAR-10 Batch 5:  Loss: 0.8019547462463379
Accuracy: 0.5827998518943787
Epoch 53, CIFAR-10 Batch 1:  Loss: 0.9054005146026611
Accuracy: 0.5899999141693115
Epoch 53, CIFAR-10 Batch 2:  Loss: 0.8536127209663391
Accuracy: 0.5923999547958374
Epoch 53, CIFAR-10 Batch 3:  Loss: 0.7643057703971863
Accuracy: 0.602199912071228
Epoch 53, CIFAR-10 Batch 4:  Loss: 0.81089186668396
Accuracy: 0.5819999575614929
Epoch 53, CIFAR-10 Batch 5:  Loss: 0.8061373829841614
Accuracy: 0.5923999547958374
Epoch 54, CIFAR-10 Batch 1:  Loss: 0.8740873336791992
Accuracy: 0.5927999019622803
Epoch 54, CIFAR-10 Batch 2:  Loss: 0.8606545329093933
Accuracy: 0.5923998951911926
Epoch 54, CIFAR-10 Batch 3:  Loss: 0.7497280240058899
Accuracy: 0.605199933052063
Epoch 54, CIFAR-10 Batch 4:  Loss: 0.7962896823883057
Accuracy: 0.5841999053955078
Epoch 54, CIFAR-10 Batch 5:  Loss: 0.8310375809669495
Accuracy: 0.5787999629974365
Epoch 55, CIFAR-10 Batch 1:  Loss: 0.854174017906189
Accuracy: 0.5891999006271362
Epoch 55, CIFAR-10 Batch 2:  Loss: 0.840709924697876
Accuracy: 0.595599889755249
Epoch 55, CIFAR-10 Batch 3:  Loss: 0.7507706880569458
Accuracy: 0.5949999094009399
Epoch 55, CIFAR-10 Batch 4:  Loss: 0.7696541547775269
Accuracy: 0.5879999399185181
Epoch 55, CIFAR-10 Batch 5:  Loss: 0.797068178653717
Accuracy: 0.5953999161720276
Epoch 56, CIFAR-10 Batch 1:  Loss: 0.8352367281913757
Accuracy: 0.5909999012947083
Epoch 56, CIFAR-10 Batch 2:  Loss: 0.8869938254356384
Accuracy: 0.5707998871803284
Epoch 56, CIFAR-10 Batch 3:  Loss: 0.7558261752128601
Accuracy: 0.5933999419212341
Epoch 56, CIFAR-10 Batch 4:  Loss: 0.7824114561080933
Accuracy: 0.5827999114990234
Epoch 56, CIFAR-10 Batch 5:  Loss: 0.7794389724731445
Accuracy: 0.5965999364852905
Epoch 57, CIFAR-10 Batch 1:  Loss: 0.8018303513526917
Accuracy: 0.5979999303817749
Epoch 57, CIFAR-10 Batch 2:  Loss: 0.8896214962005615
Accuracy: 0.571199893951416
Epoch 57, CIFAR-10 Batch 3:  Loss: 0.7645206451416016
Accuracy: 0.590799868106842
Epoch 57, CIFAR-10 Batch 4:  Loss: 0.8151804804801941
Accuracy: 0.572999894618988
Epoch 57, CIFAR-10 Batch 5:  Loss: 0.7588966488838196
Accuracy: 0.5987999439239502
Epoch 58, CIFAR-10 Batch 1:  Loss: 0.8048614859580994
Accuracy: 0.5973999500274658
Epoch 58, CIFAR-10 Batch 2:  Loss: 0.8231015205383301
Accuracy: 0.5893998742103577
Epoch 58, CIFAR-10 Batch 3:  Loss: 0.7375573515892029
Accuracy: 0.6077999472618103
Epoch 58, CIFAR-10 Batch 4:  Loss: 0.7452040314674377
Accuracy: 0.6043998599052429
Epoch 58, CIFAR-10 Batch 5:  Loss: 0.7360413074493408
Accuracy: 0.612799882888794
Epoch 59, CIFAR-10 Batch 1:  Loss: 0.7701840400695801
Accuracy: 0.6027998924255371
Epoch 59, CIFAR-10 Batch 2:  Loss: 0.8123863935470581
Accuracy: 0.596799910068512
Epoch 59, CIFAR-10 Batch 3:  Loss: 0.7217042446136475
Accuracy: 0.6019998788833618
Epoch 59, CIFAR-10 Batch 4:  Loss: 0.7409265637397766
Accuracy: 0.6073998808860779
Epoch 59, CIFAR-10 Batch 5:  Loss: 0.719189465045929
Accuracy: 0.611799955368042
Epoch 60, CIFAR-10 Batch 1:  Loss: 0.7553321719169617
Accuracy: 0.6101999282836914
Epoch 60, CIFAR-10 Batch 2:  Loss: 0.7847632169723511
Accuracy: 0.5925999879837036
Epoch 60, CIFAR-10 Batch 3:  Loss: 0.7073389887809753
Accuracy: 0.613399863243103
Epoch 60, CIFAR-10 Batch 4:  Loss: 0.728878378868103
Accuracy: 0.5981999039649963
Epoch 60, CIFAR-10 Batch 5:  Loss: 0.7201211452484131
Accuracy: 0.6115999221801758
Epoch 61, CIFAR-10 Batch 1:  Loss: 0.7576557397842407
Accuracy: 0.6073999404907227
Epoch 61, CIFAR-10 Batch 2:  Loss: 0.8002921938896179
Accuracy: 0.5885999202728271
Epoch 61, CIFAR-10 Batch 3:  Loss: 0.7032354474067688
Accuracy: 0.6107999682426453
Epoch 61, CIFAR-10 Batch 4:  Loss: 0.7020531892776489
Accuracy: 0.6191999316215515
Epoch 61, CIFAR-10 Batch 5:  Loss: 0.7167812585830688
Accuracy: 0.6107999086380005
Epoch 62, CIFAR-10 Batch 1:  Loss: 0.7472986578941345
Accuracy: 0.605199933052063
Epoch 62, CIFAR-10 Batch 2:  Loss: 0.7586398124694824
Accuracy: 0.6017999053001404
Epoch 62, CIFAR-10 Batch 3:  Loss: 0.6952897906303406
Accuracy: 0.6063998937606812
Epoch 62, CIFAR-10 Batch 4:  Loss: 0.708458662033081
Accuracy: 0.6137999296188354
Epoch 62, CIFAR-10 Batch 5:  Loss: 0.7235467433929443
Accuracy: 0.6089998483657837
Epoch 63, CIFAR-10 Batch 1:  Loss: 0.7445811629295349
Accuracy: 0.6107999682426453
Epoch 63, CIFAR-10 Batch 2:  Loss: 0.7710694074630737
Accuracy: 0.593799889087677
Epoch 63, CIFAR-10 Batch 3:  Loss: 0.703356146812439
Accuracy: 0.6069998741149902
Epoch 63, CIFAR-10 Batch 4:  Loss: 0.6959435343742371
Accuracy: 0.6101999282836914
Epoch 63, CIFAR-10 Batch 5:  Loss: 0.6877334117889404
Accuracy: 0.614599883556366
Epoch 64, CIFAR-10 Batch 1:  Loss: 0.7449659705162048
Accuracy: 0.5993999242782593
Epoch 64, CIFAR-10 Batch 2:  Loss: 0.7656837105751038
Accuracy: 0.5849998593330383
Epoch 64, CIFAR-10 Batch 3:  Loss: 0.7106620669364929
Accuracy: 0.5993999242782593
Epoch 64, CIFAR-10 Batch 4:  Loss: 0.704264223575592
Accuracy: 0.6055998802185059
Epoch 64, CIFAR-10 Batch 5:  Loss: 0.6919073462486267
Accuracy: 0.6093999147415161
Epoch 65, CIFAR-10 Batch 1:  Loss: 0.7265405058860779
Accuracy: 0.5973999500274658
Epoch 65, CIFAR-10 Batch 2:  Loss: 0.7938323616981506
Accuracy: 0.5901999473571777
Epoch 65, CIFAR-10 Batch 3:  Loss: 0.6798508167266846
Accuracy: 0.6047999262809753
Epoch 65, CIFAR-10 Batch 4:  Loss: 0.6807630062103271
Accuracy: 0.6135998964309692
Epoch 65, CIFAR-10 Batch 5:  Loss: 0.671312689781189
Accuracy: 0.6027998924255371
Epoch 66, CIFAR-10 Batch 1:  Loss: 0.7082055807113647
Accuracy: 0.6057999134063721
Epoch 66, CIFAR-10 Batch 2:  Loss: 0.7720198631286621
Accuracy: 0.5919999480247498
Epoch 66, CIFAR-10 Batch 3:  Loss: 0.7459042072296143
Accuracy: 0.592799961566925
Epoch 66, CIFAR-10 Batch 4:  Loss: 0.697917640209198
Accuracy: 0.6063998937606812
Epoch 66, CIFAR-10 Batch 5:  Loss: 0.6727396249771118
Accuracy: 0.6039999723434448
Epoch 67, CIFAR-10 Batch 1:  Loss: 0.6852396726608276
Accuracy: 0.6133999228477478
Epoch 67, CIFAR-10 Batch 2:  Loss: 0.7754674553871155
Accuracy: 0.5879998803138733
Epoch 67, CIFAR-10 Batch 3:  Loss: 0.7271060943603516
Accuracy: 0.6005999445915222
Epoch 67, CIFAR-10 Batch 4:  Loss: 0.699964702129364
Accuracy: 0.611799955368042
Epoch 67, CIFAR-10 Batch 5:  Loss: 0.6747146844863892
Accuracy: 0.6055999398231506
Epoch 68, CIFAR-10 Batch 1:  Loss: 0.6890732049942017
Accuracy: 0.604200005531311
Epoch 68, CIFAR-10 Batch 2:  Loss: 0.7428317666053772
Accuracy: 0.608799934387207
Epoch 68, CIFAR-10 Batch 3:  Loss: 0.7002039551734924
Accuracy: 0.598599910736084
Epoch 68, CIFAR-10 Batch 4:  Loss: 0.7650289535522461
Accuracy: 0.5915999412536621
Epoch 68, CIFAR-10 Batch 5:  Loss: 0.6747499108314514
Accuracy: 0.6117998957633972
Epoch 69, CIFAR-10 Batch 1:  Loss: 0.6786004900932312
Accuracy: 0.6103999018669128
Epoch 69, CIFAR-10 Batch 2:  Loss: 0.73206627368927
Accuracy: 0.6109999418258667
Epoch 69, CIFAR-10 Batch 3:  Loss: 0.728716254234314
Accuracy: 0.6005998849868774
Epoch 69, CIFAR-10 Batch 4:  Loss: 0.7082545757293701
Accuracy: 0.6063999533653259
Epoch 69, CIFAR-10 Batch 5:  Loss: 0.718620240688324
Accuracy: 0.5937998294830322
Epoch 70, CIFAR-10 Batch 1:  Loss: 0.7180863618850708
Accuracy: 0.610599935054779
Epoch 70, CIFAR-10 Batch 2:  Loss: 0.7977471351623535
Accuracy: 0.5757998824119568
Epoch 70, CIFAR-10 Batch 3:  Loss: 0.6964613795280457
Accuracy: 0.6115999221801758
Epoch 70, CIFAR-10 Batch 4:  Loss: 0.6732594966888428
Accuracy: 0.6153998970985413
Epoch 70, CIFAR-10 Batch 5:  Loss: 0.6714450120925903
Accuracy: 0.6007999181747437
Epoch 71, CIFAR-10 Batch 1:  Loss: 0.6979813575744629
Accuracy: 0.613399863243103
Epoch 71, CIFAR-10 Batch 2:  Loss: 0.7758666276931763
Accuracy: 0.5897998809814453
Epoch 71, CIFAR-10 Batch 3:  Loss: 0.6966779232025146
Accuracy: 0.6013999581336975
Epoch 71, CIFAR-10 Batch 4:  Loss: 0.6689870357513428
Accuracy: 0.6157999038696289
Epoch 71, CIFAR-10 Batch 5:  Loss: 0.6885764598846436
Accuracy: 0.5943999290466309
Epoch 72, CIFAR-10 Batch 1:  Loss: 0.6916587948799133
Accuracy: 0.606999933719635
Epoch 72, CIFAR-10 Batch 2:  Loss: 0.7909512519836426
Accuracy: 0.5871999263763428
Epoch 72, CIFAR-10 Batch 3:  Loss: 0.6916224956512451
Accuracy: 0.5927999019622803
Epoch 72, CIFAR-10 Batch 4:  Loss: 0.7002488374710083
Accuracy: 0.614599883556366
Epoch 72, CIFAR-10 Batch 5:  Loss: 0.672966480255127
Accuracy: 0.6003999710083008
Epoch 73, CIFAR-10 Batch 1:  Loss: 0.6957487463951111
Accuracy: 0.6041998863220215
Epoch 73, CIFAR-10 Batch 2:  Loss: 0.7382797002792358
Accuracy: 0.5971999168395996
Epoch 73, CIFAR-10 Batch 3:  Loss: 0.6981697082519531
Accuracy: 0.5929999351501465
Epoch 73, CIFAR-10 Batch 4:  Loss: 0.6689131259918213
Accuracy: 0.6175999641418457
Epoch 73, CIFAR-10 Batch 5:  Loss: 0.6716704368591309
Accuracy: 0.5933998823165894
Epoch 74, CIFAR-10 Batch 1:  Loss: 0.6589395403862
Accuracy: 0.6029999256134033
Epoch 74, CIFAR-10 Batch 2:  Loss: 0.7051798701286316
Accuracy: 0.600399911403656
Epoch 74, CIFAR-10 Batch 3:  Loss: 0.6778781414031982
Accuracy: 0.5975999236106873
Epoch 74, CIFAR-10 Batch 4:  Loss: 0.6512851715087891
Accuracy: 0.6093999147415161
Epoch 74, CIFAR-10 Batch 5:  Loss: 0.6444223523139954
Accuracy: 0.6059999465942383
Epoch 75, CIFAR-10 Batch 1:  Loss: 0.6485205888748169
Accuracy: 0.6093999147415161
Epoch 75, CIFAR-10 Batch 2:  Loss: 0.7165626287460327
Accuracy: 0.602400004863739
Epoch 75, CIFAR-10 Batch 3:  Loss: 0.6879436373710632
Accuracy: 0.5943999290466309
Epoch 75, CIFAR-10 Batch 4:  Loss: 0.6879680156707764
Accuracy: 0.6039998531341553
Epoch 75, CIFAR-10 Batch 5:  Loss: 0.6586769819259644
Accuracy: 0.6095998883247375
Epoch 76, CIFAR-10 Batch 1:  Loss: 0.7000489234924316
Accuracy: 0.5869998931884766
Epoch 76, CIFAR-10 Batch 2:  Loss: 0.6817396283149719
Accuracy: 0.6069998741149902
Epoch 76, CIFAR-10 Batch 3:  Loss: 0.6637300252914429
Accuracy: 0.5887999534606934
Epoch 76, CIFAR-10 Batch 4:  Loss: 0.6505124568939209
Accuracy: 0.6141999363899231
Epoch 76, CIFAR-10 Batch 5:  Loss: 0.6151851415634155
Accuracy: 0.6117998957633972
Epoch 77, CIFAR-10 Batch 1:  Loss: 0.6797032952308655
Accuracy: 0.6025998592376709
Epoch 77, CIFAR-10 Batch 2:  Loss: 0.7004138827323914
Accuracy: 0.5995999574661255
Epoch 77, CIFAR-10 Batch 3:  Loss: 0.652447521686554
Accuracy: 0.597399890422821
Epoch 77, CIFAR-10 Batch 4:  Loss: 0.6859069466590881
Accuracy: 0.6055999398231506
Epoch 77, CIFAR-10 Batch 5:  Loss: 0.6114071607589722
Accuracy: 0.6121999025344849
Epoch 78, CIFAR-10 Batch 1:  Loss: 0.6373865604400635
Accuracy: 0.6097999215126038
Epoch 78, CIFAR-10 Batch 2:  Loss: 0.6825782656669617
Accuracy: 0.6075999140739441
Epoch 78, CIFAR-10 Batch 3:  Loss: 0.6154797673225403
Accuracy: 0.6125999093055725
Epoch 78, CIFAR-10 Batch 4:  Loss: 0.6636402606964111
Accuracy: 0.6067999005317688
Epoch 78, CIFAR-10 Batch 5:  Loss: 0.5954704880714417
Accuracy: 0.6143999695777893
Epoch 79, CIFAR-10 Batch 1:  Loss: 0.6672543287277222
Accuracy: 0.6011999249458313
Epoch 79, CIFAR-10 Batch 2:  Loss: 0.6725432872772217
Accuracy: 0.6089999079704285
Epoch 79, CIFAR-10 Batch 3:  Loss: 0.6247918605804443
Accuracy: 0.607999861240387
Epoch 79, CIFAR-10 Batch 4:  Loss: 0.6376395225524902
Accuracy: 0.6075999140739441
Epoch 79, CIFAR-10 Batch 5:  Loss: 0.5857803821563721
Accuracy: 0.6119999289512634
Epoch 80, CIFAR-10 Batch 1:  Loss: 0.6367892026901245
Accuracy: 0.6021999716758728
Epoch 80, CIFAR-10 Batch 2:  Loss: 0.7175338268280029
Accuracy: 0.5971999168395996
Epoch 80, CIFAR-10 Batch 3:  Loss: 0.6236100196838379
Accuracy: 0.6075999140739441
Epoch 80, CIFAR-10 Batch 4:  Loss: 0.648184597492218
Accuracy: 0.6093999147415161
Epoch 80, CIFAR-10 Batch 5:  Loss: 0.6047731637954712
Accuracy: 0.6055998802185059
Epoch 81, CIFAR-10 Batch 1:  Loss: 0.6604344248771667
Accuracy: 0.6047998666763306
Epoch 81, CIFAR-10 Batch 2:  Loss: 0.7628571391105652
Accuracy: 0.5727999210357666
Epoch 81, CIFAR-10 Batch 3:  Loss: 0.6203940510749817
Accuracy: 0.6089999079704285
Epoch 81, CIFAR-10 Batch 4:  Loss: 0.6494494676589966
Accuracy: 0.6095999479293823
Epoch 81, CIFAR-10 Batch 5:  Loss: 0.6211603879928589
Accuracy: 0.6053999066352844
Epoch 82, CIFAR-10 Batch 1:  Loss: 0.7111682891845703
Accuracy: 0.5969999432563782
Epoch 82, CIFAR-10 Batch 2:  Loss: 0.7490618228912354
Accuracy: 0.579800009727478
Epoch 82, CIFAR-10 Batch 3:  Loss: 0.6673431992530823
Accuracy: 0.6041998863220215
Epoch 82, CIFAR-10 Batch 4:  Loss: 0.6767010688781738
Accuracy: 0.5991998910903931
Epoch 82, CIFAR-10 Batch 5:  Loss: 0.614227831363678
Accuracy: 0.6041998863220215
Epoch 83, CIFAR-10 Batch 1:  Loss: 0.6775831580162048
Accuracy: 0.6005999445915222
Epoch 83, CIFAR-10 Batch 2:  Loss: 0.6798211336135864
Accuracy: 0.6031998991966248
Epoch 83, CIFAR-10 Batch 3:  Loss: 0.5923741459846497
Accuracy: 0.6183999180793762
Epoch 83, CIFAR-10 Batch 4:  Loss: 0.6655900478363037
Accuracy: 0.6169998645782471
Epoch 83, CIFAR-10 Batch 5:  Loss: 0.6249979734420776
Accuracy: 0.6045998930931091
Epoch 84, CIFAR-10 Batch 1:  Loss: 0.6627495288848877
Accuracy: 0.5951999425888062
Epoch 84, CIFAR-10 Batch 2:  Loss: 0.6590653657913208
Accuracy: 0.6023998856544495
Epoch 84, CIFAR-10 Batch 3:  Loss: 0.6249649524688721
Accuracy: 0.6115999221801758
Epoch 84, CIFAR-10 Batch 4:  Loss: 0.6202149987220764
Accuracy: 0.6225998997688293
Epoch 84, CIFAR-10 Batch 5:  Loss: 0.6058585047721863
Accuracy: 0.6089999079704285
Epoch 85, CIFAR-10 Batch 1:  Loss: 0.6054605841636658
Accuracy: 0.6075998544692993
Epoch 85, CIFAR-10 Batch 2:  Loss: 0.6573390960693359
Accuracy: 0.6053999066352844
Epoch 85, CIFAR-10 Batch 3:  Loss: 0.6555970311164856
Accuracy: 0.5977998971939087
Epoch 85, CIFAR-10 Batch 4:  Loss: 0.6162811517715454
Accuracy: 0.6075998544692993
Epoch 85, CIFAR-10 Batch 5:  Loss: 0.5670382976531982
Accuracy: 0.6149998903274536
Epoch 86, CIFAR-10 Batch 1:  Loss: 0.6121599674224854
Accuracy: 0.6107999086380005
Epoch 86, CIFAR-10 Batch 2:  Loss: 0.6247626543045044
Accuracy: 0.6103999614715576
Epoch 86, CIFAR-10 Batch 3:  Loss: 0.6185280084609985
Accuracy: 0.5983998775482178
Epoch 86, CIFAR-10 Batch 4:  Loss: 0.5946686267852783
Accuracy: 0.6179998517036438
Epoch 86, CIFAR-10 Batch 5:  Loss: 0.5829805135726929
Accuracy: 0.601599931716919
Epoch 87, CIFAR-10 Batch 1:  Loss: 0.5900936126708984
Accuracy: 0.6103999614715576
Epoch 87, CIFAR-10 Batch 2:  Loss: 0.6001125574111938
Accuracy: 0.6137999296188354
Epoch 87, CIFAR-10 Batch 3:  Loss: 0.7081781029701233
Accuracy: 0.5793998837471008
Epoch 87, CIFAR-10 Batch 4:  Loss: 0.6046320796012878
Accuracy: 0.6175999045372009
Epoch 87, CIFAR-10 Batch 5:  Loss: 0.6201643347740173
Accuracy: 0.5901999473571777
Epoch 88, CIFAR-10 Batch 1:  Loss: 0.6024314165115356
Accuracy: 0.6143999099731445
Epoch 88, CIFAR-10 Batch 2:  Loss: 0.5962972640991211
Accuracy: 0.6131998896598816
Epoch 88, CIFAR-10 Batch 3:  Loss: 0.6326159238815308
Accuracy: 0.5885999202728271
Epoch 88, CIFAR-10 Batch 4:  Loss: 0.6304265856742859
Accuracy: 0.6103999018669128
Epoch 88, CIFAR-10 Batch 5:  Loss: 0.6122735738754272
Accuracy: 0.5937999486923218
Epoch 89, CIFAR-10 Batch 1:  Loss: 0.6008129119873047
Accuracy: 0.6079999208450317
Epoch 89, CIFAR-10 Batch 2:  Loss: 0.5867737531661987
Accuracy: 0.610599935054779
Epoch 89, CIFAR-10 Batch 3:  Loss: 0.6151114106178284
Accuracy: 0.5851999521255493
Epoch 89, CIFAR-10 Batch 4:  Loss: 0.6613045930862427
Accuracy: 0.5965998768806458
Epoch 89, CIFAR-10 Batch 5:  Loss: 0.5937601923942566
Accuracy: 0.6087998747825623
Epoch 90, CIFAR-10 Batch 1:  Loss: 0.6264822483062744
Accuracy: 0.5963999032974243
Epoch 90, CIFAR-10 Batch 2:  Loss: 0.6548365354537964
Accuracy: 0.5787999033927917
Epoch 90, CIFAR-10 Batch 3:  Loss: 0.6109287142753601
Accuracy: 0.6103999614715576
Epoch 90, CIFAR-10 Batch 4:  Loss: 0.6420559287071228
Accuracy: 0.5991998910903931
Epoch 90, CIFAR-10 Batch 5:  Loss: 0.5759457945823669
Accuracy: 0.618399977684021
Epoch 91, CIFAR-10 Batch 1:  Loss: 0.6439650058746338
Accuracy: 0.5959999561309814
Epoch 91, CIFAR-10 Batch 2:  Loss: 0.5970332026481628
Accuracy: 0.6073999404907227
Epoch 91, CIFAR-10 Batch 3:  Loss: 0.6106422543525696
Accuracy: 0.6105998754501343
Epoch 91, CIFAR-10 Batch 4:  Loss: 0.618037223815918
Accuracy: 0.6041998863220215
Epoch 91, CIFAR-10 Batch 5:  Loss: 0.5523041486740112
Accuracy: 0.621799886226654
Epoch 92, CIFAR-10 Batch 1:  Loss: 0.6090563535690308
Accuracy: 0.5991998910903931
Epoch 92, CIFAR-10 Batch 2:  Loss: 0.6213269233703613
Accuracy: 0.6009999513626099
Epoch 92, CIFAR-10 Batch 3:  Loss: 0.5793441534042358
Accuracy: 0.6055998802185059
Epoch 92, CIFAR-10 Batch 4:  Loss: 0.6015868186950684
Accuracy: 0.6047999262809753
Epoch 92, CIFAR-10 Batch 5:  Loss: 0.540429413318634
Accuracy: 0.6141998767852783
Epoch 93, CIFAR-10 Batch 1:  Loss: 0.5728280544281006
Accuracy: 0.6071999073028564
Epoch 93, CIFAR-10 Batch 2:  Loss: 0.5793380737304688
Accuracy: 0.6111999154090881
Epoch 93, CIFAR-10 Batch 3:  Loss: 0.5717971920967102
Accuracy: 0.607999861240387
Epoch 93, CIFAR-10 Batch 4:  Loss: 0.6428052186965942
Accuracy: 0.5921999216079712
Epoch 93, CIFAR-10 Batch 5:  Loss: 0.5374500751495361
Accuracy: 0.619999885559082
Epoch 94, CIFAR-10 Batch 1:  Loss: 0.5704935789108276
Accuracy: 0.6093999147415161
Epoch 94, CIFAR-10 Batch 2:  Loss: 0.5863072276115417
Accuracy: 0.6153998970985413
Epoch 94, CIFAR-10 Batch 3:  Loss: 0.5947328805923462
Accuracy: 0.6053999662399292
Epoch 94, CIFAR-10 Batch 4:  Loss: 0.6022171974182129
Accuracy: 0.5935999155044556
Epoch 94, CIFAR-10 Batch 5:  Loss: 0.5204883813858032
Accuracy: 0.6183999180793762
Epoch 95, CIFAR-10 Batch 1:  Loss: 0.5696545243263245
Accuracy: 0.6131998896598816
Epoch 95, CIFAR-10 Batch 2:  Loss: 0.5641316175460815
Accuracy: 0.6153998970985413
Epoch 95, CIFAR-10 Batch 3:  Loss: 0.5915483236312866
Accuracy: 0.6161999106407166
Epoch 95, CIFAR-10 Batch 4:  Loss: 0.5802979469299316
Accuracy: 0.6025999188423157
Epoch 95, CIFAR-10 Batch 5:  Loss: 0.5593999624252319
Accuracy: 0.6045999526977539
Epoch 96, CIFAR-10 Batch 1:  Loss: 0.5640511512756348
Accuracy: 0.6085999011993408
Epoch 96, CIFAR-10 Batch 2:  Loss: 0.5655643343925476
Accuracy: 0.6175999045372009
Epoch 96, CIFAR-10 Batch 3:  Loss: 0.5371377468109131
Accuracy: 0.6125999093055725
Epoch 96, CIFAR-10 Batch 4:  Loss: 0.5368645787239075
Accuracy: 0.6099998950958252
Epoch 96, CIFAR-10 Batch 5:  Loss: 0.5160567760467529
Accuracy: 0.6131998896598816
Epoch 97, CIFAR-10 Batch 1:  Loss: 0.5772199630737305
Accuracy: 0.6053999066352844
Epoch 97, CIFAR-10 Batch 2:  Loss: 0.5590757727622986
Accuracy: 0.614599883556366
Epoch 97, CIFAR-10 Batch 3:  Loss: 0.5405239462852478
Accuracy: 0.6105998754501343
Epoch 97, CIFAR-10 Batch 4:  Loss: 0.5422168970108032
Accuracy: 0.6031998991966248
Epoch 97, CIFAR-10 Batch 5:  Loss: 0.5271194577217102
Accuracy: 0.6071999073028564
Epoch 98, CIFAR-10 Batch 1:  Loss: 0.5425102710723877
Accuracy: 0.606999933719635
Epoch 98, CIFAR-10 Batch 2:  Loss: 0.5393474698066711
Accuracy: 0.613599956035614
Epoch 98, CIFAR-10 Batch 3:  Loss: 0.5395009517669678
Accuracy: 0.6089999675750732
Epoch 98, CIFAR-10 Batch 4:  Loss: 0.5293261408805847
Accuracy: 0.6097999215126038
Epoch 98, CIFAR-10 Batch 5:  Loss: 0.5044466257095337
Accuracy: 0.6029999256134033
Epoch 99, CIFAR-10 Batch 1:  Loss: 0.563018798828125
Accuracy: 0.6055998802185059
Epoch 99, CIFAR-10 Batch 2:  Loss: 0.5658721327781677
Accuracy: 0.6161999702453613
Epoch 99, CIFAR-10 Batch 3:  Loss: 0.5271870493888855
Accuracy: 0.6119999289512634
Epoch 99, CIFAR-10 Batch 4:  Loss: 0.5220988392829895
Accuracy: 0.6123999357223511
Epoch 99, CIFAR-10 Batch 5:  Loss: 0.5114074349403381
Accuracy: 0.6035999059677124
Epoch 100, CIFAR-10 Batch 1:  Loss: 0.5345227122306824
Accuracy: 0.6019999384880066
Epoch 100, CIFAR-10 Batch 2:  Loss: 0.5572676062583923
Accuracy: 0.6059999465942383
Epoch 100, CIFAR-10 Batch 3:  Loss: 0.5369055271148682
Accuracy: 0.6015998721122742
Epoch 100, CIFAR-10 Batch 4:  Loss: 0.5666159391403198
Accuracy: 0.598599910736084
Epoch 100, CIFAR-10 Batch 5:  Loss: 0.5421890020370483
Accuracy: 0.5941999554634094

Checkpoint

The model has been saved to disk.
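
For reference, saving in TensorFlow 1.x is typically done with a tf.train.Saver. The following is a minimal, self-contained sketch of that pattern (an assumption about how the training cell wrote its checkpoint, not the project's actual code):

import tensorflow as tf

# Build a trivial graph just to demonstrate the checkpoint format.
g = tf.Graph()
with g.as_default():
    v = tf.Variable(0.0, name='v')
    init = tf.global_variables_initializer()
    saver = tf.train.Saver()

with tf.Session(graph=g) as sess:
    sess.run(init)
    # Writes the variable values plus a .meta file holding the serialized
    # graph -- the same pair of artifacts test_model() loads below.
    saver.save(sess, './sketch_model')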

Test Model

Test your model against the test dataset. This is your final accuracy, and it should be greater than 50%. If it isn't, keep tweaking the model architecture and parameters.


In [18]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
%matplotlib inline
%config InlineBackend.figure_format = 'retina'

import tensorflow as tf
import pickle
import helper
import random

# Set batch size if not already set
try:
    if batch_size:
        pass
except NameError:
    batch_size = 64

save_model_path = './image_classification'
n_samples = 4
top_n_predictions = 3

def test_model():
    """
    Test the saved model against the test dataset
    """

    test_features, test_labels = pickle.load(open('preprocess_test.p', mode='rb'))
    loaded_graph = tf.Graph()

    with tf.Session(graph=loaded_graph) as sess:
        # Load model
        loader = tf.train.import_meta_graph(save_model_path + '.meta')
        loader.restore(sess, save_model_path)

        # Get Tensors from loaded model
        loaded_x = loaded_graph.get_tensor_by_name('x:0')
        loaded_y = loaded_graph.get_tensor_by_name('y:0')
        loaded_keep_prob = loaded_graph.get_tensor_by_name('keep_prob:0')
        loaded_logits = loaded_graph.get_tensor_by_name('logits:0')
        loaded_acc = loaded_graph.get_tensor_by_name('accuracy:0')
        
        # Compute accuracy in batches to avoid memory limitations
        test_batch_acc_total = 0
        test_batch_count = 0
        
        for test_feature_batch, test_label_batch in helper.batch_features_labels(test_features, test_labels, batch_size):
            test_batch_acc_total += sess.run(
                loaded_acc,
                feed_dict={loaded_x: test_feature_batch, loaded_y: test_label_batch, loaded_keep_prob: 1.0})
            test_batch_count += 1

        print('Testing Accuracy: {}\n'.format(test_batch_acc_total/test_batch_count))

        # Print Random Samples
        random_test_features, random_test_labels = tuple(zip(*random.sample(list(zip(test_features, test_labels)), n_samples)))
        random_test_predictions = sess.run(
            tf.nn.top_k(tf.nn.softmax(loaded_logits), top_n_predictions),
            feed_dict={loaded_x: random_test_features, loaded_y: random_test_labels, loaded_keep_prob: 1.0})
        helper.display_image_predictions(random_test_features, random_test_labels, random_test_predictions)


test_model()


Testing Accuracy: 0.5991555601358414
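
One subtlety in the cell above: dividing test_batch_acc_total by test_batch_count gives every batch equal weight, but with 10,000 test images and a batch size of 64 the last batch is only partially full, so the reported number is very slightly biased. A sample-weighted average removes that bias; the sketch below is a hypothetical variant of the same loop, assuming the session and loaded tensors from inside test_model():

# Weight each batch's accuracy by its size so the final partial batch
# does not count as a full one.
weighted_correct = 0.0
n_seen = 0
for test_feature_batch, test_label_batch in helper.batch_features_labels(
        test_features, test_labels, batch_size):
    batch_acc = sess.run(
        loaded_acc,
        feed_dict={loaded_x: test_feature_batch,
                   loaded_y: test_label_batch,
                   loaded_keep_prob: 1.0})
    weighted_correct += batch_acc * len(test_feature_batch)
    n_seen += len(test_feature_batch)

print('Sample-weighted Testing Accuracy: {}'.format(weighted_correct / n_seen))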

Why 50-80% Accuracy?

You might be wondering why you can't push the accuracy any higher. First things first: around 50% isn't bad for a simple CNN, since pure guessing on ten classes would only get you 10%. Still, you may have noticed people getting scores well above 80%. That's because we haven't yet taught you everything there is to know about neural networks; a few more techniques remain to be covered, and one of them is sketched below.
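
To make that concrete, here is a minimal sketch of one such technique, data augmentation: randomly mirroring training images left-to-right so the network sees more varied examples of each class. The helper function and its use here are illustrative assumptions, not part of the project code:

import numpy as np

def augment_flip(features):
    """Randomly flip each image left-to-right with probability 0.5.

    Hypothetical helper: features is an (N, 32, 32, 3) array, as produced
    by the preprocessing step.
    """
    flipped = features.copy()
    mask = np.random.rand(len(features)) < 0.5
    # Reverse the width axis (axis 2) for the selected images only.
    flipped[mask] = flipped[mask][:, :, ::-1, :]
    return flipped

Applied to each mini-batch just before the training sess.run call, this effectively enlarges the training set at no storage cost; combined with deeper architectures and other techniques covered later, it is part of how CIFAR-10 models reach accuracies well above 80%.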

Submitting This Project

When submitting this project, make sure to run all the cells before saving the notebook. Save the notebook as "dlnd_image_classification.ipynb", then export it as an HTML file via "File" -> "Download as". Include the "helper.py" and "problem_unittests.py" files in your submission.


In [ ]: