Image Classification

In this project, you'll classify images from the CIFAR-10 dataset. The dataset consists of airplanes, dogs, cats, and other objects. You'll preprocess the images, then train a convolutional neural network on all the samples. The images need to be normalized and the labels need to be one-hot encoded. You'll get to apply what you learned and build a convolutional, max pooling, dropout, and fully connected layers. At the end, you'll get to see your neural network's predictions on the sample images.

Get the Data

Run the following cell to download the CIFAR-10 dataset for python.


In [1]:
"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
from urllib.request import urlretrieve
from os.path import isfile, isdir
from tqdm import tqdm
import problem_unittests as tests
import tarfile

cifar10_dataset_folder_path = 'cifar-10-batches-py'

class DLProgress(tqdm):
    last_block = 0

    def hook(self, block_num=1, block_size=1, total_size=None):
        self.total = total_size
        self.update((block_num - self.last_block) * block_size)
        self.last_block = block_num

if not isfile('cifar-10-python.tar.gz'):
    with DLProgress(unit='B', unit_scale=True, miniters=1, desc='CIFAR-10 Dataset') as pbar:
        urlretrieve(
            'https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz',
            'cifar-10-python.tar.gz',
            pbar.hook)

if not isdir(cifar10_dataset_folder_path):
    with tarfile.open('cifar-10-python.tar.gz') as tar:
        tar.extractall()
        tar.close()


tests.test_folder_path(cifar10_dataset_folder_path)


All files found!

Explore the Data

The dataset is broken into batches to prevent your machine from running out of memory. The CIFAR-10 dataset consists of 5 batches, named data_batch_1, data_batch_2, etc.. Each batch contains the labels and images that are one of the following:

  • airplane 1
  • automobile 2
  • bird 3
  • cat 4
  • deer 5
  • dog 6
  • frog 7
  • horse 8
  • ship 9
  • truck 10

  • Total 10 classes (Aras changed above/this section a bit)

Understanding a dataset is part of making predictions on the data. Play around with the code cell below by changing the batch_id and sample_id. The batch_id is the id for a batch (1-5). The sample_id is the id for a image and label pair in the batch.

Ask yourself "What are all possible labels?", "What is the range of values for the image data?", "Are the labels in order or random?". Answers to questions like these will help you preprocess the data and end up with better predictions.


In [2]:
%matplotlib inline
%config InlineBackend.figure_format = 'retina'

import helper
import numpy as np

# Explore the dataset
batch_id = 1
sample_id = 5
helper.display_stats(cifar10_dataset_folder_path, batch_id, sample_id)


Stats of batch 1:
Samples: 10000
Label Counts: {0: 1005, 1: 974, 2: 1032, 3: 1016, 4: 999, 5: 937, 6: 1030, 7: 1001, 8: 1025, 9: 981}
First 20 Labels: [6, 9, 9, 4, 1, 1, 2, 7, 8, 3, 4, 7, 7, 2, 9, 9, 9, 3, 2, 6]

Example of Image 5:
Image - Min Value: 0 Max Value: 252
Image - Shape: (32, 32, 3)
Label - Label Id: 1 Name: automobile

Implement Preprocess Functions

Normalize

In the cell below, implement the normalize function to take in image data, x, and return it as a normalized Numpy array. The values should be in the range of 0 to 1, inclusive. The return object should be the same shape as x.


In [3]:
def normalize(x):
    """
    Normalize a list of sample image data in the range of 0 to 1
    : x: List of image data.  The image shape is (32, 32, 3)
    : return: Numpy array of normalize data
    """
    # TODO: Implement Function
    ## image data shape = [t, i,j,k], t= num_img_per_batch (basically the list of images), i,j,k=height,width, and depth/channel
    return x/255


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_normalize(normalize)


Tests Passed

One-hot encode

Just like the previous code cell, you'll be implementing a function for preprocessing. This time, you'll implement the one_hot_encode function. The input, x, are a list of labels. Implement the function to return the list of labels as One-Hot encoded Numpy array. The possible values for labels are 0 to 9. The one-hot encoding function should return the same encoding for each value between each call to one_hot_encode. Make sure to save the map of encodings outside the function.

Hint: Don't reinvent the wheel.


In [4]:
# import helper ## I did this because sklearn.preprocessing was defined in there
from sklearn import preprocessing  ## from sklearn lib import preprocessing lib/sublib/functionality/class

def one_hot_encode(x):
    """
    One hot encode a list of sample labels. Return a one-hot encoded vector for each label.
    : x: List of sample Labels
    : return: Numpy array of one-hot encoded labels
    """
    # TODO: Implement Function

    ## This was in the helper.py which belongs to the generic helper functions
    #     def display_image_predictions(features, labels, predictions):
    #     n_classes = 10
    #     label_names = _load_label_names()
    #     label_binarizer = LabelBinarizer()
    #     label_binarizer.fit(range(n_classes))
    #     label_ids = label_binarizer.inverse_transform(np.array(labels))
    label_binarizer = preprocessing.LabelBinarizer() ## instantiate and initialized the one-hot encoder from class to one-hot
    n_class = 10 ## total num_classes
    label_binarizer.fit(range(n_class)) ## fit the one-vec to the range of number of classes, 10 in this case (dataset)
    return label_binarizer.transform(x) ## transform the class labels to one-hot vec


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_one_hot_encode(one_hot_encode)


Tests Passed

Randomize Data

As you saw from exploring the data above, the order of the samples are randomized. It doesn't hurt to randomize it again, but you don't need to for this dataset.

Preprocess all the data and save it

Running the code cell below will preprocess all the CIFAR-10 data and save it to file. The code below also uses 10% of the training data for validation.


In [5]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
# Preprocess Training, Validation, and Testing Data
helper.preprocess_and_save_data(cifar10_dataset_folder_path, normalize, one_hot_encode)

Check Point

This is your first checkpoint. If you ever decide to come back to this notebook or have to restart the notebook, you can start from here. The preprocessed data has been saved to disk.


In [6]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import pickle
import problem_unittests as tests
import helper

# Load the Preprocessed Validation data
valid_features, valid_labels = pickle.load(open('preprocess_validation.p', mode='rb'))

Build the network

For the neural network, you'll build each layer into a function. Most of the code you've seen has been outside of functions. To test your code more thoroughly, we require that you put each layer in a function. This allows us to give you better feedback and test for simple mistakes using our unittests before you submit your project.

If you're finding it hard to dedicate enough time for this course a week, we've provided a small shortcut to this part of the project. In the next couple of problems, you'll have the option to use TensorFlow Layers or TensorFlow Layers (contrib) to build each layer, except "Convolutional & Max Pooling" layer. TF Layers is similar to Keras's and TFLearn's abstraction to layers, so it's easy to pickup.

If you would like to get the most of this course, try to solve all the problems without TF Layers. Let's begin!

Input

The neural network needs to read the image data, one-hot encoded labels, and dropout keep probability. Implement the following functions

  • Implement neural_net_image_input
    • Return a TF Placeholder
    • Set the shape using image_shape with batch size set to None.
    • Name the TensorFlow placeholder "x" using the TensorFlow name parameter in the TF Placeholder.
  • Implement neural_net_label_input
    • Return a TF Placeholder
    • Set the shape using n_classes with batch size set to None.
    • Name the TensorFlow placeholder "y" using the TensorFlow name parameter in the TF Placeholder.
  • Implement neural_net_keep_prob_input
    • Return a TF Placeholder for dropout keep probability.
    • Name the TensorFlow placeholder "keep_prob" using the TensorFlow name parameter in the TF Placeholder.

These names will be used at the end of the project to load your saved model.

Note: None for shapes in TensorFlow allow for a dynamic size.


In [7]:
import tensorflow as tf

def neural_net_image_input(image_shape):
    """
    Return a Tensor for a bach of image input
    : image_shape: Shape of the images
    : return: Tensor for image input.
    """
    # TODO: Implement Function
    ## name the placeholder?? why should I do that? I can return it directly like below
    ## It is also worth mentioning that the overal image_shape = [i,j,k] meaning row, col, channels/depth or 
    ## i, j, k
    ## h, w, depth (deep-wide learning)
    ## r, c, channels
    ## y, x, z
    ## Data_structure AKA data_shape are usually defined dshape = [i, j, k] as a tensor/Mat/Vec or even a scalar
    ## This is kind of tricky: image_shape is probablly pointing at the img_hight, img_width, and image_depth as well
    ## x_tensor is probably the input image or images or input batch
    return tf.placeholder(tf.float32, 
                          [None, image_shape[0], image_shape[1], image_shape[2]], 
                          name='x')


def neural_net_label_input(n_classes):
    """
    Return a Tensor for a batch of label input
    : n_classes: Number of classes
    : return: Tensor for label input.
    """
    # TODO: Implement Function
    ## tf.placehoolder(data_type, data_structure (data_shape))
    return tf.placeholder(tf.int32, [None, n_classes], name='y') ## the t/batch_size/num_img_per_batch = None & n/num_dimension = n_classes


def neural_net_keep_prob_input():
    """
    Return a Tensor for keep probability
    : return: Tensor for keep probability.
    """
    # TODO: Implement Function
    ## Basically the main command should be like tf.placeholder(tf.float32, [None, 1]) since it is a scalar but I can also use a variable for it as well.
    return tf.placeholder(dtype=tf.float32, name='keep_prob') ## this is basically a scalar but it is data_type/dtype is not INT but float since it is a probability value 0-1 (is it really??).


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tf.reset_default_graph()
tests.test_nn_image_inputs(neural_net_image_input)
tests.test_nn_label_inputs(neural_net_label_input)
tests.test_nn_keep_prob_inputs(neural_net_keep_prob_input)


Image Input Tests Passed.
Label Input Tests Passed.
Keep Prob Tests Passed.

Convolution and Max Pooling Layer

Convolution layers have a lot of success with images. For this code cell, you should implement the function conv2d_maxpool to apply convolution then max pooling:

  • Create the weight and bias using conv_ksize, conv_num_outputs and the shape of x_tensor.
  • Apply a convolution to x_tensor using weight and conv_strides.
    • We recommend you use same padding, but you're welcome to use any padding.
  • Add bias
  • Add a nonlinear activation to the convolution.
  • Apply Max Pooling using pool_ksize and pool_strides.
    • We recommend you use same padding, but you're welcome to use any padding.

Note: You can't use TensorFlow Layers or TensorFlow Layers (contrib) for this layer. You're free to use any TensorFlow package for all the other layers.


In [8]:
def conv2d_maxpool(x_tensor, conv_num_outputs, conv_ksize, conv_strides, pool_ksize, pool_strides):
    """
    Apply convolution then max pooling to x_tensor
    :param x_tensor: TensorFlow Tensor
    :param conv_num_outputs: Number of outputs for the convolutional layer
    :param conv_ksize: kernal size 2-D Tuple for the convolutional layer ## Aras: This might be probablly 3-D Tuple???
    :param conv_strides: Stride 2-D Tuple for convolution
    :param pool_ksize: kernal size 2-D Tuple for pool
    :param pool_strides: Stride 2-D Tuple for pool
    : return: A tensor that represents convolution and max pooling of x_tensor
    """
    # TODO: Implement Function
    x_tensor_shape = x_tensor.get_shape() ## x_tensor is 4D tensor [t/0, i/1, j/2, k/3] => x_tensor 4D tensor
    k = x_tensor.get_shape()[3:4].num_elements() ## there shoudl be a simpler way!! num_elements/length
    w_shape = [conv_ksize[0], conv_ksize[1], k, conv_num_outputs] # conv_num_output 1D tuple or a scalar
    w = tf.Variable(tf.truncated_normal(w_shape, stddev=0.05))
    
    ## conv_kernel_strides
    stride = [1, conv_strides[0], conv_strides[1], 1] # t == batch/num_data-img-images, i=hight/row/y, j=width/x/col, k=depth is usually 1
    y_tensor = tf.nn.conv2d(x_tensor, w, stride, padding='VALID') ## apply convolution

    ## Let's create biases
    #     b = tf.Variable(tf.zeros(conv_num_outputs))
    b = tf.zeros(conv_num_outputs)
    #conv_layer = tf.nn.bias_add(conv_layer, bias) ## add biases
    y_tensor += b
    
    ## adding relu function/activate function and the output is the h_tensor hidden layer output
    h_tensor = tf.nn.relu(y_tensor) ## apply non-linearity, i.e. ReLU function
    
    ## pooling: can be max pooling or can be average pooling. Do not know any other kind.
    ## pool_ksize is a 2D tuple = [i, j]
    kernel = [1, pool_ksize[0], pool_ksize[1], 1] # the same as stride, it is 4D tuple [batch=1, ksize, kchannels=1]
    ## pool_strides is a 2D tuple [i, j]
    stride = [1, pool_strides[0], pool_ksize[1], 1] # [batch=1 (usually one input tensor per batch), ksize (2d tuple), k=1 (depth/num of channels)]
    return tf.nn.max_pool(h_tensor, kernel, stride, padding='VALID')

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_con_pool(conv2d_maxpool)


Tests Passed

Flatten Layer

Implement the flatten function to change the dimension of x_tensor from a 4-D tensor to a 2-D tensor. The output should be the shape (Batch Size, Flattened Image Size). You can use TensorFlow Layers or TensorFlow Layers (contrib) for this layer.


In [9]:
def flatten(x_tensor):
    """
    Flatten x_tensor to (Batch Size, Flattened Image Size)
    : x_tensor: A tensor of size (Batch Size, ...), where ... are the image dimensions.
    : return: A tensor of size (Batch Size, Flattened Image Size).
    """
    # TODO: Implement Function
    # x_tensor shape = [batch, img_w, img_height, img_depth]
    # img_flat = img_w * img_h * img_d
    n_size = x_tensor.get_shape()[1:4].num_elements() #1,2,3
    ## The final output should be x_tensor_flat = [t_size, n_size] in which t_size is totally unchanged
    #t_size = x_tensor.get_shape()[0:1].num_elements() # only [0] which ic sthe first dimension/num_elements in 1st dimension
    # now we should do reshape 
    return tf.reshape(x_tensor, [-1, n_size])

# def conv_net(x, weights, biases, dropout):
#     # Layer 1 - 28*28*1 to 14*14*32
#     conv1 = conv2d(x, weights['wc1'], biases['bc1'])
#     conv1 = maxpool2d(conv1, k=2)

#     # Layer 2 - 14*14*32 to 7*7*64
#     conv2 = conv2d(conv1, weights['wc2'], biases['bc2'])
#     conv2 = maxpool2d(conv2, k=2)

#     # Fully connected layer - 7*7*64 to 1024
#     fc1 = tf.reshape(conv2, [-1, weights['wd1'].get_shape().as_list()[0]])
#     fc1 = tf.add(tf.matmul(fc1, weights['wd1']), biases['bd1'])
#     fc1 = tf.nn.relu(fc1)
#     fc1 = tf.nn.dropout(fc1, dropout)

#     # Output Layer - class prediction - 1024 to 10
#     out = tf.add(tf.matmul(fc1, weights['out']), biases['out'])
#     return out


########## This is how Siraj implemented this layer.
# def flatten_layer(layer):
#     # Get the shape of the input layer.
#     layer_shape = layer.get_shape()

#     # The shape of the input layer is assumed to be:
#     # layer_shape == [num_images, img_height, img_width, num_channels]

#     # The number of features is: img_height * img_width * num_channels
#     # We can use a function from TensorFlow to calculate this.
#     num_features = layer_shape[1:4].num_elements()
    
#     # Reshape the layer to [num_images, num_features].
#     # Note that we just set the size of the second dimension
#     # to num_features and the size of the first dimension to -1
#     # which means the size in that dimension is calculated
#     # so the total size of the tensor is unchanged from the reshaping.
#     layer_flat = tf.reshape(layer, [-1, num_features])

#     # The shape of the flattened layer is now:
#     # [num_images, img_height * img_width * num_channels]

#     # Return both the flattened layer and the number of features.
#     return layer_flat, num_features

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_flatten(flatten)


Tests Passed

Fully-Connected Layer

Implement the fully_conn function to apply a fully connected layer to x_tensor with the shape (Batch Size, num_outputs). You can use TensorFlow Layers or TensorFlow Layers (contrib) for this layer.


In [10]:
def fully_conn(x_tensor, num_outputs):
    """
    Apply a fully connected layer to x_tensor using weight and bias
    : x_tensor: A 2-D tensor where the first dimension is batch size.
    : num_outputs: The number of output that the new tensor should be.
    : return: A 2-D tensor where the second dimension is num_outputs.
    """
    # TODO: Implement Function
    # This is basically a typical MLP input units and hidden units/ neurons with input data dimensions/variables
    # w*x+b
    ## X_tensor is two dimensional tensor [t, n]. That is why w shape is [n, num_outputs]
    ## Since this is a FC (fully connected layer), n is 1 since the size of input units are equal to 1.
    ### In this layer, wx+b and then we apply ReLU/Sigmoid, and ..
    ## Let's define out w with variables
    n_size = x_tensor.get_shape()[1:2].num_elements() # [t, n]
    w = tf.Variable(tf.truncated_normal(shape=[n_size, num_outputs], stddev=0.05)) # normal dist function has mean and stddev
    b = tf.zeros(num_outputs)
    y_tensor = tf.matmul(x_tensor, w) + b
    ### Apply ReLU activatinon/non-linearity
    return tf.nn.relu(y_tensor)


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_fully_conn(fully_conn)


Tests Passed

Output Layer

Implement the output function to apply a fully connected layer to x_tensor with the shape (Batch Size, num_outputs). You can use TensorFlow Layers or TensorFlow Layers (contrib) for this layer.

Note: Activation, softmax, or cross entropy shouldn't be applied to this.


In [11]:
def output(x_tensor, num_outputs):
    """
    Apply a output layer to x_tensor using weight and bias
    : x_tensor: A 2-D tensor where the first dimension is batch size.
    : num_outputs: The number of output that the new tensor should be.
    : return: A 2-D tensor where the second dimension is num_outputs.
    """
    # TODO: Implement Function
    ## The only differemce between this layer and the previous fully connected layer is the softmax for classification.
    ## Instead of ReLU, softmax should be used.
    n_size = x_tensor.get_shape()[1:2].num_elements() ## to get the num_features
    ## Let's assign w and b for wx+b
    w = tf.Variable(tf.truncated_normal(shape=[n_size, num_outputs], stddev=0.05))
    b = tf.zeros(num_outputs)
    return tf.matmul(x_tensor, w) + b
    #return tf.nn.softmax(y_tensor) ## this should NOT be applied because in error with cross entropy softmax will be applied.
    ## That is why only logits wx+b is needed for this layer.


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_output(output)


Tests Passed

Create Convolutional Model

Implement the function conv_net to create a convolutional neural network model. The function takes in a batch of images, x, and outputs logits. Use the layers you created above to create this model:

  • Apply 1, 2, or 3 Convolution and Max Pool layers
  • Apply a Flatten Layer
  • Apply 1, 2, or 3 Fully Connected Layers
  • Apply an Output Layer
  • Return the output
  • Apply TensorFlow's Dropout to one or more layers in the model using keep_prob.

In [18]:
def conv_net(x, keep_prob):
    """
    Create a convolutional neural network model
    : x: Placeholder tensor that holds image data.
    : keep_prob: Placeholder tensor that hold dropout keep probability.
    : return: Tensor that represents logits
    """
    # TODO: Apply 1, 2, or 3 Convolution and Max Pool layers

    # The 1st convolutional layer
    h_tensor = conv2d_maxpool(x_tensor=x, conv_num_outputs=20, conv_ksize=[3, 3], conv_strides=[1, 1], 
                              pool_ksize=[2, 2], pool_strides=[1, 1])
    
    ## The 2nd convolutional layer is added to increase the validation accuracy above 50%
    h_tensor = conv2d_maxpool(x_tensor=h_tensor, conv_num_outputs=20, conv_ksize=[3, 3], conv_strides=[1, 1], 
                              pool_ksize=[2, 2], pool_strides=[1, 1])

    # TODO: Apply a Flatten Layer
    x_tensor_flattened = flatten(x_tensor=h_tensor)
    

    # TODO: Apply 1, 2, or 3 Fully Connected Layers
    h_tensor = fully_conn(x_tensor=x_tensor_flattened, num_outputs=20)

    ## Where should Dropout be applied?
    ## tf.nn.dropout(x, keep_prob, noise_shape=None, seed=None, name=None)
    h_tensor = tf.nn.dropout(x=h_tensor, keep_prob=keep_prob, noise_shape=None, seed=None, name=None)
  
    # TODO: Apply an Output Layer
    # TODO: return output
    num_classes = 10  ## This is the toal number of classes for the clssification task
    return output(x_tensor=h_tensor, num_outputs=num_classes)


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""

##############################
## Build the Neural Network ##
##############################

# Remove previous weights, bias, inputs, etc..
tf.reset_default_graph()

# Inputs
x = neural_net_image_input((32, 32, 3))
y = neural_net_label_input(10)
keep_prob = neural_net_keep_prob_input()

# Model
logits = conv_net(x, keep_prob)

# Name logits Tensor, so that is can be loaded from disk after training
logits = tf.identity(logits, name='logits')

# Loss and Optimizer
cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=y))
optimizer = tf.train.AdamOptimizer().minimize(cost)

# Accuracy
correct_pred = tf.equal(tf.argmax(logits, 1), tf.argmax(y, 1))
accuracy = tf.reduce_mean(tf.cast(correct_pred, tf.float32), name='accuracy')

tests.test_conv_net(conv_net)


Neural Network Built!

Train the Neural Network

Single Optimization

Implement the function train_neural_network to do a single optimization. The optimization should use optimizer to optimize in session with a feed_dict of the following:

  • x for image input
  • y for labels
  • keep_prob for keep probability for dropout

This function will be called for each batch, so tf.global_variables_initializer() has already been called.

Note: Nothing needs to be returned. This function is only optimizing the neural network.


In [19]:
def train_neural_network(session, optimizer, keep_probability, feature_batch, label_batch):
    """
    Optimize the session on a batch of images and labels
    : session: Current TensorFlow session
    : optimizer: TensorFlow optimizer function
    : keep_probability: keep probability
    : feature_batch: Batch of Numpy image data
    : label_batch: Batch of Numpy label data
    """
    # TODO: Implement Function

    # Feed the dictionary (x, y, dropout_prob) from Numpy (Python) into TensorFlow (Tensors)
    feed_dict_train = {x:feature_batch, y:label_batch, keep_prob:keep_probability}

    # Run the optimizer on the fed training dict (TF training data).
    session.run(optimizer, feed_dict=feed_dict_train)
    
    pass
        
"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_train_nn(train_neural_network)


Tests Passed

Show Stats

Implement the function print_stats to print loss and validation accuracy. Use the global variables valid_features and valid_labels to calculate validation accuracy. Use a keep probability of 1.0 to calculate the loss and validation accuracy.


In [20]:
def print_stats(session, feature_batch, label_batch, cost, accuracy):
    """
    Print information about loss and validation accuracy
    : session: Current TensorFlow session
    : feature_batch: Batch of Numpy image data
    : label_batch: Batch of Numpy label data
    : cost: TensorFlow cost function
    : accuracy: TensorFlow accuracy function
    """
    # TODO: Implement Function
    
    ## Placeholders for input/feature and output/labels -> I am NOT sure if redefining the TFPlaceholders/tf.placeholders are neccessary again for this part.
    ## Feed_dict_train has to be redefined again I guess.
    feed_dict_train = {x:feature_batch, y:label_batch, keep_prob:1.0}
    ## Print out loss using TF cost function in a session
    cost_train = session.run(cost, feed_dict=feed_dict_train)
    accuracy_train = session.run(accuracy, feed_dict=feed_dict_train)
    print("trainging cost: ", cost_train, "accuracy: ", accuracy_train)

    ## Using the placeholders declared globally before and at the top
    feed_dict_valid = {x:valid_features, y:valid_labels, keep_prob:1.0}
    ## Print out validation accuracy using TF accuracy function with valid_features and valid_labels
    cost_valid = session.run(cost, feed_dict=feed_dict_valid)
    accuracy_valid = session.run(accuracy, feed_dict=feed_dict_valid)
    print("validation cost: ", cost_valid, "accuracy: ", accuracy_valid)
    
    
    pass

Hyperparameters

Tune the following parameters:

  • Set epochs to the number of iterations until the network stops learning or start overfitting
  • Set batch_size to the highest number that your machine has memory for. Most people set them to common sizes of memory:
    • 64
    • 128
    • 256
    • ...
  • Set keep_probability to the probability of keeping a node using dropout

In [21]:
# TODO: Tune Parameters
epochs = 100 #None
batch_size = 64 # recommanded as min #32 #L1d Cache from lsppu command #None
keep_probability = 0.5 #None

Train on a Single CIFAR-10 Batch

Instead of training the neural network on all the CIFAR-10 batches of data, let's use a single batch. This should save time while you iterate on the model to get a better accuracy. Once the final validation accuracy is 50% or greater, run the model on all the data in the next section.


In [22]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
print('Checking the Training on a Single Batch...')
with tf.Session() as sess:
    # Initializing the variables
    sess.run(tf.global_variables_initializer())
    
    # Training cycle
    for epoch in range(epochs):
        batch_i = 1
        for batch_features, batch_labels in helper.load_preprocess_training_batch(batch_i, batch_size):
            train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)
        print('Epoch {:>2}, CIFAR-10 Batch {}:  '.format(epoch + 1, batch_i), end='')
        print_stats(sess, batch_features, batch_labels, cost, accuracy)


Checking the Training on a Single Batch...
Epoch  1, CIFAR-10 Batch 1:  trainging cost:  2.20174 accuracy:  0.15
validation cost:  2.13537 accuracy:  0.1758
Epoch  2, CIFAR-10 Batch 1:  trainging cost:  2.17666 accuracy:  0.275
validation cost:  2.02108 accuracy:  0.243
Epoch  3, CIFAR-10 Batch 1:  trainging cost:  2.09295 accuracy:  0.3
validation cost:  1.88334 accuracy:  0.3128
Epoch  4, CIFAR-10 Batch 1:  trainging cost:  2.08236 accuracy:  0.275
validation cost:  1.80959 accuracy:  0.3402
Epoch  5, CIFAR-10 Batch 1:  trainging cost:  2.05333 accuracy:  0.45
validation cost:  1.75386 accuracy:  0.3608
Epoch  6, CIFAR-10 Batch 1:  trainging cost:  2.02974 accuracy:  0.25
validation cost:  1.70122 accuracy:  0.3838
Epoch  7, CIFAR-10 Batch 1:  trainging cost:  2.00613 accuracy:  0.325
validation cost:  1.71892 accuracy:  0.3846
Epoch  8, CIFAR-10 Batch 1:  trainging cost:  1.81111 accuracy:  0.425
validation cost:  1.62851 accuracy:  0.4318
Epoch  9, CIFAR-10 Batch 1:  trainging cost:  1.78442 accuracy:  0.35
validation cost:  1.58589 accuracy:  0.4372
Epoch 10, CIFAR-10 Batch 1:  trainging cost:  1.77105 accuracy:  0.425
validation cost:  1.57224 accuracy:  0.4406
Epoch 11, CIFAR-10 Batch 1:  trainging cost:  1.76993 accuracy:  0.45
validation cost:  1.57003 accuracy:  0.4294
Epoch 12, CIFAR-10 Batch 1:  trainging cost:  1.72282 accuracy:  0.425
validation cost:  1.53703 accuracy:  0.446
Epoch 13, CIFAR-10 Batch 1:  trainging cost:  1.65401 accuracy:  0.4
validation cost:  1.49834 accuracy:  0.4684
Epoch 14, CIFAR-10 Batch 1:  trainging cost:  1.61322 accuracy:  0.425
validation cost:  1.50801 accuracy:  0.4684
Epoch 15, CIFAR-10 Batch 1:  trainging cost:  1.57755 accuracy:  0.425
validation cost:  1.50121 accuracy:  0.4618
Epoch 16, CIFAR-10 Batch 1:  trainging cost:  1.55769 accuracy:  0.4
validation cost:  1.4751 accuracy:  0.4802
Epoch 17, CIFAR-10 Batch 1:  trainging cost:  1.60191 accuracy:  0.425
validation cost:  1.48322 accuracy:  0.4726
Epoch 18, CIFAR-10 Batch 1:  trainging cost:  1.58241 accuracy:  0.5
validation cost:  1.49168 accuracy:  0.4638
Epoch 19, CIFAR-10 Batch 1:  trainging cost:  1.50081 accuracy:  0.425
validation cost:  1.46349 accuracy:  0.4722
Epoch 20, CIFAR-10 Batch 1:  trainging cost:  1.49232 accuracy:  0.45
validation cost:  1.45729 accuracy:  0.4806
Epoch 21, CIFAR-10 Batch 1:  trainging cost:  1.53268 accuracy:  0.4
validation cost:  1.46573 accuracy:  0.4796
Epoch 22, CIFAR-10 Batch 1:  trainging cost:  1.50489 accuracy:  0.45
validation cost:  1.47224 accuracy:  0.4754
Epoch 23, CIFAR-10 Batch 1:  trainging cost:  1.44074 accuracy:  0.5
validation cost:  1.45907 accuracy:  0.4802
Epoch 24, CIFAR-10 Batch 1:  trainging cost:  1.44442 accuracy:  0.45
validation cost:  1.4686 accuracy:  0.4746
Epoch 25, CIFAR-10 Batch 1:  trainging cost:  1.3952 accuracy:  0.5
validation cost:  1.42829 accuracy:  0.4892
Epoch 26, CIFAR-10 Batch 1:  trainging cost:  1.40077 accuracy:  0.45
validation cost:  1.43788 accuracy:  0.4902
Epoch 27, CIFAR-10 Batch 1:  trainging cost:  1.3777 accuracy:  0.4
validation cost:  1.3948 accuracy:  0.5032
Epoch 28, CIFAR-10 Batch 1:  trainging cost:  1.2995 accuracy:  0.5
validation cost:  1.3934 accuracy:  0.5056
Epoch 29, CIFAR-10 Batch 1:  trainging cost:  1.33372 accuracy:  0.5
validation cost:  1.43761 accuracy:  0.4914
Epoch 30, CIFAR-10 Batch 1:  trainging cost:  1.27572 accuracy:  0.525
validation cost:  1.3999 accuracy:  0.502
Epoch 31, CIFAR-10 Batch 1:  trainging cost:  1.28804 accuracy:  0.525
validation cost:  1.42287 accuracy:  0.4952
Epoch 32, CIFAR-10 Batch 1:  trainging cost:  1.26566 accuracy:  0.55
validation cost:  1.41951 accuracy:  0.4958
Epoch 33, CIFAR-10 Batch 1:  trainging cost:  1.21211 accuracy:  0.55
validation cost:  1.40718 accuracy:  0.4968
Epoch 34, CIFAR-10 Batch 1:  trainging cost:  1.22901 accuracy:  0.5
validation cost:  1.37328 accuracy:  0.5104
Epoch 35, CIFAR-10 Batch 1:  trainging cost:  1.17884 accuracy:  0.575
validation cost:  1.41704 accuracy:  0.503
Epoch 36, CIFAR-10 Batch 1:  trainging cost:  1.16773 accuracy:  0.575
validation cost:  1.38475 accuracy:  0.5138
Epoch 37, CIFAR-10 Batch 1:  trainging cost:  1.14558 accuracy:  0.55
validation cost:  1.37611 accuracy:  0.5132
Epoch 38, CIFAR-10 Batch 1:  trainging cost:  1.18348 accuracy:  0.6
validation cost:  1.43404 accuracy:  0.5012
Epoch 39, CIFAR-10 Batch 1:  trainging cost:  1.12787 accuracy:  0.625
validation cost:  1.37944 accuracy:  0.5172
Epoch 40, CIFAR-10 Batch 1:  trainging cost:  1.09127 accuracy:  0.625
validation cost:  1.38837 accuracy:  0.5224
Epoch 41, CIFAR-10 Batch 1:  trainging cost:  1.07844 accuracy:  0.675
validation cost:  1.44533 accuracy:  0.4876
Epoch 42, CIFAR-10 Batch 1:  trainging cost:  1.0501 accuracy:  0.575
validation cost:  1.38627 accuracy:  0.5218
Epoch 43, CIFAR-10 Batch 1:  trainging cost:  1.05601 accuracy:  0.6
validation cost:  1.39706 accuracy:  0.5114
Epoch 44, CIFAR-10 Batch 1:  trainging cost:  0.999393 accuracy:  0.675
validation cost:  1.38801 accuracy:  0.5234
Epoch 45, CIFAR-10 Batch 1:  trainging cost:  0.950627 accuracy:  0.675
validation cost:  1.38025 accuracy:  0.5208
Epoch 46, CIFAR-10 Batch 1:  trainging cost:  0.981807 accuracy:  0.675
validation cost:  1.42468 accuracy:  0.5114
Epoch 47, CIFAR-10 Batch 1:  trainging cost:  0.942426 accuracy:  0.725
validation cost:  1.3931 accuracy:  0.5204
Epoch 48, CIFAR-10 Batch 1:  trainging cost:  0.971446 accuracy:  0.7
validation cost:  1.40836 accuracy:  0.5182
Epoch 49, CIFAR-10 Batch 1:  trainging cost:  0.944241 accuracy:  0.675
validation cost:  1.40713 accuracy:  0.5216
Epoch 50, CIFAR-10 Batch 1:  trainging cost:  0.912532 accuracy:  0.65
validation cost:  1.41401 accuracy:  0.5182
Epoch 51, CIFAR-10 Batch 1:  trainging cost:  0.879146 accuracy:  0.8
validation cost:  1.42428 accuracy:  0.514
Epoch 52, CIFAR-10 Batch 1:  trainging cost:  0.817731 accuracy:  0.75
validation cost:  1.41843 accuracy:  0.5152
Epoch 53, CIFAR-10 Batch 1:  trainging cost:  0.902369 accuracy:  0.8
validation cost:  1.42894 accuracy:  0.5182
Epoch 54, CIFAR-10 Batch 1:  trainging cost:  0.865044 accuracy:  0.7
validation cost:  1.42864 accuracy:  0.517
Epoch 55, CIFAR-10 Batch 1:  trainging cost:  0.854032 accuracy:  0.675
validation cost:  1.43666 accuracy:  0.5196
Epoch 56, CIFAR-10 Batch 1:  trainging cost:  0.81716 accuracy:  0.775
validation cost:  1.44778 accuracy:  0.5116
Epoch 57, CIFAR-10 Batch 1:  trainging cost:  0.803679 accuracy:  0.75
validation cost:  1.4326 accuracy:  0.521
Epoch 58, CIFAR-10 Batch 1:  trainging cost:  0.827864 accuracy:  0.775
validation cost:  1.45062 accuracy:  0.5244
Epoch 59, CIFAR-10 Batch 1:  trainging cost:  0.853289 accuracy:  0.775
validation cost:  1.44281 accuracy:  0.5236
Epoch 60, CIFAR-10 Batch 1:  trainging cost:  0.818087 accuracy:  0.725
validation cost:  1.4316 accuracy:  0.5238
Epoch 61, CIFAR-10 Batch 1:  trainging cost:  0.78061 accuracy:  0.8
validation cost:  1.4658 accuracy:  0.5234
Epoch 62, CIFAR-10 Batch 1:  trainging cost:  0.800008 accuracy:  0.725
validation cost:  1.48945 accuracy:  0.5102
Epoch 63, CIFAR-10 Batch 1:  trainging cost:  0.800268 accuracy:  0.75
validation cost:  1.47029 accuracy:  0.5176
Epoch 64, CIFAR-10 Batch 1:  trainging cost:  0.82055 accuracy:  0.725
validation cost:  1.53647 accuracy:  0.512
Epoch 65, CIFAR-10 Batch 1:  trainging cost:  0.797202 accuracy:  0.75
validation cost:  1.54935 accuracy:  0.5094
Epoch 66, CIFAR-10 Batch 1:  trainging cost:  0.758709 accuracy:  0.8
validation cost:  1.50623 accuracy:  0.5158
Epoch 67, CIFAR-10 Batch 1:  trainging cost:  0.806434 accuracy:  0.775
validation cost:  1.50965 accuracy:  0.5148
Epoch 68, CIFAR-10 Batch 1:  trainging cost:  0.758577 accuracy:  0.8
validation cost:  1.51077 accuracy:  0.5162
Epoch 69, CIFAR-10 Batch 1:  trainging cost:  0.755038 accuracy:  0.725
validation cost:  1.55356 accuracy:  0.5118
Epoch 70, CIFAR-10 Batch 1:  trainging cost:  0.729448 accuracy:  0.775
validation cost:  1.54655 accuracy:  0.5086
Epoch 71, CIFAR-10 Batch 1:  trainging cost:  0.748678 accuracy:  0.75
validation cost:  1.61496 accuracy:  0.4976
Epoch 72, CIFAR-10 Batch 1:  trainging cost:  0.763392 accuracy:  0.75
validation cost:  1.59771 accuracy:  0.4992
Epoch 73, CIFAR-10 Batch 1:  trainging cost:  0.729511 accuracy:  0.85
validation cost:  1.58077 accuracy:  0.5102
Epoch 74, CIFAR-10 Batch 1:  trainging cost:  0.693553 accuracy:  0.8
validation cost:  1.58153 accuracy:  0.5066
Epoch 75, CIFAR-10 Batch 1:  trainging cost:  0.699177 accuracy:  0.85
validation cost:  1.58631 accuracy:  0.504
Epoch 76, CIFAR-10 Batch 1:  trainging cost:  0.697579 accuracy:  0.825
validation cost:  1.57627 accuracy:  0.5044
Epoch 77, CIFAR-10 Batch 1:  trainging cost:  0.704571 accuracy:  0.825
validation cost:  1.54793 accuracy:  0.5186
Epoch 78, CIFAR-10 Batch 1:  trainging cost:  0.721335 accuracy:  0.75
validation cost:  1.54057 accuracy:  0.5168
Epoch 79, CIFAR-10 Batch 1:  trainging cost:  0.733687 accuracy:  0.725
validation cost:  1.56555 accuracy:  0.5116
Epoch 80, CIFAR-10 Batch 1:  trainging cost:  0.660894 accuracy:  0.85
validation cost:  1.60025 accuracy:  0.5102
Epoch 81, CIFAR-10 Batch 1:  trainging cost:  0.678982 accuracy:  0.775
validation cost:  1.6336 accuracy:  0.511
Epoch 82, CIFAR-10 Batch 1:  trainging cost:  0.673537 accuracy:  0.775
validation cost:  1.68833 accuracy:  0.4974
Epoch 83, CIFAR-10 Batch 1:  trainging cost:  0.615333 accuracy:  0.875
validation cost:  1.6029 accuracy:  0.5104
Epoch 84, CIFAR-10 Batch 1:  trainging cost:  0.687795 accuracy:  0.75
validation cost:  1.67305 accuracy:  0.5028
Epoch 85, CIFAR-10 Batch 1:  trainging cost:  0.643063 accuracy:  0.85
validation cost:  1.61753 accuracy:  0.5074
Epoch 86, CIFAR-10 Batch 1:  trainging cost:  0.647623 accuracy:  0.85
validation cost:  1.61956 accuracy:  0.5072
Epoch 87, CIFAR-10 Batch 1:  trainging cost:  0.684475 accuracy:  0.825
validation cost:  1.63534 accuracy:  0.5164
Epoch 88, CIFAR-10 Batch 1:  trainging cost:  0.625418 accuracy:  0.825
validation cost:  1.67263 accuracy:  0.5158
Epoch 89, CIFAR-10 Batch 1:  trainging cost:  0.595158 accuracy:  0.85
validation cost:  1.67431 accuracy:  0.5076
Epoch 90, CIFAR-10 Batch 1:  trainging cost:  0.67471 accuracy:  0.75
validation cost:  1.74082 accuracy:  0.5036
Epoch 91, CIFAR-10 Batch 1:  trainging cost:  0.638926 accuracy:  0.775
validation cost:  1.73234 accuracy:  0.5128
Epoch 92, CIFAR-10 Batch 1:  trainging cost:  0.60095 accuracy:  0.8
validation cost:  1.6953 accuracy:  0.5138
Epoch 93, CIFAR-10 Batch 1:  trainging cost:  0.587371 accuracy:  0.875
validation cost:  1.69647 accuracy:  0.5066
Epoch 94, CIFAR-10 Batch 1:  trainging cost:  0.639469 accuracy:  0.825
validation cost:  1.80899 accuracy:  0.51
Epoch 95, CIFAR-10 Batch 1:  trainging cost:  0.571246 accuracy:  0.825
validation cost:  1.75647 accuracy:  0.5166
Epoch 96, CIFAR-10 Batch 1:  trainging cost:  0.620057 accuracy:  0.85
validation cost:  1.72298 accuracy:  0.5114
Epoch 97, CIFAR-10 Batch 1:  trainging cost:  0.582675 accuracy:  0.85
validation cost:  1.71275 accuracy:  0.5058
Epoch 98, CIFAR-10 Batch 1:  trainging cost:  0.559631 accuracy:  0.85
validation cost:  1.7708 accuracy:  0.5066
Epoch 99, CIFAR-10 Batch 1:  trainging cost:  0.592813 accuracy:  0.8
validation cost:  1.76457 accuracy:  0.5058
Epoch 100, CIFAR-10 Batch 1:  trainging cost:  0.565094 accuracy:  0.85
validation cost:  1.77141 accuracy:  0.512

Fully Train the Model

Now that you got a good accuracy with a single CIFAR-10 batch, try it with all five batches.


In [23]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
save_model_path = './image_classification'

print('Training...')
with tf.Session() as sess:
    # Initializing the variables
    sess.run(tf.global_variables_initializer())
    
    # Training cycle
    for epoch in range(epochs):
        # Loop over all batches
        n_batches = 5
        for batch_i in range(1, n_batches + 1):
            for batch_features, batch_labels in helper.load_preprocess_training_batch(batch_i, batch_size):
                train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)
            print('Epoch {:>2}, CIFAR-10 Batch {}:  '.format(epoch + 1, batch_i), end='')
            print_stats(sess, batch_features, batch_labels, cost, accuracy)
            
    # Save Model
    saver = tf.train.Saver()
    save_path = saver.save(sess, save_model_path)


Training...
Epoch  1, CIFAR-10 Batch 1:  trainging cost:  2.20062 accuracy:  0.15
validation cost:  2.09171 accuracy:  0.262
Epoch  1, CIFAR-10 Batch 2:  trainging cost:  1.91578 accuracy:  0.325
validation cost:  1.83999 accuracy:  0.345
Epoch  1, CIFAR-10 Batch 3:  trainging cost:  1.76105 accuracy:  0.375
validation cost:  1.78518 accuracy:  0.331
Epoch  1, CIFAR-10 Batch 4:  trainging cost:  1.78036 accuracy:  0.4
validation cost:  1.728 accuracy:  0.3846
Epoch  1, CIFAR-10 Batch 5:  trainging cost:  1.75093 accuracy:  0.35
validation cost:  1.76081 accuracy:  0.3732
Epoch  2, CIFAR-10 Batch 1:  trainging cost:  1.95584 accuracy:  0.3
validation cost:  1.67616 accuracy:  0.3938
Epoch  2, CIFAR-10 Batch 2:  trainging cost:  1.81444 accuracy:  0.3
validation cost:  1.67495 accuracy:  0.3996
Epoch  2, CIFAR-10 Batch 3:  trainging cost:  1.55641 accuracy:  0.45
validation cost:  1.63523 accuracy:  0.4032
Epoch  2, CIFAR-10 Batch 4:  trainging cost:  1.7886 accuracy:  0.3
validation cost:  1.66407 accuracy:  0.3984
Epoch  2, CIFAR-10 Batch 5:  trainging cost:  1.65143 accuracy:  0.525
validation cost:  1.63268 accuracy:  0.4132
Epoch  3, CIFAR-10 Batch 1:  trainging cost:  1.88886 accuracy:  0.35
validation cost:  1.59901 accuracy:  0.4306
Epoch  3, CIFAR-10 Batch 2:  trainging cost:  1.63162 accuracy:  0.525
validation cost:  1.63169 accuracy:  0.4232
Epoch  3, CIFAR-10 Batch 3:  trainging cost:  1.45511 accuracy:  0.475
validation cost:  1.56003 accuracy:  0.4354
Epoch  3, CIFAR-10 Batch 4:  trainging cost:  1.64286 accuracy:  0.35
validation cost:  1.55514 accuracy:  0.4428
Epoch  3, CIFAR-10 Batch 5:  trainging cost:  1.51757 accuracy:  0.525
validation cost:  1.54419 accuracy:  0.4568
Epoch  4, CIFAR-10 Batch 1:  trainging cost:  1.75524 accuracy:  0.45
validation cost:  1.52279 accuracy:  0.4586
Epoch  4, CIFAR-10 Batch 2:  trainging cost:  1.55745 accuracy:  0.475
validation cost:  1.51542 accuracy:  0.4694
Epoch  4, CIFAR-10 Batch 3:  trainging cost:  1.37575 accuracy:  0.525
validation cost:  1.49842 accuracy:  0.48
Epoch  4, CIFAR-10 Batch 4:  trainging cost:  1.51341 accuracy:  0.45
validation cost:  1.47415 accuracy:  0.4876
Epoch  4, CIFAR-10 Batch 5:  trainging cost:  1.49015 accuracy:  0.5
validation cost:  1.5045 accuracy:  0.4834
Epoch  5, CIFAR-10 Batch 1:  trainging cost:  1.70276 accuracy:  0.425
validation cost:  1.43932 accuracy:  0.494
Epoch  5, CIFAR-10 Batch 2:  trainging cost:  1.45754 accuracy:  0.55
validation cost:  1.5022 accuracy:  0.4738
Epoch  5, CIFAR-10 Batch 3:  trainging cost:  1.33473 accuracy:  0.625
validation cost:  1.4541 accuracy:  0.4908
Epoch  5, CIFAR-10 Batch 4:  trainging cost:  1.47915 accuracy:  0.45
validation cost:  1.44931 accuracy:  0.4938
Epoch  5, CIFAR-10 Batch 5:  trainging cost:  1.41989 accuracy:  0.6
validation cost:  1.45999 accuracy:  0.5004
Epoch  6, CIFAR-10 Batch 1:  trainging cost:  1.71819 accuracy:  0.45
validation cost:  1.42541 accuracy:  0.5074
Epoch  6, CIFAR-10 Batch 2:  trainging cost:  1.41527 accuracy:  0.525
validation cost:  1.42743 accuracy:  0.5056
Epoch  6, CIFAR-10 Batch 3:  trainging cost:  1.26378 accuracy:  0.675
validation cost:  1.40836 accuracy:  0.5152
Epoch  6, CIFAR-10 Batch 4:  trainging cost:  1.40183 accuracy:  0.5
validation cost:  1.44365 accuracy:  0.4902
Epoch  6, CIFAR-10 Batch 5:  trainging cost:  1.32533 accuracy:  0.625
validation cost:  1.39809 accuracy:  0.5242
Epoch  7, CIFAR-10 Batch 1:  trainging cost:  1.66094 accuracy:  0.45
validation cost:  1.35693 accuracy:  0.5322
Epoch  7, CIFAR-10 Batch 2:  trainging cost:  1.35376 accuracy:  0.5
validation cost:  1.43627 accuracy:  0.4996
Epoch  7, CIFAR-10 Batch 3:  trainging cost:  1.23874 accuracy:  0.625
validation cost:  1.39682 accuracy:  0.523
Epoch  7, CIFAR-10 Batch 4:  trainging cost:  1.40132 accuracy:  0.525
validation cost:  1.38237 accuracy:  0.5192
Epoch  7, CIFAR-10 Batch 5:  trainging cost:  1.29303 accuracy:  0.625
validation cost:  1.39051 accuracy:  0.5216
Epoch  8, CIFAR-10 Batch 1:  trainging cost:  1.49754 accuracy:  0.525
validation cost:  1.37229 accuracy:  0.527
Epoch  8, CIFAR-10 Batch 2:  trainging cost:  1.30017 accuracy:  0.6
validation cost:  1.41542 accuracy:  0.5198
Epoch  8, CIFAR-10 Batch 3:  trainging cost:  1.18343 accuracy:  0.55
validation cost:  1.34971 accuracy:  0.5348
Epoch  8, CIFAR-10 Batch 4:  trainging cost:  1.34021 accuracy:  0.475
validation cost:  1.32746 accuracy:  0.5386
Epoch  8, CIFAR-10 Batch 5:  trainging cost:  1.27602 accuracy:  0.55
validation cost:  1.36319 accuracy:  0.5366
Epoch  9, CIFAR-10 Batch 1:  trainging cost:  1.49057 accuracy:  0.575
validation cost:  1.35837 accuracy:  0.5186
Epoch  9, CIFAR-10 Batch 2:  trainging cost:  1.31108 accuracy:  0.575
validation cost:  1.37304 accuracy:  0.5194
Epoch  9, CIFAR-10 Batch 3:  trainging cost:  1.20756 accuracy:  0.625
validation cost:  1.32707 accuracy:  0.5422
Epoch  9, CIFAR-10 Batch 4:  trainging cost:  1.32737 accuracy:  0.475
validation cost:  1.32425 accuracy:  0.541
Epoch  9, CIFAR-10 Batch 5:  trainging cost:  1.23931 accuracy:  0.625
validation cost:  1.33871 accuracy:  0.5372
Epoch 10, CIFAR-10 Batch 1:  trainging cost:  1.4937 accuracy:  0.475
validation cost:  1.30398 accuracy:  0.5438
Epoch 10, CIFAR-10 Batch 2:  trainging cost:  1.22068 accuracy:  0.625
validation cost:  1.34145 accuracy:  0.5466
Epoch 10, CIFAR-10 Batch 3:  trainging cost:  1.11184 accuracy:  0.625
validation cost:  1.30235 accuracy:  0.5462
Epoch 10, CIFAR-10 Batch 4:  trainging cost:  1.3024 accuracy:  0.525
validation cost:  1.31085 accuracy:  0.538
Epoch 10, CIFAR-10 Batch 5:  trainging cost:  1.18534 accuracy:  0.675
validation cost:  1.31223 accuracy:  0.5368
Epoch 11, CIFAR-10 Batch 1:  trainging cost:  1.39748 accuracy:  0.475
validation cost:  1.28509 accuracy:  0.5532
Epoch 11, CIFAR-10 Batch 2:  trainging cost:  1.18325 accuracy:  0.6
validation cost:  1.31315 accuracy:  0.5504
Epoch 11, CIFAR-10 Batch 3:  trainging cost:  1.1386 accuracy:  0.6
validation cost:  1.28657 accuracy:  0.5406
Epoch 11, CIFAR-10 Batch 4:  trainging cost:  1.30694 accuracy:  0.6
validation cost:  1.28797 accuracy:  0.5404
Epoch 11, CIFAR-10 Batch 5:  trainging cost:  1.08362 accuracy:  0.7
validation cost:  1.26978 accuracy:  0.5564
Epoch 12, CIFAR-10 Batch 1:  trainging cost:  1.44198 accuracy:  0.55
validation cost:  1.25469 accuracy:  0.5584
Epoch 12, CIFAR-10 Batch 2:  trainging cost:  1.17908 accuracy:  0.6
validation cost:  1.29362 accuracy:  0.5484
Epoch 12, CIFAR-10 Batch 3:  trainging cost:  1.10155 accuracy:  0.625
validation cost:  1.26712 accuracy:  0.552
Epoch 12, CIFAR-10 Batch 4:  trainging cost:  1.26283 accuracy:  0.55
validation cost:  1.27067 accuracy:  0.5516
Epoch 12, CIFAR-10 Batch 5:  trainging cost:  1.08314 accuracy:  0.675
validation cost:  1.25006 accuracy:  0.5614
Epoch 13, CIFAR-10 Batch 1:  trainging cost:  1.33772 accuracy:  0.55
validation cost:  1.26061 accuracy:  0.56
Epoch 13, CIFAR-10 Batch 2:  trainging cost:  1.18409 accuracy:  0.675
validation cost:  1.29728 accuracy:  0.5522
Epoch 13, CIFAR-10 Batch 3:  trainging cost:  1.07419 accuracy:  0.675
validation cost:  1.25533 accuracy:  0.5534
Epoch 13, CIFAR-10 Batch 4:  trainging cost:  1.26266 accuracy:  0.525
validation cost:  1.25204 accuracy:  0.5568
Epoch 13, CIFAR-10 Batch 5:  trainging cost:  1.11974 accuracy:  0.7
validation cost:  1.26781 accuracy:  0.556
Epoch 14, CIFAR-10 Batch 1:  trainging cost:  1.36578 accuracy:  0.6
validation cost:  1.2413 accuracy:  0.561
Epoch 14, CIFAR-10 Batch 2:  trainging cost:  1.13818 accuracy:  0.625
validation cost:  1.28031 accuracy:  0.5662
Epoch 14, CIFAR-10 Batch 3:  trainging cost:  1.03506 accuracy:  0.625
validation cost:  1.24237 accuracy:  0.557
Epoch 14, CIFAR-10 Batch 4:  trainging cost:  1.28709 accuracy:  0.475
validation cost:  1.24686 accuracy:  0.5572
Epoch 14, CIFAR-10 Batch 5:  trainging cost:  1.04416 accuracy:  0.7
validation cost:  1.22797 accuracy:  0.5628
Epoch 15, CIFAR-10 Batch 1:  trainging cost:  1.30702 accuracy:  0.525
validation cost:  1.24514 accuracy:  0.5608
Epoch 15, CIFAR-10 Batch 2:  trainging cost:  1.15587 accuracy:  0.6
validation cost:  1.27953 accuracy:  0.5616
Epoch 15, CIFAR-10 Batch 3:  trainging cost:  1.06196 accuracy:  0.625
validation cost:  1.23857 accuracy:  0.5576
Epoch 15, CIFAR-10 Batch 4:  trainging cost:  1.26089 accuracy:  0.575
validation cost:  1.24074 accuracy:  0.5624
Epoch 15, CIFAR-10 Batch 5:  trainging cost:  1.06098 accuracy:  0.725
validation cost:  1.24782 accuracy:  0.5586
Epoch 16, CIFAR-10 Batch 1:  trainging cost:  1.27237 accuracy:  0.55
validation cost:  1.22025 accuracy:  0.5648
Epoch 16, CIFAR-10 Batch 2:  trainging cost:  1.15816 accuracy:  0.675
validation cost:  1.26943 accuracy:  0.5642
Epoch 16, CIFAR-10 Batch 3:  trainging cost:  1.06476 accuracy:  0.6
validation cost:  1.23368 accuracy:  0.5646
Epoch 16, CIFAR-10 Batch 4:  trainging cost:  1.21987 accuracy:  0.525
validation cost:  1.22442 accuracy:  0.5742
Epoch 16, CIFAR-10 Batch 5:  trainging cost:  1.01805 accuracy:  0.725
validation cost:  1.19247 accuracy:  0.5716
Epoch 17, CIFAR-10 Batch 1:  trainging cost:  1.26265 accuracy:  0.525
validation cost:  1.20933 accuracy:  0.5774
Epoch 17, CIFAR-10 Batch 2:  trainging cost:  1.0582 accuracy:  0.65
validation cost:  1.24548 accuracy:  0.5628
Epoch 17, CIFAR-10 Batch 3:  trainging cost:  1.06932 accuracy:  0.7
validation cost:  1.23531 accuracy:  0.5604
Epoch 17, CIFAR-10 Batch 4:  trainging cost:  1.19635 accuracy:  0.575
validation cost:  1.23426 accuracy:  0.5632
Epoch 17, CIFAR-10 Batch 5:  trainging cost:  1.01097 accuracy:  0.7
validation cost:  1.21511 accuracy:  0.565
Epoch 18, CIFAR-10 Batch 1:  trainging cost:  1.27824 accuracy:  0.5
validation cost:  1.20808 accuracy:  0.576
Epoch 18, CIFAR-10 Batch 2:  trainging cost:  1.09504 accuracy:  0.65
validation cost:  1.27371 accuracy:  0.5648
Epoch 18, CIFAR-10 Batch 3:  trainging cost:  1.0041 accuracy:  0.65
validation cost:  1.21038 accuracy:  0.5738
Epoch 18, CIFAR-10 Batch 4:  trainging cost:  1.23513 accuracy:  0.55
validation cost:  1.23499 accuracy:  0.558
Epoch 18, CIFAR-10 Batch 5:  trainging cost:  1.08347 accuracy:  0.7
validation cost:  1.21961 accuracy:  0.5638
Epoch 19, CIFAR-10 Batch 1:  trainging cost:  1.25524 accuracy:  0.575
validation cost:  1.19356 accuracy:  0.5784
Epoch 19, CIFAR-10 Batch 2:  trainging cost:  1.07335 accuracy:  0.675
validation cost:  1.25786 accuracy:  0.552
Epoch 19, CIFAR-10 Batch 3:  trainging cost:  1.06283 accuracy:  0.65
validation cost:  1.20737 accuracy:  0.5728
Epoch 19, CIFAR-10 Batch 4:  trainging cost:  1.22066 accuracy:  0.6
validation cost:  1.21539 accuracy:  0.5666
Epoch 19, CIFAR-10 Batch 5:  trainging cost:  1.05042 accuracy:  0.65
validation cost:  1.22369 accuracy:  0.5648
Epoch 20, CIFAR-10 Batch 1:  trainging cost:  1.20688 accuracy:  0.55
validation cost:  1.1748 accuracy:  0.585
Epoch 20, CIFAR-10 Batch 2:  trainging cost:  1.05561 accuracy:  0.675
validation cost:  1.22124 accuracy:  0.5656
Epoch 20, CIFAR-10 Batch 3:  trainging cost:  0.998243 accuracy:  0.625
validation cost:  1.18408 accuracy:  0.5792
Epoch 20, CIFAR-10 Batch 4:  trainging cost:  1.23862 accuracy:  0.6
validation cost:  1.20254 accuracy:  0.5728
Epoch 20, CIFAR-10 Batch 5:  trainging cost:  1.0524 accuracy:  0.75
validation cost:  1.20062 accuracy:  0.5752
Epoch 21, CIFAR-10 Batch 1:  trainging cost:  1.19502 accuracy:  0.575
validation cost:  1.19397 accuracy:  0.575
Epoch 21, CIFAR-10 Batch 2:  trainging cost:  1.04355 accuracy:  0.725
validation cost:  1.22985 accuracy:  0.5734
Epoch 21, CIFAR-10 Batch 3:  trainging cost:  1.07286 accuracy:  0.675
validation cost:  1.22047 accuracy:  0.5782
Epoch 21, CIFAR-10 Batch 4:  trainging cost:  1.19573 accuracy:  0.525
validation cost:  1.1981 accuracy:  0.5716
Epoch 21, CIFAR-10 Batch 5:  trainging cost:  1.0484 accuracy:  0.7
validation cost:  1.20062 accuracy:  0.5702
Epoch 22, CIFAR-10 Batch 1:  trainging cost:  1.24227 accuracy:  0.525
validation cost:  1.19521 accuracy:  0.5766
Epoch 22, CIFAR-10 Batch 2:  trainging cost:  1.03958 accuracy:  0.625
validation cost:  1.23015 accuracy:  0.5762
Epoch 22, CIFAR-10 Batch 3:  trainging cost:  1.05536 accuracy:  0.7
validation cost:  1.20739 accuracy:  0.5772
Epoch 22, CIFAR-10 Batch 4:  trainging cost:  1.14827 accuracy:  0.55
validation cost:  1.17314 accuracy:  0.5814
Epoch 22, CIFAR-10 Batch 5:  trainging cost:  1.13123 accuracy:  0.625
validation cost:  1.21261 accuracy:  0.5754
Epoch 23, CIFAR-10 Batch 1:  trainging cost:  1.15843 accuracy:  0.525
validation cost:  1.19594 accuracy:  0.582
Epoch 23, CIFAR-10 Batch 2:  trainging cost:  1.03639 accuracy:  0.65
validation cost:  1.24354 accuracy:  0.5676
Epoch 23, CIFAR-10 Batch 3:  trainging cost:  0.992285 accuracy:  0.7
validation cost:  1.20362 accuracy:  0.5762
Epoch 23, CIFAR-10 Batch 4:  trainging cost:  1.17252 accuracy:  0.5
validation cost:  1.2046 accuracy:  0.5778
Epoch 23, CIFAR-10 Batch 5:  trainging cost:  1.03268 accuracy:  0.65
validation cost:  1.18037 accuracy:  0.5874
Epoch 24, CIFAR-10 Batch 1:  trainging cost:  1.2219 accuracy:  0.55
validation cost:  1.204 accuracy:  0.576
Epoch 24, CIFAR-10 Batch 2:  trainging cost:  0.998918 accuracy:  0.7
validation cost:  1.18658 accuracy:  0.5852
Epoch 24, CIFAR-10 Batch 3:  trainging cost:  0.991029 accuracy:  0.775
validation cost:  1.18086 accuracy:  0.585
Epoch 24, CIFAR-10 Batch 4:  trainging cost:  1.18719 accuracy:  0.6
validation cost:  1.19593 accuracy:  0.5782
Epoch 24, CIFAR-10 Batch 5:  trainging cost:  0.999917 accuracy:  0.675
validation cost:  1.19124 accuracy:  0.578
Epoch 25, CIFAR-10 Batch 1:  trainging cost:  1.27982 accuracy:  0.525
validation cost:  1.22129 accuracy:  0.5766
Epoch 25, CIFAR-10 Batch 2:  trainging cost:  0.960748 accuracy:  0.725
validation cost:  1.19741 accuracy:  0.5772
Epoch 25, CIFAR-10 Batch 3:  trainging cost:  0.973065 accuracy:  0.725
validation cost:  1.18049 accuracy:  0.5838
Epoch 25, CIFAR-10 Batch 4:  trainging cost:  1.1374 accuracy:  0.625
validation cost:  1.18297 accuracy:  0.5844
Epoch 25, CIFAR-10 Batch 5:  trainging cost:  1.01002 accuracy:  0.675
validation cost:  1.17756 accuracy:  0.5816
Epoch 26, CIFAR-10 Batch 1:  trainging cost:  1.22175 accuracy:  0.525
validation cost:  1.17798 accuracy:  0.587
Epoch 26, CIFAR-10 Batch 2:  trainging cost:  1.06533 accuracy:  0.7
validation cost:  1.21864 accuracy:  0.5808
Epoch 26, CIFAR-10 Batch 3:  trainging cost:  0.993286 accuracy:  0.675
validation cost:  1.19118 accuracy:  0.5754
Epoch 26, CIFAR-10 Batch 4:  trainging cost:  1.11828 accuracy:  0.6
validation cost:  1.17846 accuracy:  0.5834
Epoch 26, CIFAR-10 Batch 5:  trainging cost:  1.00784 accuracy:  0.65
validation cost:  1.18763 accuracy:  0.5788
Epoch 27, CIFAR-10 Batch 1:  trainging cost:  1.14477 accuracy:  0.6
validation cost:  1.17212 accuracy:  0.585
Epoch 27, CIFAR-10 Batch 2:  trainging cost:  0.997079 accuracy:  0.725
validation cost:  1.24262 accuracy:  0.5656
Epoch 27, CIFAR-10 Batch 3:  trainging cost:  0.952808 accuracy:  0.725
validation cost:  1.16211 accuracy:  0.5814
Epoch 27, CIFAR-10 Batch 4:  trainging cost:  1.13986 accuracy:  0.625
validation cost:  1.17533 accuracy:  0.5834
Epoch 27, CIFAR-10 Batch 5:  trainging cost:  0.988662 accuracy:  0.625
validation cost:  1.17496 accuracy:  0.5814
Epoch 28, CIFAR-10 Batch 1:  trainging cost:  1.13146 accuracy:  0.575
validation cost:  1.20099 accuracy:  0.5818
Epoch 28, CIFAR-10 Batch 2:  trainging cost:  0.98605 accuracy:  0.675
validation cost:  1.18408 accuracy:  0.5902
Epoch 28, CIFAR-10 Batch 3:  trainging cost:  0.94841 accuracy:  0.675
validation cost:  1.16948 accuracy:  0.5824
Epoch 28, CIFAR-10 Batch 4:  trainging cost:  1.0967 accuracy:  0.575
validation cost:  1.19072 accuracy:  0.5822
Epoch 28, CIFAR-10 Batch 5:  trainging cost:  0.949832 accuracy:  0.65
validation cost:  1.16115 accuracy:  0.5972
Epoch 29, CIFAR-10 Batch 1:  trainging cost:  1.16259 accuracy:  0.525
validation cost:  1.19187 accuracy:  0.587
Epoch 29, CIFAR-10 Batch 2:  trainging cost:  0.949085 accuracy:  0.725
validation cost:  1.2083 accuracy:  0.5816
Epoch 29, CIFAR-10 Batch 3:  trainging cost:  0.934797 accuracy:  0.725
validation cost:  1.19049 accuracy:  0.5834
Epoch 29, CIFAR-10 Batch 4:  trainging cost:  1.07196 accuracy:  0.55
validation cost:  1.16147 accuracy:  0.582
Epoch 29, CIFAR-10 Batch 5:  trainging cost:  0.988906 accuracy:  0.65
validation cost:  1.18642 accuracy:  0.5926
Epoch 30, CIFAR-10 Batch 1:  trainging cost:  1.1016 accuracy:  0.525
validation cost:  1.1712 accuracy:  0.5886
Epoch 30, CIFAR-10 Batch 2:  trainging cost:  0.944719 accuracy:  0.65
validation cost:  1.1954 accuracy:  0.5838
Epoch 30, CIFAR-10 Batch 3:  trainging cost:  0.971996 accuracy:  0.7
validation cost:  1.20514 accuracy:  0.5734
Epoch 30, CIFAR-10 Batch 4:  trainging cost:  1.10662 accuracy:  0.6
validation cost:  1.16062 accuracy:  0.5896
Epoch 30, CIFAR-10 Batch 5:  trainging cost:  0.944233 accuracy:  0.675
validation cost:  1.17867 accuracy:  0.5842
Epoch 31, CIFAR-10 Batch 1:  trainging cost:  1.08381 accuracy:  0.6
validation cost:  1.14962 accuracy:  0.6014
Epoch 31, CIFAR-10 Batch 2:  trainging cost:  0.881839 accuracy:  0.7
validation cost:  1.16899 accuracy:  0.5892
Epoch 31, CIFAR-10 Batch 3:  trainging cost:  0.939308 accuracy:  0.675
validation cost:  1.16683 accuracy:  0.5882
Epoch 31, CIFAR-10 Batch 4:  trainging cost:  1.06917 accuracy:  0.625
validation cost:  1.16243 accuracy:  0.5902
Epoch 31, CIFAR-10 Batch 5:  trainging cost:  0.969348 accuracy:  0.675
validation cost:  1.17514 accuracy:  0.5856
Epoch 32, CIFAR-10 Batch 1:  trainging cost:  1.06338 accuracy:  0.6
validation cost:  1.16478 accuracy:  0.5928
Epoch 32, CIFAR-10 Batch 2:  trainging cost:  0.92578 accuracy:  0.75
validation cost:  1.19627 accuracy:  0.5848
Epoch 32, CIFAR-10 Batch 3:  trainging cost:  0.943527 accuracy:  0.675
validation cost:  1.19496 accuracy:  0.5766
Epoch 32, CIFAR-10 Batch 4:  trainging cost:  1.06974 accuracy:  0.625
validation cost:  1.16571 accuracy:  0.5872
Epoch 32, CIFAR-10 Batch 5:  trainging cost:  0.968704 accuracy:  0.725
validation cost:  1.1633 accuracy:  0.5894
Epoch 33, CIFAR-10 Batch 1:  trainging cost:  1.09008 accuracy:  0.55
validation cost:  1.16404 accuracy:  0.5962
Epoch 33, CIFAR-10 Batch 2:  trainging cost:  0.877634 accuracy:  0.8
validation cost:  1.18429 accuracy:  0.5816
Epoch 33, CIFAR-10 Batch 3:  trainging cost:  0.940408 accuracy:  0.65
validation cost:  1.15673 accuracy:  0.5964
Epoch 33, CIFAR-10 Batch 4:  trainging cost:  1.01924 accuracy:  0.65
validation cost:  1.1683 accuracy:  0.5932
Epoch 33, CIFAR-10 Batch 5:  trainging cost:  0.940549 accuracy:  0.625
validation cost:  1.15757 accuracy:  0.595
Epoch 34, CIFAR-10 Batch 1:  trainging cost:  1.05036 accuracy:  0.6
validation cost:  1.15354 accuracy:  0.5982
Epoch 34, CIFAR-10 Batch 2:  trainging cost:  0.857226 accuracy:  0.775
validation cost:  1.18205 accuracy:  0.5934
Epoch 34, CIFAR-10 Batch 3:  trainging cost:  0.930548 accuracy:  0.65
validation cost:  1.16827 accuracy:  0.5896
Epoch 34, CIFAR-10 Batch 4:  trainging cost:  0.989771 accuracy:  0.65
validation cost:  1.15902 accuracy:  0.5944
Epoch 34, CIFAR-10 Batch 5:  trainging cost:  0.974317 accuracy:  0.675
validation cost:  1.1819 accuracy:  0.5804
Epoch 35, CIFAR-10 Batch 1:  trainging cost:  1.01652 accuracy:  0.55
validation cost:  1.17101 accuracy:  0.593
Epoch 35, CIFAR-10 Batch 2:  trainging cost:  0.91488 accuracy:  0.75
validation cost:  1.19243 accuracy:  0.5888
Epoch 35, CIFAR-10 Batch 3:  trainging cost:  0.914844 accuracy:  0.7
validation cost:  1.17761 accuracy:  0.5846
Epoch 35, CIFAR-10 Batch 4:  trainging cost:  1.0419 accuracy:  0.6
validation cost:  1.17139 accuracy:  0.5904
Epoch 35, CIFAR-10 Batch 5:  trainging cost:  0.969667 accuracy:  0.675
validation cost:  1.16021 accuracy:  0.6014
Epoch 36, CIFAR-10 Batch 1:  trainging cost:  1.02699 accuracy:  0.625
validation cost:  1.18038 accuracy:  0.5914
Epoch 36, CIFAR-10 Batch 2:  trainging cost:  0.93506 accuracy:  0.75
validation cost:  1.21204 accuracy:  0.5822
Epoch 36, CIFAR-10 Batch 3:  trainging cost:  0.908628 accuracy:  0.725
validation cost:  1.16657 accuracy:  0.5882
Epoch 36, CIFAR-10 Batch 4:  trainging cost:  1.03223 accuracy:  0.65
validation cost:  1.17648 accuracy:  0.583
Epoch 36, CIFAR-10 Batch 5:  trainging cost:  0.96962 accuracy:  0.7
validation cost:  1.1782 accuracy:  0.589
Epoch 37, CIFAR-10 Batch 1:  trainging cost:  0.953643 accuracy:  0.7
validation cost:  1.1633 accuracy:  0.5926
Epoch 37, CIFAR-10 Batch 2:  trainging cost:  0.88824 accuracy:  0.775
validation cost:  1.19035 accuracy:  0.588
Epoch 37, CIFAR-10 Batch 3:  trainging cost:  0.89914 accuracy:  0.7
validation cost:  1.17207 accuracy:  0.5946
Epoch 37, CIFAR-10 Batch 4:  trainging cost:  1.04582 accuracy:  0.575
validation cost:  1.17069 accuracy:  0.5906
Epoch 37, CIFAR-10 Batch 5:  trainging cost:  0.937667 accuracy:  0.675
validation cost:  1.15954 accuracy:  0.5924
Epoch 38, CIFAR-10 Batch 1:  trainging cost:  1.0022 accuracy:  0.6
validation cost:  1.16802 accuracy:  0.592
Epoch 38, CIFAR-10 Batch 2:  trainging cost:  1.03002 accuracy:  0.65
validation cost:  1.25767 accuracy:  0.564
Epoch 38, CIFAR-10 Batch 3:  trainging cost:  0.896626 accuracy:  0.75
validation cost:  1.16342 accuracy:  0.5906
Epoch 38, CIFAR-10 Batch 4:  trainging cost:  1.05158 accuracy:  0.625
validation cost:  1.17586 accuracy:  0.5892
Epoch 38, CIFAR-10 Batch 5:  trainging cost:  0.906778 accuracy:  0.7
validation cost:  1.16569 accuracy:  0.5922
Epoch 39, CIFAR-10 Batch 1:  trainging cost:  0.997067 accuracy:  0.625
validation cost:  1.16602 accuracy:  0.5898
Epoch 39, CIFAR-10 Batch 2:  trainging cost:  0.868464 accuracy:  0.725
validation cost:  1.20094 accuracy:  0.581
Epoch 39, CIFAR-10 Batch 3:  trainging cost:  0.919016 accuracy:  0.775
validation cost:  1.16578 accuracy:  0.5908
Epoch 39, CIFAR-10 Batch 4:  trainging cost:  1.03007 accuracy:  0.65
validation cost:  1.17669 accuracy:  0.5806
Epoch 39, CIFAR-10 Batch 5:  trainging cost:  0.940187 accuracy:  0.625
validation cost:  1.16812 accuracy:  0.5942
Epoch 40, CIFAR-10 Batch 1:  trainging cost:  0.983735 accuracy:  0.6
validation cost:  1.17808 accuracy:  0.5902
Epoch 40, CIFAR-10 Batch 2:  trainging cost:  0.932041 accuracy:  0.7
validation cost:  1.21819 accuracy:  0.5736
Epoch 40, CIFAR-10 Batch 3:  trainging cost:  0.944271 accuracy:  0.725
validation cost:  1.16435 accuracy:  0.5932
Epoch 40, CIFAR-10 Batch 4:  trainging cost:  1.01566 accuracy:  0.625
validation cost:  1.16455 accuracy:  0.5922
Epoch 40, CIFAR-10 Batch 5:  trainging cost:  0.922548 accuracy:  0.675
validation cost:  1.17161 accuracy:  0.5926
Epoch 41, CIFAR-10 Batch 1:  trainging cost:  0.989363 accuracy:  0.675
validation cost:  1.18396 accuracy:  0.5834
Epoch 41, CIFAR-10 Batch 2:  trainging cost:  0.859862 accuracy:  0.75
validation cost:  1.19617 accuracy:  0.5824
Epoch 41, CIFAR-10 Batch 3:  trainging cost:  0.933264 accuracy:  0.65
validation cost:  1.18633 accuracy:  0.5788
Epoch 41, CIFAR-10 Batch 4:  trainging cost:  0.962401 accuracy:  0.625
validation cost:  1.15158 accuracy:  0.598
Epoch 41, CIFAR-10 Batch 5:  trainging cost:  0.909635 accuracy:  0.675
validation cost:  1.16698 accuracy:  0.5936
Epoch 42, CIFAR-10 Batch 1:  trainging cost:  0.992727 accuracy:  0.625
validation cost:  1.16839 accuracy:  0.5942
Epoch 42, CIFAR-10 Batch 2:  trainging cost:  0.801123 accuracy:  0.725
validation cost:  1.15831 accuracy:  0.597
Epoch 42, CIFAR-10 Batch 3:  trainging cost:  0.90498 accuracy:  0.725
validation cost:  1.16995 accuracy:  0.59
Epoch 42, CIFAR-10 Batch 4:  trainging cost:  0.975615 accuracy:  0.65
validation cost:  1.16394 accuracy:  0.5852
Epoch 42, CIFAR-10 Batch 5:  trainging cost:  0.964404 accuracy:  0.625
validation cost:  1.17563 accuracy:  0.5856
Epoch 43, CIFAR-10 Batch 1:  trainging cost:  0.913282 accuracy:  0.7
validation cost:  1.16698 accuracy:  0.5906
Epoch 43, CIFAR-10 Batch 2:  trainging cost:  0.82455 accuracy:  0.75
validation cost:  1.17446 accuracy:  0.5888
Epoch 43, CIFAR-10 Batch 3:  trainging cost:  0.87968 accuracy:  0.725
validation cost:  1.16 accuracy:  0.5884
Epoch 43, CIFAR-10 Batch 4:  trainging cost:  1.04839 accuracy:  0.625
validation cost:  1.17173 accuracy:  0.591
Epoch 43, CIFAR-10 Batch 5:  trainging cost:  1.00437 accuracy:  0.625
validation cost:  1.17402 accuracy:  0.5966
Epoch 44, CIFAR-10 Batch 1:  trainging cost:  0.997857 accuracy:  0.6
validation cost:  1.19028 accuracy:  0.5908
Epoch 44, CIFAR-10 Batch 2:  trainging cost:  0.865632 accuracy:  0.725
validation cost:  1.19455 accuracy:  0.5798
Epoch 44, CIFAR-10 Batch 3:  trainging cost:  0.899796 accuracy:  0.7
validation cost:  1.18267 accuracy:  0.5768
Epoch 44, CIFAR-10 Batch 4:  trainging cost:  0.979488 accuracy:  0.625
validation cost:  1.16245 accuracy:  0.5892
Epoch 44, CIFAR-10 Batch 5:  trainging cost:  0.883988 accuracy:  0.65
validation cost:  1.13851 accuracy:  0.6054
Epoch 45, CIFAR-10 Batch 1:  trainging cost:  0.991175 accuracy:  0.625
validation cost:  1.17988 accuracy:  0.5844
Epoch 45, CIFAR-10 Batch 2:  trainging cost:  0.82016 accuracy:  0.675
validation cost:  1.18986 accuracy:  0.5796
Epoch 45, CIFAR-10 Batch 3:  trainging cost:  0.964467 accuracy:  0.65
validation cost:  1.20752 accuracy:  0.573
Epoch 45, CIFAR-10 Batch 4:  trainging cost:  0.953729 accuracy:  0.65
validation cost:  1.16014 accuracy:  0.5924
Epoch 45, CIFAR-10 Batch 5:  trainging cost:  0.918953 accuracy:  0.65
validation cost:  1.16866 accuracy:  0.5952
Epoch 46, CIFAR-10 Batch 1:  trainging cost:  0.941284 accuracy:  0.65
validation cost:  1.15672 accuracy:  0.5978
Epoch 46, CIFAR-10 Batch 2:  trainging cost:  0.815853 accuracy:  0.75
validation cost:  1.18005 accuracy:  0.5884
Epoch 46, CIFAR-10 Batch 3:  trainging cost:  0.890045 accuracy:  0.725
validation cost:  1.18584 accuracy:  0.5788
Epoch 46, CIFAR-10 Batch 4:  trainging cost:  0.953288 accuracy:  0.65
validation cost:  1.1822 accuracy:  0.5938
Epoch 46, CIFAR-10 Batch 5:  trainging cost:  0.898765 accuracy:  0.7
validation cost:  1.16968 accuracy:  0.5912
Epoch 47, CIFAR-10 Batch 1:  trainging cost:  0.97302 accuracy:  0.625
validation cost:  1.15943 accuracy:  0.5978
Epoch 47, CIFAR-10 Batch 2:  trainging cost:  0.854615 accuracy:  0.725
validation cost:  1.20282 accuracy:  0.574
Epoch 47, CIFAR-10 Batch 3:  trainging cost:  0.915522 accuracy:  0.725
validation cost:  1.18003 accuracy:  0.593
Epoch 47, CIFAR-10 Batch 4:  trainging cost:  0.968262 accuracy:  0.675
validation cost:  1.15897 accuracy:  0.5942
Epoch 47, CIFAR-10 Batch 5:  trainging cost:  0.902701 accuracy:  0.675
validation cost:  1.16743 accuracy:  0.5926
Epoch 48, CIFAR-10 Batch 1:  trainging cost:  0.951127 accuracy:  0.6
validation cost:  1.19485 accuracy:  0.588
Epoch 48, CIFAR-10 Batch 2:  trainging cost:  0.818092 accuracy:  0.7
validation cost:  1.17058 accuracy:  0.5896
Epoch 48, CIFAR-10 Batch 3:  trainging cost:  0.958786 accuracy:  0.675
validation cost:  1.20273 accuracy:  0.5748
Epoch 48, CIFAR-10 Batch 4:  trainging cost:  0.996656 accuracy:  0.6
validation cost:  1.17914 accuracy:  0.5918
Epoch 48, CIFAR-10 Batch 5:  trainging cost:  0.887386 accuracy:  0.7
validation cost:  1.16174 accuracy:  0.6004
Epoch 49, CIFAR-10 Batch 1:  trainging cost:  0.93274 accuracy:  0.7
validation cost:  1.15513 accuracy:  0.5994
Epoch 49, CIFAR-10 Batch 2:  trainging cost:  0.838415 accuracy:  0.675
validation cost:  1.18269 accuracy:  0.5806
Epoch 49, CIFAR-10 Batch 3:  trainging cost:  0.853926 accuracy:  0.775
validation cost:  1.18465 accuracy:  0.5848
Epoch 49, CIFAR-10 Batch 4:  trainging cost:  0.993959 accuracy:  0.6
validation cost:  1.16819 accuracy:  0.5952
Epoch 49, CIFAR-10 Batch 5:  trainging cost:  0.939237 accuracy:  0.65
validation cost:  1.15352 accuracy:  0.599
Epoch 50, CIFAR-10 Batch 1:  trainging cost:  0.947734 accuracy:  0.65
validation cost:  1.15885 accuracy:  0.6038
Epoch 50, CIFAR-10 Batch 2:  trainging cost:  0.837372 accuracy:  0.7
validation cost:  1.21079 accuracy:  0.5804
Epoch 50, CIFAR-10 Batch 3:  trainging cost:  0.872769 accuracy:  0.7
validation cost:  1.16725 accuracy:  0.5904
Epoch 50, CIFAR-10 Batch 4:  trainging cost:  0.93263 accuracy:  0.6
validation cost:  1.16363 accuracy:  0.5928
Epoch 50, CIFAR-10 Batch 5:  trainging cost:  0.915095 accuracy:  0.575
validation cost:  1.15496 accuracy:  0.5956
Epoch 51, CIFAR-10 Batch 1:  trainging cost:  0.914136 accuracy:  0.65
validation cost:  1.15747 accuracy:  0.5994
Epoch 51, CIFAR-10 Batch 2:  trainging cost:  0.789071 accuracy:  0.75
validation cost:  1.15297 accuracy:  0.5996
Epoch 51, CIFAR-10 Batch 3:  trainging cost:  0.904468 accuracy:  0.7
validation cost:  1.19306 accuracy:  0.5866
Epoch 51, CIFAR-10 Batch 4:  trainging cost:  0.989159 accuracy:  0.6
validation cost:  1.18468 accuracy:  0.5864
Epoch 51, CIFAR-10 Batch 5:  trainging cost:  0.950112 accuracy:  0.65
validation cost:  1.15532 accuracy:  0.5958
Epoch 52, CIFAR-10 Batch 1:  trainging cost:  0.938521 accuracy:  0.65
validation cost:  1.18597 accuracy:  0.5918
Epoch 52, CIFAR-10 Batch 2:  trainging cost:  0.759396 accuracy:  0.7
validation cost:  1.16085 accuracy:  0.5964
Epoch 52, CIFAR-10 Batch 3:  trainging cost:  0.857894 accuracy:  0.75
validation cost:  1.16467 accuracy:  0.5948
Epoch 52, CIFAR-10 Batch 4:  trainging cost:  0.948862 accuracy:  0.65
validation cost:  1.17476 accuracy:  0.5888
Epoch 52, CIFAR-10 Batch 5:  trainging cost:  0.922489 accuracy:  0.675
validation cost:  1.1597 accuracy:  0.597
Epoch 53, CIFAR-10 Batch 1:  trainging cost:  0.992347 accuracy:  0.6
validation cost:  1.18185 accuracy:  0.5984
Epoch 53, CIFAR-10 Batch 2:  trainging cost:  0.767821 accuracy:  0.775
validation cost:  1.15743 accuracy:  0.599
Epoch 53, CIFAR-10 Batch 3:  trainging cost:  0.88192 accuracy:  0.675
validation cost:  1.15587 accuracy:  0.5968
Epoch 53, CIFAR-10 Batch 4:  trainging cost:  0.922779 accuracy:  0.625
validation cost:  1.15765 accuracy:  0.5942
Epoch 53, CIFAR-10 Batch 5:  trainging cost:  0.880115 accuracy:  0.675
validation cost:  1.16335 accuracy:  0.5996
Epoch 54, CIFAR-10 Batch 1:  trainging cost:  0.914401 accuracy:  0.65
validation cost:  1.16483 accuracy:  0.5974
Epoch 54, CIFAR-10 Batch 2:  trainging cost:  0.815857 accuracy:  0.725
validation cost:  1.20949 accuracy:  0.5836
Epoch 54, CIFAR-10 Batch 3:  trainging cost:  0.875948 accuracy:  0.75
validation cost:  1.16734 accuracy:  0.5942
Epoch 54, CIFAR-10 Batch 4:  trainging cost:  0.956482 accuracy:  0.575
validation cost:  1.17916 accuracy:  0.5936
Epoch 54, CIFAR-10 Batch 5:  trainging cost:  0.891232 accuracy:  0.65
validation cost:  1.16373 accuracy:  0.5988
Epoch 55, CIFAR-10 Batch 1:  trainging cost:  0.915249 accuracy:  0.675
validation cost:  1.18934 accuracy:  0.5896
Epoch 55, CIFAR-10 Batch 2:  trainging cost:  0.763112 accuracy:  0.725
validation cost:  1.18153 accuracy:  0.5986
Epoch 55, CIFAR-10 Batch 3:  trainging cost:  0.867236 accuracy:  0.725
validation cost:  1.16499 accuracy:  0.597
Epoch 55, CIFAR-10 Batch 4:  trainging cost:  0.922387 accuracy:  0.675
validation cost:  1.17898 accuracy:  0.5922
Epoch 55, CIFAR-10 Batch 5:  trainging cost:  0.885466 accuracy:  0.65
validation cost:  1.15669 accuracy:  0.6016
Epoch 56, CIFAR-10 Batch 1:  trainging cost:  0.93594 accuracy:  0.6
validation cost:  1.20628 accuracy:  0.5856
Epoch 56, CIFAR-10 Batch 2:  trainging cost:  0.783416 accuracy:  0.75
validation cost:  1.16569 accuracy:  0.5984
Epoch 56, CIFAR-10 Batch 3:  trainging cost:  0.827221 accuracy:  0.725
validation cost:  1.14907 accuracy:  0.6084
Epoch 56, CIFAR-10 Batch 4:  trainging cost:  0.976832 accuracy:  0.65
validation cost:  1.17712 accuracy:  0.592
Epoch 56, CIFAR-10 Batch 5:  trainging cost:  0.910854 accuracy:  0.675
validation cost:  1.1617 accuracy:  0.598
Epoch 57, CIFAR-10 Batch 1:  trainging cost:  0.825796 accuracy:  0.775
validation cost:  1.16572 accuracy:  0.6024
Epoch 57, CIFAR-10 Batch 2:  trainging cost:  0.790074 accuracy:  0.7
validation cost:  1.18483 accuracy:  0.5884
Epoch 57, CIFAR-10 Batch 3:  trainging cost:  0.83043 accuracy:  0.775
validation cost:  1.15259 accuracy:  0.5996
Epoch 57, CIFAR-10 Batch 4:  trainging cost:  0.9701 accuracy:  0.625
validation cost:  1.19331 accuracy:  0.59
Epoch 57, CIFAR-10 Batch 5:  trainging cost:  0.883221 accuracy:  0.675
validation cost:  1.16555 accuracy:  0.6018
Epoch 58, CIFAR-10 Batch 1:  trainging cost:  0.892055 accuracy:  0.7
validation cost:  1.21112 accuracy:  0.582
Epoch 58, CIFAR-10 Batch 2:  trainging cost:  0.802828 accuracy:  0.825
validation cost:  1.20417 accuracy:  0.583
Epoch 58, CIFAR-10 Batch 3:  trainging cost:  0.90564 accuracy:  0.65
validation cost:  1.16879 accuracy:  0.5904
Epoch 58, CIFAR-10 Batch 4:  trainging cost:  0.911768 accuracy:  0.65
validation cost:  1.1673 accuracy:  0.5994
Epoch 58, CIFAR-10 Batch 5:  trainging cost:  0.85368 accuracy:  0.675
validation cost:  1.15654 accuracy:  0.5998
Epoch 59, CIFAR-10 Batch 1:  trainging cost:  0.905021 accuracy:  0.65
validation cost:  1.18865 accuracy:  0.5994
Epoch 59, CIFAR-10 Batch 2:  trainging cost:  0.820244 accuracy:  0.65
validation cost:  1.1912 accuracy:  0.5768
Epoch 59, CIFAR-10 Batch 3:  trainging cost:  0.839164 accuracy:  0.775
validation cost:  1.17826 accuracy:  0.5968
Epoch 59, CIFAR-10 Batch 4:  trainging cost:  0.907262 accuracy:  0.625
validation cost:  1.16397 accuracy:  0.5916
Epoch 59, CIFAR-10 Batch 5:  trainging cost:  0.887393 accuracy:  0.7
validation cost:  1.18061 accuracy:  0.5942
Epoch 60, CIFAR-10 Batch 1:  trainging cost:  0.83507 accuracy:  0.75
validation cost:  1.18351 accuracy:  0.5954
Epoch 60, CIFAR-10 Batch 2:  trainging cost:  0.726681 accuracy:  0.7
validation cost:  1.15745 accuracy:  0.5998
Epoch 60, CIFAR-10 Batch 3:  trainging cost:  0.851772 accuracy:  0.675
validation cost:  1.17516 accuracy:  0.5922
Epoch 60, CIFAR-10 Batch 4:  trainging cost:  0.868263 accuracy:  0.65
validation cost:  1.18045 accuracy:  0.598
Epoch 60, CIFAR-10 Batch 5:  trainging cost:  0.868054 accuracy:  0.675
validation cost:  1.16614 accuracy:  0.5996
Epoch 61, CIFAR-10 Batch 1:  trainging cost:  0.847963 accuracy:  0.75
validation cost:  1.20432 accuracy:  0.5972
Epoch 61, CIFAR-10 Batch 2:  trainging cost:  0.768656 accuracy:  0.75
validation cost:  1.1807 accuracy:  0.5886
Epoch 61, CIFAR-10 Batch 3:  trainging cost:  0.818415 accuracy:  0.75
validation cost:  1.17352 accuracy:  0.5986
Epoch 61, CIFAR-10 Batch 4:  trainging cost:  0.880829 accuracy:  0.7
validation cost:  1.17559 accuracy:  0.5858
Epoch 61, CIFAR-10 Batch 5:  trainging cost:  0.863129 accuracy:  0.725
validation cost:  1.17388 accuracy:  0.5942
Epoch 62, CIFAR-10 Batch 1:  trainging cost:  0.852394 accuracy:  0.75
validation cost:  1.19513 accuracy:  0.6002
Epoch 62, CIFAR-10 Batch 2:  trainging cost:  0.785941 accuracy:  0.7
validation cost:  1.20156 accuracy:  0.584
Epoch 62, CIFAR-10 Batch 3:  trainging cost:  0.835001 accuracy:  0.725
validation cost:  1.1761 accuracy:  0.5966
Epoch 62, CIFAR-10 Batch 4:  trainging cost:  0.937829 accuracy:  0.575
validation cost:  1.17748 accuracy:  0.5906
Epoch 62, CIFAR-10 Batch 5:  trainging cost:  0.844772 accuracy:  0.7
validation cost:  1.1685 accuracy:  0.6022
Epoch 63, CIFAR-10 Batch 1:  trainging cost:  0.840097 accuracy:  0.775
validation cost:  1.17883 accuracy:  0.5916
Epoch 63, CIFAR-10 Batch 2:  trainging cost:  0.79197 accuracy:  0.75
validation cost:  1.19001 accuracy:  0.5888
Epoch 63, CIFAR-10 Batch 3:  trainging cost:  0.798362 accuracy:  0.75
validation cost:  1.16519 accuracy:  0.6036
Epoch 63, CIFAR-10 Batch 4:  trainging cost:  0.857297 accuracy:  0.65
validation cost:  1.17682 accuracy:  0.594
Epoch 63, CIFAR-10 Batch 5:  trainging cost:  0.819609 accuracy:  0.7
validation cost:  1.18696 accuracy:  0.598
Epoch 64, CIFAR-10 Batch 1:  trainging cost:  0.831135 accuracy:  0.75
validation cost:  1.18324 accuracy:  0.6006
Epoch 64, CIFAR-10 Batch 2:  trainging cost:  0.72201 accuracy:  0.75
validation cost:  1.18092 accuracy:  0.5946
Epoch 64, CIFAR-10 Batch 3:  trainging cost:  0.85893 accuracy:  0.675
validation cost:  1.17916 accuracy:  0.5898
Epoch 64, CIFAR-10 Batch 4:  trainging cost:  0.886604 accuracy:  0.625
validation cost:  1.17757 accuracy:  0.5992
Epoch 64, CIFAR-10 Batch 5:  trainging cost:  0.885195 accuracy:  0.625
validation cost:  1.18259 accuracy:  0.5996
Epoch 65, CIFAR-10 Batch 1:  trainging cost:  0.855543 accuracy:  0.575
validation cost:  1.20323 accuracy:  0.5828
Epoch 65, CIFAR-10 Batch 2:  trainging cost:  0.757143 accuracy:  0.825
validation cost:  1.20408 accuracy:  0.5866
Epoch 65, CIFAR-10 Batch 3:  trainging cost:  0.811471 accuracy:  0.725
validation cost:  1.16417 accuracy:  0.5992
Epoch 65, CIFAR-10 Batch 4:  trainging cost:  0.911937 accuracy:  0.625
validation cost:  1.1885 accuracy:  0.5978
Epoch 65, CIFAR-10 Batch 5:  trainging cost:  0.824646 accuracy:  0.7
validation cost:  1.16146 accuracy:  0.6058
Epoch 66, CIFAR-10 Batch 1:  trainging cost:  0.822473 accuracy:  0.75
validation cost:  1.19071 accuracy:  0.6022
Epoch 66, CIFAR-10 Batch 2:  trainging cost:  0.750831 accuracy:  0.725
validation cost:  1.18875 accuracy:  0.5912
Epoch 66, CIFAR-10 Batch 3:  trainging cost:  0.788353 accuracy:  0.775
validation cost:  1.177 accuracy:  0.5984
Epoch 66, CIFAR-10 Batch 4:  trainging cost:  0.921114 accuracy:  0.65
validation cost:  1.19116 accuracy:  0.5882
Epoch 66, CIFAR-10 Batch 5:  trainging cost:  0.857716 accuracy:  0.75
validation cost:  1.17459 accuracy:  0.5972
Epoch 67, CIFAR-10 Batch 1:  trainging cost:  0.864869 accuracy:  0.725
validation cost:  1.20342 accuracy:  0.593
Epoch 67, CIFAR-10 Batch 2:  trainging cost:  0.7535 accuracy:  0.725
validation cost:  1.19248 accuracy:  0.5862
Epoch 67, CIFAR-10 Batch 3:  trainging cost:  0.784988 accuracy:  0.75
validation cost:  1.16714 accuracy:  0.6006
Epoch 67, CIFAR-10 Batch 4:  trainging cost:  0.861832 accuracy:  0.65
validation cost:  1.18162 accuracy:  0.5934
Epoch 67, CIFAR-10 Batch 5:  trainging cost:  0.82908 accuracy:  0.725
validation cost:  1.19156 accuracy:  0.5986
Epoch 68, CIFAR-10 Batch 1:  trainging cost:  0.830107 accuracy:  0.75
validation cost:  1.20217 accuracy:  0.593
Epoch 68, CIFAR-10 Batch 2:  trainging cost:  0.758121 accuracy:  0.725
validation cost:  1.21567 accuracy:  0.5782
Epoch 68, CIFAR-10 Batch 3:  trainging cost:  0.782363 accuracy:  0.775
validation cost:  1.17348 accuracy:  0.598
Epoch 68, CIFAR-10 Batch 4:  trainging cost:  0.866735 accuracy:  0.675
validation cost:  1.196 accuracy:  0.5928
Epoch 68, CIFAR-10 Batch 5:  trainging cost:  0.840881 accuracy:  0.7
validation cost:  1.17539 accuracy:  0.5956
Epoch 69, CIFAR-10 Batch 1:  trainging cost:  0.890544 accuracy:  0.675
validation cost:  1.16951 accuracy:  0.602
Epoch 69, CIFAR-10 Batch 2:  trainging cost:  0.763382 accuracy:  0.8
validation cost:  1.22394 accuracy:  0.5798
Epoch 69, CIFAR-10 Batch 3:  trainging cost:  0.787777 accuracy:  0.8
validation cost:  1.17297 accuracy:  0.5982
Epoch 69, CIFAR-10 Batch 4:  trainging cost:  0.8302 accuracy:  0.675
validation cost:  1.17202 accuracy:  0.5986
Epoch 69, CIFAR-10 Batch 5:  trainging cost:  0.866441 accuracy:  0.725
validation cost:  1.1818 accuracy:  0.5988
Epoch 70, CIFAR-10 Batch 1:  trainging cost:  0.842382 accuracy:  0.675
validation cost:  1.19498 accuracy:  0.5932
Epoch 70, CIFAR-10 Batch 2:  trainging cost:  0.719803 accuracy:  0.8
validation cost:  1.18501 accuracy:  0.593
Epoch 70, CIFAR-10 Batch 3:  trainging cost:  0.803505 accuracy:  0.7
validation cost:  1.1765 accuracy:  0.6026
Epoch 70, CIFAR-10 Batch 4:  trainging cost:  0.851313 accuracy:  0.65
validation cost:  1.181 accuracy:  0.5938
Epoch 70, CIFAR-10 Batch 5:  trainging cost:  0.809912 accuracy:  0.725
validation cost:  1.18057 accuracy:  0.5984
Epoch 71, CIFAR-10 Batch 1:  trainging cost:  0.829813 accuracy:  0.7
validation cost:  1.22274 accuracy:  0.5834
Epoch 71, CIFAR-10 Batch 2:  trainging cost:  0.740505 accuracy:  0.8
validation cost:  1.21051 accuracy:  0.5842
Epoch 71, CIFAR-10 Batch 3:  trainging cost:  0.779731 accuracy:  0.75
validation cost:  1.19398 accuracy:  0.5916
Epoch 71, CIFAR-10 Batch 4:  trainging cost:  0.789966 accuracy:  0.675
validation cost:  1.17443 accuracy:  0.6014
Epoch 71, CIFAR-10 Batch 5:  trainging cost:  0.818712 accuracy:  0.75
validation cost:  1.1794 accuracy:  0.596
Epoch 72, CIFAR-10 Batch 1:  trainging cost:  0.809901 accuracy:  0.65
validation cost:  1.20525 accuracy:  0.59
Epoch 72, CIFAR-10 Batch 2:  trainging cost:  0.707904 accuracy:  0.8
validation cost:  1.19049 accuracy:  0.595
Epoch 72, CIFAR-10 Batch 3:  trainging cost:  0.8299 accuracy:  0.725
validation cost:  1.17447 accuracy:  0.607
Epoch 72, CIFAR-10 Batch 4:  trainging cost:  0.823652 accuracy:  0.675
validation cost:  1.17747 accuracy:  0.599
Epoch 72, CIFAR-10 Batch 5:  trainging cost:  0.861238 accuracy:  0.725
validation cost:  1.18655 accuracy:  0.6004
Epoch 73, CIFAR-10 Batch 1:  trainging cost:  0.825482 accuracy:  0.75
validation cost:  1.20346 accuracy:  0.5994
Epoch 73, CIFAR-10 Batch 2:  trainging cost:  0.735462 accuracy:  0.775
validation cost:  1.22196 accuracy:  0.5802
Epoch 73, CIFAR-10 Batch 3:  trainging cost:  0.805169 accuracy:  0.725
validation cost:  1.19538 accuracy:  0.5908
Epoch 73, CIFAR-10 Batch 4:  trainging cost:  0.854952 accuracy:  0.675
validation cost:  1.2038 accuracy:  0.5836
Epoch 73, CIFAR-10 Batch 5:  trainging cost:  0.749169 accuracy:  0.75
validation cost:  1.17734 accuracy:  0.6038
Epoch 74, CIFAR-10 Batch 1:  trainging cost:  0.789542 accuracy:  0.725
validation cost:  1.20142 accuracy:  0.5904
Epoch 74, CIFAR-10 Batch 2:  trainging cost:  0.700098 accuracy:  0.8
validation cost:  1.19359 accuracy:  0.5904
Epoch 74, CIFAR-10 Batch 3:  trainging cost:  0.825897 accuracy:  0.75
validation cost:  1.19596 accuracy:  0.5902
Epoch 74, CIFAR-10 Batch 4:  trainging cost:  0.869966 accuracy:  0.675
validation cost:  1.20772 accuracy:  0.59
Epoch 74, CIFAR-10 Batch 5:  trainging cost:  0.81088 accuracy:  0.675
validation cost:  1.18462 accuracy:  0.604
Epoch 75, CIFAR-10 Batch 1:  trainging cost:  0.800623 accuracy:  0.75
validation cost:  1.18749 accuracy:  0.6066
Epoch 75, CIFAR-10 Batch 2:  trainging cost:  0.730084 accuracy:  0.8
validation cost:  1.19779 accuracy:  0.5924
Epoch 75, CIFAR-10 Batch 3:  trainging cost:  0.800733 accuracy:  0.8
validation cost:  1.18771 accuracy:  0.602
Epoch 75, CIFAR-10 Batch 4:  trainging cost:  0.861843 accuracy:  0.675
validation cost:  1.20295 accuracy:  0.5958
Epoch 75, CIFAR-10 Batch 5:  trainging cost:  0.877807 accuracy:  0.65
validation cost:  1.18943 accuracy:  0.5936
Epoch 76, CIFAR-10 Batch 1:  trainging cost:  0.843864 accuracy:  0.675
validation cost:  1.21455 accuracy:  0.5904
Epoch 76, CIFAR-10 Batch 2:  trainging cost:  0.717008 accuracy:  0.7
validation cost:  1.22995 accuracy:  0.5908
Epoch 76, CIFAR-10 Batch 3:  trainging cost:  0.755925 accuracy:  0.75
validation cost:  1.18569 accuracy:  0.5974
Epoch 76, CIFAR-10 Batch 4:  trainging cost:  0.842582 accuracy:  0.675
validation cost:  1.20892 accuracy:  0.5964
Epoch 76, CIFAR-10 Batch 5:  trainging cost:  0.79927 accuracy:  0.725
validation cost:  1.1804 accuracy:  0.5974
Epoch 77, CIFAR-10 Batch 1:  trainging cost:  0.792724 accuracy:  0.7
validation cost:  1.20082 accuracy:  0.5946
Epoch 77, CIFAR-10 Batch 2:  trainging cost:  0.721858 accuracy:  0.775
validation cost:  1.23717 accuracy:  0.581
Epoch 77, CIFAR-10 Batch 3:  trainging cost:  0.754518 accuracy:  0.775
validation cost:  1.18273 accuracy:  0.606
Epoch 77, CIFAR-10 Batch 4:  trainging cost:  0.813889 accuracy:  0.7
validation cost:  1.19325 accuracy:  0.5988
Epoch 77, CIFAR-10 Batch 5:  trainging cost:  0.859254 accuracy:  0.675
validation cost:  1.17847 accuracy:  0.6056
Epoch 78, CIFAR-10 Batch 1:  trainging cost:  0.809906 accuracy:  0.725
validation cost:  1.23262 accuracy:  0.5902
Epoch 78, CIFAR-10 Batch 2:  trainging cost:  0.68826 accuracy:  0.8
validation cost:  1.20634 accuracy:  0.594
Epoch 78, CIFAR-10 Batch 3:  trainging cost:  0.774711 accuracy:  0.8
validation cost:  1.21032 accuracy:  0.59
Epoch 78, CIFAR-10 Batch 4:  trainging cost:  0.871078 accuracy:  0.65
validation cost:  1.22472 accuracy:  0.589
Epoch 78, CIFAR-10 Batch 5:  trainging cost:  0.797607 accuracy:  0.7
validation cost:  1.18543 accuracy:  0.5996
Epoch 79, CIFAR-10 Batch 1:  trainging cost:  0.764494 accuracy:  0.775
validation cost:  1.1914 accuracy:  0.5988
Epoch 79, CIFAR-10 Batch 2:  trainging cost:  0.713039 accuracy:  0.8
validation cost:  1.2199 accuracy:  0.595
Epoch 79, CIFAR-10 Batch 3:  trainging cost:  0.731444 accuracy:  0.8
validation cost:  1.18513 accuracy:  0.5998
Epoch 79, CIFAR-10 Batch 4:  trainging cost:  0.814317 accuracy:  0.675
validation cost:  1.20938 accuracy:  0.5898
Epoch 79, CIFAR-10 Batch 5:  trainging cost:  0.800465 accuracy:  0.675
validation cost:  1.19683 accuracy:  0.594
Epoch 80, CIFAR-10 Batch 1:  trainging cost:  0.793772 accuracy:  0.725
validation cost:  1.21954 accuracy:  0.5878
Epoch 80, CIFAR-10 Batch 2:  trainging cost:  0.760217 accuracy:  0.775
validation cost:  1.21969 accuracy:  0.5872
Epoch 80, CIFAR-10 Batch 3:  trainging cost:  0.74511 accuracy:  0.8
validation cost:  1.18876 accuracy:  0.6022
Epoch 80, CIFAR-10 Batch 4:  trainging cost:  0.875216 accuracy:  0.6
validation cost:  1.18431 accuracy:  0.6008
Epoch 80, CIFAR-10 Batch 5:  trainging cost:  0.751545 accuracy:  0.725
validation cost:  1.18065 accuracy:  0.6032
Epoch 81, CIFAR-10 Batch 1:  trainging cost:  0.774185 accuracy:  0.75
validation cost:  1.20498 accuracy:  0.6022
Epoch 81, CIFAR-10 Batch 2:  trainging cost:  0.691417 accuracy:  0.8
validation cost:  1.22389 accuracy:  0.591
Epoch 81, CIFAR-10 Batch 3:  trainging cost:  0.757333 accuracy:  0.775
validation cost:  1.18214 accuracy:  0.6034
Epoch 81, CIFAR-10 Batch 4:  trainging cost:  0.801431 accuracy:  0.675
validation cost:  1.20382 accuracy:  0.5968
Epoch 81, CIFAR-10 Batch 5:  trainging cost:  0.758822 accuracy:  0.725
validation cost:  1.18036 accuracy:  0.6006
Epoch 82, CIFAR-10 Batch 1:  trainging cost:  0.872035 accuracy:  0.725
validation cost:  1.23568 accuracy:  0.5926
Epoch 82, CIFAR-10 Batch 2:  trainging cost:  0.679328 accuracy:  0.675
validation cost:  1.18869 accuracy:  0.6
Epoch 82, CIFAR-10 Batch 3:  trainging cost:  0.749477 accuracy:  0.75
validation cost:  1.19896 accuracy:  0.5988
Epoch 82, CIFAR-10 Batch 4:  trainging cost:  0.802965 accuracy:  0.65
validation cost:  1.22027 accuracy:  0.5904
Epoch 82, CIFAR-10 Batch 5:  trainging cost:  0.790082 accuracy:  0.725
validation cost:  1.20312 accuracy:  0.5984
Epoch 83, CIFAR-10 Batch 1:  trainging cost:  0.830499 accuracy:  0.7
validation cost:  1.21144 accuracy:  0.5952
Epoch 83, CIFAR-10 Batch 2:  trainging cost:  0.638325 accuracy:  0.775
validation cost:  1.20706 accuracy:  0.5856
Epoch 83, CIFAR-10 Batch 3:  trainging cost:  0.747093 accuracy:  0.8
validation cost:  1.20249 accuracy:  0.595
Epoch 83, CIFAR-10 Batch 4:  trainging cost:  0.872674 accuracy:  0.625
validation cost:  1.20084 accuracy:  0.593
Epoch 83, CIFAR-10 Batch 5:  trainging cost:  0.779763 accuracy:  0.725
validation cost:  1.20817 accuracy:  0.5936
Epoch 84, CIFAR-10 Batch 1:  trainging cost:  0.796702 accuracy:  0.8
validation cost:  1.21684 accuracy:  0.5988
Epoch 84, CIFAR-10 Batch 2:  trainging cost:  0.710889 accuracy:  0.775
validation cost:  1.24397 accuracy:  0.5836
Epoch 84, CIFAR-10 Batch 3:  trainging cost:  0.795395 accuracy:  0.75
validation cost:  1.22675 accuracy:  0.5916
Epoch 84, CIFAR-10 Batch 4:  trainging cost:  0.821883 accuracy:  0.7
validation cost:  1.20358 accuracy:  0.5904
Epoch 84, CIFAR-10 Batch 5:  trainging cost:  0.781374 accuracy:  0.725
validation cost:  1.20494 accuracy:  0.5974
Epoch 85, CIFAR-10 Batch 1:  trainging cost:  0.807062 accuracy:  0.825
validation cost:  1.22215 accuracy:  0.5988
Epoch 85, CIFAR-10 Batch 2:  trainging cost:  0.705175 accuracy:  0.7
validation cost:  1.21172 accuracy:  0.5912
Epoch 85, CIFAR-10 Batch 3:  trainging cost:  0.753523 accuracy:  0.775
validation cost:  1.19801 accuracy:  0.5884
Epoch 85, CIFAR-10 Batch 4:  trainging cost:  0.850025 accuracy:  0.65
validation cost:  1.26292 accuracy:  0.5858
Epoch 85, CIFAR-10 Batch 5:  trainging cost:  0.799482 accuracy:  0.7
validation cost:  1.20763 accuracy:  0.6006
Epoch 86, CIFAR-10 Batch 1:  trainging cost:  0.771333 accuracy:  0.75
validation cost:  1.20455 accuracy:  0.5988
Epoch 86, CIFAR-10 Batch 2:  trainging cost:  0.722286 accuracy:  0.7
validation cost:  1.20488 accuracy:  0.5888
Epoch 86, CIFAR-10 Batch 3:  trainging cost:  0.78714 accuracy:  0.8
validation cost:  1.2313 accuracy:  0.5818
Epoch 86, CIFAR-10 Batch 4:  trainging cost:  0.876509 accuracy:  0.625
validation cost:  1.21557 accuracy:  0.5922
Epoch 86, CIFAR-10 Batch 5:  trainging cost:  0.757765 accuracy:  0.725
validation cost:  1.19308 accuracy:  0.6044
Epoch 87, CIFAR-10 Batch 1:  trainging cost:  0.737563 accuracy:  0.825
validation cost:  1.24543 accuracy:  0.5942
Epoch 87, CIFAR-10 Batch 2:  trainging cost:  0.638671 accuracy:  0.775
validation cost:  1.18764 accuracy:  0.601
Epoch 87, CIFAR-10 Batch 3:  trainging cost:  0.787188 accuracy:  0.775
validation cost:  1.19661 accuracy:  0.6046
Epoch 87, CIFAR-10 Batch 4:  trainging cost:  0.827062 accuracy:  0.7
validation cost:  1.23786 accuracy:  0.5938
Epoch 87, CIFAR-10 Batch 5:  trainging cost:  0.789304 accuracy:  0.725
validation cost:  1.21319 accuracy:  0.5954
Epoch 88, CIFAR-10 Batch 1:  trainging cost:  0.749302 accuracy:  0.775
validation cost:  1.20512 accuracy:  0.6044
Epoch 88, CIFAR-10 Batch 2:  trainging cost:  0.721912 accuracy:  0.825
validation cost:  1.21719 accuracy:  0.5866
Epoch 88, CIFAR-10 Batch 3:  trainging cost:  0.739587 accuracy:  0.825
validation cost:  1.22347 accuracy:  0.5942
Epoch 88, CIFAR-10 Batch 4:  trainging cost:  0.796264 accuracy:  0.65
validation cost:  1.22075 accuracy:  0.6038
Epoch 88, CIFAR-10 Batch 5:  trainging cost:  0.745579 accuracy:  0.725
validation cost:  1.21653 accuracy:  0.592
Epoch 89, CIFAR-10 Batch 1:  trainging cost:  0.766632 accuracy:  0.725
validation cost:  1.23753 accuracy:  0.5902
Epoch 89, CIFAR-10 Batch 2:  trainging cost:  0.66539 accuracy:  0.775
validation cost:  1.21811 accuracy:  0.5912
Epoch 89, CIFAR-10 Batch 3:  trainging cost:  0.734746 accuracy:  0.85
validation cost:  1.22952 accuracy:  0.5956
Epoch 89, CIFAR-10 Batch 4:  trainging cost:  0.813129 accuracy:  0.725
validation cost:  1.26757 accuracy:  0.5854
Epoch 89, CIFAR-10 Batch 5:  trainging cost:  0.786432 accuracy:  0.675
validation cost:  1.24454 accuracy:  0.5928
Epoch 90, CIFAR-10 Batch 1:  trainging cost:  0.766832 accuracy:  0.75
validation cost:  1.25426 accuracy:  0.5938
Epoch 90, CIFAR-10 Batch 2:  trainging cost:  0.664523 accuracy:  0.725
validation cost:  1.21142 accuracy:  0.5946
Epoch 90, CIFAR-10 Batch 3:  trainging cost:  0.745714 accuracy:  0.75
validation cost:  1.2177 accuracy:  0.5964
Epoch 90, CIFAR-10 Batch 4:  trainging cost:  0.79467 accuracy:  0.675
validation cost:  1.21446 accuracy:  0.5928
Epoch 90, CIFAR-10 Batch 5:  trainging cost:  0.755031 accuracy:  0.775
validation cost:  1.20507 accuracy:  0.6014
Epoch 91, CIFAR-10 Batch 1:  trainging cost:  0.816505 accuracy:  0.7
validation cost:  1.254 accuracy:  0.586
Epoch 91, CIFAR-10 Batch 2:  trainging cost:  0.688065 accuracy:  0.8
validation cost:  1.21135 accuracy:  0.5892
Epoch 91, CIFAR-10 Batch 3:  trainging cost:  0.683636 accuracy:  0.825
validation cost:  1.22033 accuracy:  0.6016
Epoch 91, CIFAR-10 Batch 4:  trainging cost:  0.821004 accuracy:  0.675
validation cost:  1.23817 accuracy:  0.5904
Epoch 91, CIFAR-10 Batch 5:  trainging cost:  0.767853 accuracy:  0.7
validation cost:  1.23111 accuracy:  0.591
Epoch 92, CIFAR-10 Batch 1:  trainging cost:  0.758212 accuracy:  0.85
validation cost:  1.24012 accuracy:  0.5942
Epoch 92, CIFAR-10 Batch 2:  trainging cost:  0.704154 accuracy:  0.8
validation cost:  1.22616 accuracy:  0.5826
Epoch 92, CIFAR-10 Batch 3:  trainging cost:  0.754073 accuracy:  0.825
validation cost:  1.21879 accuracy:  0.5988
Epoch 92, CIFAR-10 Batch 4:  trainging cost:  0.814414 accuracy:  0.7
validation cost:  1.24562 accuracy:  0.585
Epoch 92, CIFAR-10 Batch 5:  trainging cost:  0.841646 accuracy:  0.675
validation cost:  1.251 accuracy:  0.586
Epoch 93, CIFAR-10 Batch 1:  trainging cost:  0.789497 accuracy:  0.775
validation cost:  1.27405 accuracy:  0.5842
Epoch 93, CIFAR-10 Batch 2:  trainging cost:  0.737972 accuracy:  0.775
validation cost:  1.22739 accuracy:  0.5818
Epoch 93, CIFAR-10 Batch 3:  trainging cost:  0.723164 accuracy:  0.8
validation cost:  1.22252 accuracy:  0.5966
Epoch 93, CIFAR-10 Batch 4:  trainging cost:  0.866315 accuracy:  0.7
validation cost:  1.23951 accuracy:  0.5934
Epoch 93, CIFAR-10 Batch 5:  trainging cost:  0.729657 accuracy:  0.725
validation cost:  1.21575 accuracy:  0.596
Epoch 94, CIFAR-10 Batch 1:  trainging cost:  0.799149 accuracy:  0.775
validation cost:  1.27008 accuracy:  0.6008
Epoch 94, CIFAR-10 Batch 2:  trainging cost:  0.746743 accuracy:  0.8
validation cost:  1.23453 accuracy:  0.5844
Epoch 94, CIFAR-10 Batch 3:  trainging cost:  0.728649 accuracy:  0.8
validation cost:  1.19557 accuracy:  0.6066
Epoch 94, CIFAR-10 Batch 4:  trainging cost:  0.782496 accuracy:  0.7
validation cost:  1.22125 accuracy:  0.6014
Epoch 94, CIFAR-10 Batch 5:  trainging cost:  0.773681 accuracy:  0.7
validation cost:  1.21288 accuracy:  0.5896
Epoch 95, CIFAR-10 Batch 1:  trainging cost:  0.763081 accuracy:  0.725
validation cost:  1.23792 accuracy:  0.599
Epoch 95, CIFAR-10 Batch 2:  trainging cost:  0.674773 accuracy:  0.775
validation cost:  1.20857 accuracy:  0.5982
Epoch 95, CIFAR-10 Batch 3:  trainging cost:  0.707894 accuracy:  0.8
validation cost:  1.22282 accuracy:  0.601
Epoch 95, CIFAR-10 Batch 4:  trainging cost:  0.809844 accuracy:  0.65
validation cost:  1.21869 accuracy:  0.6022
Epoch 95, CIFAR-10 Batch 5:  trainging cost:  0.761903 accuracy:  0.725
validation cost:  1.20913 accuracy:  0.598
Epoch 96, CIFAR-10 Batch 1:  trainging cost:  0.781684 accuracy:  0.725
validation cost:  1.25546 accuracy:  0.5808
Epoch 96, CIFAR-10 Batch 2:  trainging cost:  0.673242 accuracy:  0.775
validation cost:  1.23114 accuracy:  0.5886
Epoch 96, CIFAR-10 Batch 3:  trainging cost:  0.747541 accuracy:  0.725
validation cost:  1.22328 accuracy:  0.5998
Epoch 96, CIFAR-10 Batch 4:  trainging cost:  0.773547 accuracy:  0.65
validation cost:  1.24209 accuracy:  0.5858
Epoch 96, CIFAR-10 Batch 5:  trainging cost:  0.748602 accuracy:  0.75
validation cost:  1.21818 accuracy:  0.602
Epoch 97, CIFAR-10 Batch 1:  trainging cost:  0.832877 accuracy:  0.75
validation cost:  1.26324 accuracy:  0.5802
Epoch 97, CIFAR-10 Batch 2:  trainging cost:  0.683202 accuracy:  0.775
validation cost:  1.21049 accuracy:  0.5908
Epoch 97, CIFAR-10 Batch 3:  trainging cost:  0.734482 accuracy:  0.75
validation cost:  1.2346 accuracy:  0.6042
Epoch 97, CIFAR-10 Batch 4:  trainging cost:  0.815612 accuracy:  0.675
validation cost:  1.25928 accuracy:  0.5866
Epoch 97, CIFAR-10 Batch 5:  trainging cost:  0.732404 accuracy:  0.725
validation cost:  1.21315 accuracy:  0.595
Epoch 98, CIFAR-10 Batch 1:  trainging cost:  0.720119 accuracy:  0.8
validation cost:  1.28035 accuracy:  0.589
Epoch 98, CIFAR-10 Batch 2:  trainging cost:  0.682177 accuracy:  0.75
validation cost:  1.22244 accuracy:  0.5856
Epoch 98, CIFAR-10 Batch 3:  trainging cost:  0.695109 accuracy:  0.85
validation cost:  1.22653 accuracy:  0.595
Epoch 98, CIFAR-10 Batch 4:  trainging cost:  0.792917 accuracy:  0.625
validation cost:  1.22991 accuracy:  0.5894
Epoch 98, CIFAR-10 Batch 5:  trainging cost:  0.746741 accuracy:  0.75
validation cost:  1.21264 accuracy:  0.5892
Epoch 99, CIFAR-10 Batch 1:  trainging cost:  0.753435 accuracy:  0.75
validation cost:  1.23215 accuracy:  0.5968
Epoch 99, CIFAR-10 Batch 2:  trainging cost:  0.612695 accuracy:  0.725
validation cost:  1.2221 accuracy:  0.5898
Epoch 99, CIFAR-10 Batch 3:  trainging cost:  0.732597 accuracy:  0.8
validation cost:  1.22356 accuracy:  0.5916
Epoch 99, CIFAR-10 Batch 4:  trainging cost:  0.780251 accuracy:  0.65
validation cost:  1.23318 accuracy:  0.5958
Epoch 99, CIFAR-10 Batch 5:  trainging cost:  0.793001 accuracy:  0.7
validation cost:  1.24133 accuracy:  0.578
Epoch 100, CIFAR-10 Batch 1:  trainging cost:  0.775081 accuracy:  0.775
validation cost:  1.26092 accuracy:  0.5892
Epoch 100, CIFAR-10 Batch 2:  trainging cost:  0.636532 accuracy:  0.775
validation cost:  1.21455 accuracy:  0.5894
Epoch 100, CIFAR-10 Batch 3:  trainging cost:  0.715839 accuracy:  0.8
validation cost:  1.24248 accuracy:  0.5908
Epoch 100, CIFAR-10 Batch 4:  trainging cost:  0.833197 accuracy:  0.675
validation cost:  1.24972 accuracy:  0.5972
Epoch 100, CIFAR-10 Batch 5:  trainging cost:  0.729741 accuracy:  0.7
validation cost:  1.21786 accuracy:  0.5984

Checkpoint

The model has been saved to disk.

Test Model

Test your model against the test dataset. This will be your final accuracy. You should have an accuracy greater than 50%. If you don't, keep tweaking the model architecture and parameters.


In [20]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
%matplotlib inline
%config InlineBackend.figure_format = 'retina'

import tensorflow as tf
import pickle
import helper
import random

# Set batch size if not already set
try:
    if batch_size:
        pass
except NameError:
    batch_size = 64

save_model_path = './image_classification'
n_samples = 4
top_n_predictions = 3

def test_model():
    """
    Test the saved model against the test dataset
    """

    test_features, test_labels = pickle.load(open('preprocess_training.p', mode='rb'))
    loaded_graph = tf.Graph()

    with tf.Session(graph=loaded_graph) as sess:
        # Load model
        loader = tf.train.import_meta_graph(save_model_path + '.meta')
        loader.restore(sess, save_model_path)

        # Get Tensors from loaded model
        loaded_x = loaded_graph.get_tensor_by_name('x:0')
        loaded_y = loaded_graph.get_tensor_by_name('y:0')
        loaded_keep_prob = loaded_graph.get_tensor_by_name('keep_prob:0')
        loaded_logits = loaded_graph.get_tensor_by_name('logits:0')
        loaded_acc = loaded_graph.get_tensor_by_name('accuracy:0')
        
        # Get accuracy in batches for memory limitations
        test_batch_acc_total = 0
        test_batch_count = 0
        
        for train_feature_batch, train_label_batch in helper.batch_features_labels(test_features, test_labels, batch_size):
            test_batch_acc_total += sess.run(
                loaded_acc,
                feed_dict={loaded_x: train_feature_batch, loaded_y: train_label_batch, loaded_keep_prob: 1.0})
            test_batch_count += 1

        print('Testing Accuracy: {}\n'.format(test_batch_acc_total/test_batch_count))

        # Print Random Samples
        random_test_features, random_test_labels = tuple(zip(*random.sample(list(zip(test_features, test_labels)), n_samples)))
        random_test_predictions = sess.run(
            tf.nn.top_k(tf.nn.softmax(loaded_logits), top_n_predictions),
            feed_dict={loaded_x: random_test_features, loaded_y: random_test_labels, loaded_keep_prob: 1.0})
        helper.display_image_predictions(random_test_features, random_test_labels, random_test_predictions)


test_model()


Testing Accuracy: 0.5022963258785943

Why 50-70% Accuracy?

You might be wondering why you can't get an accuracy any higher. First things first, 50% isn't bad for a simple CNN. Pure guessing would get you 10% accuracy. However, you might notice people are getting scores well above 70%. That's because we haven't taught you all there is to know about neural networks. We still need to cover a few more techniques.

Submitting This Project

When submitting this project, make sure to run all the cells before saving the notebook. Save the notebook file as "dlnd_image_classification.ipynb" and save it as a HTML file under "File" -> "Download as". Include the "helper.py" and "problem_unittests.py" files in your submission.