Image Classification

In this project, you'll classify images from the CIFAR-10 dataset. The dataset consists of airplanes, dogs, cats, and other objects. You'll preprocess the images, then train a convolutional neural network on all the samples. The images need to be normalized and the labels need to be one-hot encoded. You'll apply what you've learned to build convolutional, max pooling, dropout, and fully connected layers. At the end, you'll see your neural network's predictions on sample images.

Get the Data

Run the following cell to download the CIFAR-10 dataset for Python.


In [1]:
"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
from urllib.request import urlretrieve
from os.path import isfile, isdir
from tqdm import tqdm
import problem_unittests as tests
import tarfile

cifar10_dataset_folder_path = 'cifar-10-batches-py'

# Use Floyd's cifar-10 dataset if present
floyd_cifar10_location = '/input/cifar-10/python.tar.gz'
if isfile(floyd_cifar10_location):
    tar_gz_path = floyd_cifar10_location
else:
    tar_gz_path = 'cifar-10-python.tar.gz'

class DLProgress(tqdm):
    last_block = 0

    def hook(self, block_num=1, block_size=1, total_size=None):
        self.total = total_size
        self.update((block_num - self.last_block) * block_size)
        self.last_block = block_num

if not isfile(tar_gz_path):
    with DLProgress(unit='B', unit_scale=True, miniters=1, desc='CIFAR-10 Dataset') as pbar:
        urlretrieve(
            'https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz',
            tar_gz_path,
            pbar.hook)

if not isdir(cifar10_dataset_folder_path):
    with tarfile.open(tar_gz_path) as tar:
        tar.extractall()


tests.test_folder_path(cifar10_dataset_folder_path)


All files found!

Explore the Data

The dataset is broken into batches to prevent your machine from running out of memory. The CIFAR-10 dataset consists of 5 batches, named data_batch_1, data_batch_2, etc. Each batch contains labels and images from the following classes:

  • airplane
  • automobile
  • bird
  • cat
  • deer
  • dog
  • frog
  • horse
  • ship
  • truck

Understanding a dataset is part of making predictions on the data. Play around with the code cell below by changing the batch_id and sample_id. The batch_id is the id for a batch (1-5). The sample_id is the id for an image and label pair in the batch.

Ask yourself "What are all possible labels?", "What is the range of values for the image data?", "Are the labels in order or random?". Answers to questions like these will help you preprocess the data and end up with better predictions.
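A quick way to answer these questions is to check the label set, pixel range, and ordering directly with NumPy — a minimal sketch, using random data as a stand-in for one batch (the real arrays come from the helper module):

```python
import numpy as np

# Simulated stand-in for one CIFAR-10 batch: 100 RGB 32x32 images with
# 8-bit pixel values and integer class labels 0-9.
rng = np.random.default_rng(0)
images = rng.integers(0, 256, size=(100, 32, 32, 3))
labels = rng.integers(0, 10, size=100)

print("Possible labels:", np.unique(labels))
print("Pixel value range:", images.min(), "to", images.max())
print("Labels in order?", bool(np.all(np.diff(labels) >= 0)))
```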


In [2]:
%matplotlib inline
%config InlineBackend.figure_format = 'retina'

import helper
import numpy as np

# Explore the dataset
batch_id = 5
sample_id = 2000
helper.display_stats(cifar10_dataset_folder_path, batch_id, sample_id)


Stats of batch 5:
Samples: 10000
Label Counts: {0: 1014, 1: 1014, 2: 952, 3: 1016, 4: 997, 5: 1025, 6: 980, 7: 977, 8: 1003, 9: 1022}
First 20 Labels: [1, 8, 5, 1, 5, 7, 4, 3, 8, 2, 7, 2, 0, 1, 5, 9, 6, 2, 0, 8]

Example of Image 2000:
Image - Min Value: 13 Max Value: 255
Image - Shape: (32, 32, 3)
Label - Label Id: 9 Name: truck

Implement Preprocess Functions

Normalize

In the cell below, implement the normalize function to take in image data, x, and return it as a normalized NumPy array. The values should be in the range of 0 to 1, inclusive. The returned object should be the same shape as x.


In [3]:
def normalize(x):
    """
    Normalize a list of sample image data to the range of 0 to 1
    : x: List of image data.  The image shape is (32, 32, 3)
    : return: NumPy array of normalized data
    """
    # Pixel values are 8-bit integers, so dividing by 255 maps them into [0, 1].
    return np.array(x) / 255.0


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_normalize(normalize)


Tests Passed

One-hot encode

Just like the previous code cell, you'll be implementing a function for preprocessing. This time, you'll implement the one_hot_encode function. The input, x, is a list of labels. Implement the function to return the list of labels as a one-hot encoded NumPy array. The possible values for labels are 0 to 9. The one-hot encoding function should return the same encoding for each value between calls to one_hot_encode. Make sure to save the map of encodings outside the function.

Hint: Don't reinvent the wheel.
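One lightweight option (a sketch, equivalent to the LabelBinarizer approach used in the cell below) indexes into an identity matrix — row i of np.eye(10) is the one-hot vector for label i:

```python
import numpy as np

def one_hot_np(x):
    """Row i of the 10x10 identity matrix is the one-hot vector for label i."""
    return np.eye(10)[x]

encoded = one_hot_np([0, 9, 3])
print(encoded.shape)              # (3, 10)
print(encoded.argmax(axis=1))     # [0 9 3]
```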


In [4]:
from sklearn.preprocessing import LabelBinarizer
lb = LabelBinarizer()
lb.fit([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])
#print(lb.classes_)
def one_hot_encode(x):
    """
    One hot encode a list of sample labels. Return a one-hot encoded vector for each label.
    : x: List of sample Labels
    : return: Numpy array of one-hot encoded labels
    """
    # TODO: Implement Function
    return lb.transform(x)

# Examples of one-line solutions:
# np.eye(10)[x]
# np.identity(10, dtype=int)[x]
# sklearn.preprocessing.label_binarize(x, classes=range(10))
"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_one_hot_encode(one_hot_encode)


Tests Passed

Randomize Data

As you saw when exploring the data above, the order of the samples is already randomized. It doesn't hurt to randomize it again, but you don't need to for this dataset.
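If you did need to shuffle, the key point is to permute features and labels with the same index array so pairs stay aligned — a minimal sketch with toy data:

```python
import numpy as np

rng = np.random.default_rng(42)
features = np.arange(10).reshape(5, 2)   # 5 toy samples: row i is [2i, 2i+1]
labels = np.arange(5)                    # label i belongs to row i

perm = rng.permutation(len(features))    # one shared permutation
features, labels = features[perm], labels[perm]

# Every feature row still matches its label after shuffling.
print(features[:, 0] == 2 * labels)
```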

Preprocess all the data and save it

Running the code cell below will preprocess all the CIFAR-10 data and save it to file. The code below also uses 10% of the training data for validation.


In [5]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
# Preprocess Training, Validation, and Testing Data
helper.preprocess_and_save_data(cifar10_dataset_folder_path, normalize, one_hot_encode)

Check Point

This is your first checkpoint. If you ever decide to come back to this notebook or have to restart the notebook, you can start from here. The preprocessed data has been saved to disk.


In [6]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import pickle
import problem_unittests as tests
import helper

# Load the Preprocessed Validation data
valid_features, valid_labels = pickle.load(open('preprocess_validation.p', mode='rb'))

Build the network

For the neural network, you'll build each layer into a function. Most of the code you've seen has been outside of functions. To test your code more thoroughly, we require that you put each layer in a function. This allows us to give you better feedback and test for simple mistakes using our unittests before you submit your project.

Note: If you're finding it hard to dedicate enough time for this course each week, we've provided a small shortcut to this part of the project. In the next couple of problems, you'll have the option to use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages to build each layer, except the layers you build in the "Convolutional and Max Pooling Layer" section. TF Layers is similar to Keras's and TFLearn's abstraction of layers, so it's easy to pick up.

However, if you would like to get the most out of this course, try to solve all the problems without using anything from the TF Layers packages. You can still use classes from other packages that happen to have the same name as ones you find in TF Layers! For example, instead of using the TF Layers version of the conv2d class, tf.layers.conv2d, you would want to use the TF Neural Network version of conv2d, tf.nn.conv2d.

Let's begin!

Input

The neural network needs to read the image data, one-hot encoded labels, and dropout keep probability. Implement the following functions:

  • Implement neural_net_image_input
    • Return a TF Placeholder
    • Set the shape using image_shape with batch size set to None.
    • Name the TensorFlow placeholder "x" using the TensorFlow name parameter in the TF Placeholder.
  • Implement neural_net_label_input
    • Return a TF Placeholder
    • Set the shape using n_classes with batch size set to None.
    • Name the TensorFlow placeholder "y" using the TensorFlow name parameter in the TF Placeholder.
  • Implement neural_net_keep_prob_input
    • Return a TF Placeholder for dropout keep probability.
    • Name the TensorFlow placeholder "keep_prob" using the TensorFlow name parameter in the TF Placeholder.

These names will be used at the end of the project to load your saved model.

Note: None for shapes in TensorFlow allows for a dynamic size.


In [7]:
import tensorflow as tf

def neural_net_image_input(image_shape):
    """
    Return a Tensor for a batch of image input
    : image_shape: Shape of the images
    : return: Tensor for image input.
    """
    # TODO: Implement Function
    return tf.placeholder( 
                    tf.float32,
                    shape=[None, image_shape[0], image_shape[1], image_shape[2]],
                    name="x")
    # another solution with unpacking
    #tf.placeholder(tf.float32, shape=[None, *image_shape], name='x')

def neural_net_label_input(n_classes):
    """
    Return a Tensor for a batch of label input
    : n_classes: Number of classes
    : return: Tensor for label input.
    """
    # TODO: Implement Function
    return tf.placeholder( 
                    tf.float32,
                    shape=[None, n_classes],
                    name="y")


def neural_net_keep_prob_input():
    """
    Return a Tensor for keep probability
    : return: Tensor for keep probability.
    """
    # TODO: Implement Function
    return tf.placeholder( 
                    tf.float32,
                    name="keep_prob")


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tf.reset_default_graph()
tests.test_nn_image_inputs(neural_net_image_input)
tests.test_nn_label_inputs(neural_net_label_input)
tests.test_nn_keep_prob_inputs(neural_net_keep_prob_input)


Image Input Tests Passed.
Label Input Tests Passed.
Keep Prob Tests Passed.

Convolution and Max Pooling Layer

Convolutional layers have had a lot of success with images. For this code cell, implement the function conv2d_maxpool to apply convolution then max pooling:

  • Create the weight and bias using conv_ksize, conv_num_outputs and the shape of x_tensor.
  • Apply a convolution to x_tensor using weight and conv_strides.
    • We recommend you use same padding, but you're welcome to use any padding.
  • Add bias
  • Add a nonlinear activation to the convolution.
  • Apply Max Pooling using pool_ksize and pool_strides.
    • We recommend you use same padding, but you're welcome to use any padding.

Note: You can't use TensorFlow Layers or TensorFlow Layers (contrib) for this layer, but you can still use TensorFlow's Neural Network package. You may still use the shortcut option for all the other layers.
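Before implementing, it helps to know the shape arithmetic: with 'SAME' padding, the output height/width is the input size divided by the stride, rounded up, independent of kernel size. A sketch:

```python
import math

def same_pad_out(size, stride):
    # 'SAME' padding: output size = ceil(input size / stride),
    # regardless of the kernel size.
    return math.ceil(size / stride)

h = same_pad_out(32, 1)          # conv with stride 1 keeps 32x32
print(h)                         # 32
print(same_pad_out(h, 2))        # pool with stride 2 halves it: 16
```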


In [10]:
def conv2d_maxpool(x_tensor, conv_num_outputs, conv_ksize, conv_strides, pool_ksize, pool_strides):
    """
    Apply convolution then max pooling to x_tensor
    :param x_tensor: TensorFlow Tensor
    :param conv_num_outputs: Number of outputs for the convolutional layer
    :param conv_ksize: kernel size 2-D Tuple for the convolutional layer
    :param conv_strides: Stride 2-D Tuple for convolution
    :param pool_ksize: kernel size 2-D Tuple for pool
    :param pool_strides: Stride 2-D Tuple for pool
    : return: A tensor that represents convolution and max pooling of x_tensor
    """
    
    # Create the weight and bias using conv_ksize, conv_num_outputs and the shape of x_tensor.
    weight = tf.Variable(tf.random_normal(
        [conv_ksize[0], conv_ksize[1], int(x_tensor.shape[3]), conv_num_outputs],
        stddev=0.05))
    bias = tf.Variable(tf.zeros([conv_num_outputs]))

    # Apply a convolution to x_tensor using weight and conv_strides.
    x = tf.nn.conv2d(x_tensor, weight, strides=[1, conv_strides[0], conv_strides[1], 1], padding='SAME')

    # Add bias
    x = tf.nn.bias_add(x, bias)

    # Add a nonlinear activation to the convolution.
    x = tf.nn.relu(x)

    # Apply Max Pooling using pool_ksize and pool_strides.
    # (ReLU and max pooling commute, so applying ReLU first gives the same result.)
    x = tf.nn.max_pool(
        x,
        ksize=[1, pool_ksize[0], pool_ksize[1], 1],
        strides=[1, pool_strides[0], pool_strides[1], 1],
        padding='SAME')
    return x


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_con_pool(conv2d_maxpool)


Tests Passed

Flatten Layer

Implement the flatten function to change the dimension of x_tensor from a 4-D tensor to a 2-D tensor. The output should be the shape (Batch Size, Flattened Image Size). Shortcut option: you can use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages for this layer. For more of a challenge, only use other TensorFlow packages.
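The reshape itself is simple: keep the batch dimension and merge all remaining dimensions into one. A NumPy sketch of the same operation:

```python
import numpy as np

batch = np.zeros((4, 32, 32, 3))          # (Batch, Height, Width, Channels)
flat = batch.reshape(batch.shape[0], -1)  # -1 infers 32 * 32 * 3 = 3072
print(flat.shape)                         # (4, 3072)
```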


In [11]:
def flatten(x_tensor):
    """
    Flatten x_tensor to (Batch Size, Flattened Image Size)
    : x_tensor: A tensor of size (Batch Size, ...), where ... are the image dimensions.
    : return: A tensor of size (Batch Size, Flattened Image Size).
    """
    # TODO: Implement Function
    return tf.contrib.layers.flatten(x_tensor)


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_flatten(flatten)


Tests Passed

Fully-Connected Layer

Implement the fully_conn function to apply a fully connected layer to x_tensor with the shape (Batch Size, num_outputs). Shortcut option: you can use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages for this layer. For more of a challenge, only use other TensorFlow packages.
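Under the hood, a fully connected layer is a matrix multiply plus bias, followed by an activation — a NumPy sketch with assumed toy shapes:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 3072))   # (Batch Size, Flattened Image Size)
W = rng.standard_normal((3072, 50))  # weights: (inputs, num_outputs)
b = np.zeros(50)

out = np.maximum(x @ W + b, 0)       # linear transform + ReLU
print(out.shape)                     # (4, 50)
```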


In [12]:
def fully_conn(x_tensor, num_outputs):
    """
    Apply a fully connected layer to x_tensor using weight and bias
    : x_tensor: A 2-D tensor where the first dimension is batch size.
    : num_outputs: The number of outputs that the new tensor should have.
    : return: A 2-D tensor where the second dimension is num_outputs.
    """
    return tf.contrib.layers.fully_connected(x_tensor, num_outputs)



"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_fully_conn(fully_conn)


Tests Passed

Output Layer

Implement the output function to apply a fully connected layer to x_tensor with the shape (Batch Size, num_outputs). Shortcut option: you can use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages for this layer. For more of a challenge, only use other TensorFlow packages.

Note: Activation, softmax, or cross entropy should not be applied to this.


In [13]:
def output(x_tensor, num_outputs):
    """
    Apply a output layer to x_tensor using weight and bias
    : x_tensor: A 2-D tensor where the first dimension is batch size.
    : num_outputs: The number of outputs that the new tensor should have.
    : return: A 2-D tensor where the second dimension is num_outputs.
    """
    return tf.contrib.layers.fully_connected(x_tensor, num_outputs, activation_fn=None)


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_output(output)


Tests Passed

Create Convolutional Model

Implement the function conv_net to create a convolutional neural network model. The function takes in a batch of images, x, and outputs logits. Use the layers you created above to create this model:

  • Apply 1, 2, or 3 Convolution and Max Pool layers
  • Apply a Flatten Layer
  • Apply 1, 2, or 3 Fully Connected Layers
  • Apply an Output Layer
  • Return the output
  • Apply TensorFlow's Dropout to one or more layers in the model using keep_prob.

In [15]:
def conv_net(x, keep_prob):
    """
    Create a convolutional neural network model
    : x: Placeholder tensor that holds image data.
    : keep_prob: Placeholder tensor that hold dropout keep probability.
    : return: Tensor that represents logits
    """
    # TODO: Apply 1, 2, or 3 Convolution and Max Pool layers
    #    Play around with different number of outputs, kernel size and stride
    # Function Definition from Above:
    #    conv2d_maxpool(x_tensor, conv_num_outputs, conv_ksize, conv_strides, pool_ksize, pool_strides)
    x = conv2d_maxpool(x, 32, [3, 3], [1, 1], [4, 4], [4, 4])
    
    # TODO: Apply a Flatten Layer
    # Function Definition from Above:
    #   flatten(x_tensor)
    x = flatten(x)

    # TODO: Apply 1, 2, or 3 Fully Connected Layers
    #    Play around with different number of outputs
    # Function Definition from Above:
    #   fully_conn(x_tensor, num_outputs)
    x = fully_conn(x, 50)

    # Apply dropout
    x = tf.nn.dropout(x, keep_prob)
    
    # TODO: Apply an Output Layer
    #    Set this to the number of classes
    # Function Definition from Above:
    #   output(x_tensor, num_outputs)
    x = output(x, 10)

    # return output
    return x


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""

##############################
## Build the Neural Network ##
##############################

# Remove previous weights, bias, inputs, etc..
tf.reset_default_graph()

# Inputs
x = neural_net_image_input((32, 32, 3))
y = neural_net_label_input(10)
keep_prob = neural_net_keep_prob_input()

# Model
logits = conv_net(x, keep_prob)

# Name logits Tensor, so that it can be loaded from disk after training
logits = tf.identity(logits, name='logits')

# Loss and Optimizer
cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=y))
optimizer = tf.train.AdamOptimizer().minimize(cost)

# Accuracy
correct_pred = tf.equal(tf.argmax(logits, 1), tf.argmax(y, 1))
accuracy = tf.reduce_mean(tf.cast(correct_pred, tf.float32), name='accuracy')

tests.test_conv_net(conv_net)


Neural Network Built!

Train the Neural Network

Single Optimization

Implement the function train_neural_network to do a single optimization. The optimization should use optimizer to optimize in session with a feed_dict of the following:

  • x for image input
  • y for labels
  • keep_prob for keep probability for dropout

This function will be called for each batch, so tf.global_variables_initializer() has already been called.

Note: Nothing needs to be returned. This function is only optimizing the neural network.


In [16]:
def train_neural_network(session, optimizer, keep_probability, feature_batch, label_batch):
    """
    Optimize the session on a batch of images and labels
    : session: Current TensorFlow session
    : optimizer: TensorFlow optimizer function
    : keep_probability: keep probability
    : feature_batch: Batch of Numpy image data
    : label_batch: Batch of Numpy label data
    """
    # TODO: Implement Function
    
    session.run(optimizer, feed_dict={x: feature_batch, y: label_batch, keep_prob: keep_probability})


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_train_nn(train_neural_network)


Tests Passed

Show Stats

Implement the function print_stats to print loss and validation accuracy. Use the global variables valid_features and valid_labels to calculate validation accuracy. Use a keep probability of 1.0 to calculate the loss and validation accuracy.


In [17]:
def print_stats(session, feature_batch, label_batch, cost, accuracy):
    """
    Print information about loss and validation accuracy
    : session: Current TensorFlow session
    : feature_batch: Batch of Numpy image data
    : label_batch: Batch of Numpy label data
    : cost: TensorFlow cost function
    : accuracy: TensorFlow accuracy function
    """
    # TODO: Implement Function
    current_cost = session.run(cost, feed_dict={x:feature_batch, y:label_batch, keep_prob:1.0})
    current_accuracy = session.run(accuracy, feed_dict={x:valid_features, y:valid_labels, keep_prob:1.0})
    print("")
    print("current_cost : ", current_cost)
    print("current_accuracy: ", current_accuracy)

Hyperparameters

Tune the following parameters:

  • Set epochs to the number of iterations until the network stops learning or starts overfitting
  • Set batch_size to the highest number that your machine has memory for. Common choices are powers of two:
    • 64
    • 128
    • 256
    • ...
  • Set keep_probability to the probability of keeping a node using dropout
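The keep_probability you set below feeds the dropout layer; with "inverted" dropout (the scheme tf.nn.dropout implements), surviving activations are rescaled by 1/keep_prob so the expected magnitude is unchanged. A NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
keep_prob = 0.6
x = np.ones((1000, 50))

# Zero out ~40% of activations, rescale survivors by 1 / keep_prob.
mask = rng.random(x.shape) < keep_prob
dropped = x * mask / keep_prob

print(round(dropped.mean(), 2))   # close to 1.0
```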

In [20]:
# TODO: Tune Parameters
epochs = 200
batch_size = 128
keep_probability = 0.6

Train on a Single CIFAR-10 Batch

Instead of training the neural network on all the CIFAR-10 batches of data, let's use a single batch. This should save time while you iterate on the model to get better accuracy. Once the final validation accuracy is 50% or greater, run the model on all the data in the next section.


In [21]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
print('Checking the Training on a Single Batch...')
with tf.Session() as sess:
    # Initializing the variables
    sess.run(tf.global_variables_initializer())
    
    # Training cycle
    for epoch in range(epochs):
        batch_i = 1
        for batch_features, batch_labels in helper.load_preprocess_training_batch(batch_i, batch_size):
            train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)
        print('Epoch {:>2}, CIFAR-10 Batch {}:  '.format(epoch + 1, batch_i), end='')
        print_stats(sess, batch_features, batch_labels, cost, accuracy)


Checking the Training on a Single Batch...
Epoch  1, CIFAR-10 Batch 1:  
current_cost :  2.18207
current_accuracy:  0.261
Epoch  2, CIFAR-10 Batch 1:  
current_cost :  2.05283
current_accuracy:  0.3648
Epoch  3, CIFAR-10 Batch 1:  
current_cost :  1.9519
current_accuracy:  0.4152
Epoch  4, CIFAR-10 Batch 1:  
current_cost :  1.82447
current_accuracy:  0.4292
Epoch  5, CIFAR-10 Batch 1:  
current_cost :  1.78123
current_accuracy:  0.458
Epoch  6, CIFAR-10 Batch 1:  
current_cost :  1.67649
current_accuracy:  0.4602
Epoch  7, CIFAR-10 Batch 1:  
current_cost :  1.63025
current_accuracy:  0.469
Epoch  8, CIFAR-10 Batch 1:  
current_cost :  1.5991
current_accuracy:  0.484
Epoch  9, CIFAR-10 Batch 1:  
current_cost :  1.51548
current_accuracy:  0.4918
Epoch 10, CIFAR-10 Batch 1:  
current_cost :  1.50649
current_accuracy:  0.4986
Epoch 11, CIFAR-10 Batch 1:  
current_cost :  1.47545
current_accuracy:  0.5016
Epoch 12, CIFAR-10 Batch 1:  
current_cost :  1.42895
current_accuracy:  0.509
Epoch 13, CIFAR-10 Batch 1:  
current_cost :  1.38392
current_accuracy:  0.4954
Epoch 14, CIFAR-10 Batch 1:  
current_cost :  1.33095
current_accuracy:  0.5008
Epoch 15, CIFAR-10 Batch 1:  
current_cost :  1.33177
current_accuracy:  0.5018
Epoch 16, CIFAR-10 Batch 1:  
current_cost :  1.31677
current_accuracy:  0.4986
Epoch 17, CIFAR-10 Batch 1:  
current_cost :  1.24782
current_accuracy:  0.51
Epoch 18, CIFAR-10 Batch 1:  
current_cost :  1.29209
current_accuracy:  0.5034
Epoch 19, CIFAR-10 Batch 1:  
current_cost :  1.24949
current_accuracy:  0.5074
Epoch 20, CIFAR-10 Batch 1:  
current_cost :  1.22292
current_accuracy:  0.5092
Epoch 21, CIFAR-10 Batch 1:  
current_cost :  1.177
current_accuracy:  0.5072
Epoch 22, CIFAR-10 Batch 1:  
current_cost :  1.15872
current_accuracy:  0.5144
Epoch 23, CIFAR-10 Batch 1:  
current_cost :  1.1293
current_accuracy:  0.5126
Epoch 24, CIFAR-10 Batch 1:  
current_cost :  1.14111
current_accuracy:  0.5098
Epoch 25, CIFAR-10 Batch 1:  
current_cost :  1.11651
current_accuracy:  0.5184
Epoch 26, CIFAR-10 Batch 1:  
current_cost :  1.11953
current_accuracy:  0.5156
Epoch 27, CIFAR-10 Batch 1:  
current_cost :  1.09597
current_accuracy:  0.515
Epoch 28, CIFAR-10 Batch 1:  
current_cost :  1.05628
current_accuracy:  0.525
Epoch 29, CIFAR-10 Batch 1:  
current_cost :  1.05256
current_accuracy:  0.5292
Epoch 30, CIFAR-10 Batch 1:  
current_cost :  1.0351
current_accuracy:  0.5258
Epoch 31, CIFAR-10 Batch 1:  
current_cost :  1.03964
current_accuracy:  0.5288
Epoch 32, CIFAR-10 Batch 1:  
current_cost :  1.00214
current_accuracy:  0.522
Epoch 33, CIFAR-10 Batch 1:  
current_cost :  0.946731
current_accuracy:  0.5348
Epoch 34, CIFAR-10 Batch 1:  
current_cost :  0.955999
current_accuracy:  0.5404
Epoch 35, CIFAR-10 Batch 1:  
current_cost :  0.916823
current_accuracy:  0.5398
Epoch 36, CIFAR-10 Batch 1:  
current_cost :  0.948869
current_accuracy:  0.5406
Epoch 37, CIFAR-10 Batch 1:  
current_cost :  0.917223
current_accuracy:  0.537
Epoch 38, CIFAR-10 Batch 1:  
current_cost :  0.920267
current_accuracy:  0.5414
Epoch 39, CIFAR-10 Batch 1:  
current_cost :  0.883743
current_accuracy:  0.5376
Epoch 40, CIFAR-10 Batch 1:  
current_cost :  0.887086
current_accuracy:  0.5324
Epoch 41, CIFAR-10 Batch 1:  
current_cost :  0.88912
current_accuracy:  0.5366
Epoch 42, CIFAR-10 Batch 1:  
current_cost :  0.8769
current_accuracy:  0.5406
Epoch 43, CIFAR-10 Batch 1:  
current_cost :  0.888017
current_accuracy:  0.5352
Epoch 44, CIFAR-10 Batch 1:  
current_cost :  0.841744
current_accuracy:  0.545
Epoch 45, CIFAR-10 Batch 1:  
current_cost :  0.829366
current_accuracy:  0.537
Epoch 46, CIFAR-10 Batch 1:  
current_cost :  0.839944
current_accuracy:  0.5426
Epoch 47, CIFAR-10 Batch 1:  
current_cost :  0.82164
current_accuracy:  0.546
Epoch 48, CIFAR-10 Batch 1:  
current_cost :  0.815285
current_accuracy:  0.5454
Epoch 49, CIFAR-10 Batch 1:  
current_cost :  0.820895
current_accuracy:  0.5384
Epoch 50, CIFAR-10 Batch 1:  
current_cost :  0.763558
current_accuracy:  0.5408
Epoch 51, CIFAR-10 Batch 1:  
current_cost :  0.77213
current_accuracy:  0.5466
Epoch 52, CIFAR-10 Batch 1:  
current_cost :  0.768492
current_accuracy:  0.5456
Epoch 53, CIFAR-10 Batch 1:  
current_cost :  0.790972
current_accuracy:  0.5476
Epoch 54, CIFAR-10 Batch 1:  
current_cost :  0.760026
current_accuracy:  0.5508
Epoch 55, CIFAR-10 Batch 1:  
current_cost :  0.751871
current_accuracy:  0.5486
Epoch 56, CIFAR-10 Batch 1:  
current_cost :  0.752445
current_accuracy:  0.5462
Epoch 57, CIFAR-10 Batch 1:  
current_cost :  0.766486
current_accuracy:  0.5464
Epoch 58, CIFAR-10 Batch 1:  
current_cost :  0.702471
current_accuracy:  0.5516
Epoch 59, CIFAR-10 Batch 1:  
current_cost :  0.74759
current_accuracy:  0.5512
Epoch 60, CIFAR-10 Batch 1:  
current_cost :  0.71783
current_accuracy:  0.5448
Epoch 61, CIFAR-10 Batch 1:  
current_cost :  0.713742
current_accuracy:  0.5482
Epoch 62, CIFAR-10 Batch 1:  
current_cost :  0.749653
current_accuracy:  0.5406
Epoch 63, CIFAR-10 Batch 1:  
current_cost :  0.725135
current_accuracy:  0.5498
Epoch 64, CIFAR-10 Batch 1:  
current_cost :  0.716883
current_accuracy:  0.5412
Epoch 65, CIFAR-10 Batch 1:  
current_cost :  0.701829
current_accuracy:  0.5536
Epoch 66, CIFAR-10 Batch 1:  
current_cost :  0.685877
current_accuracy:  0.545
Epoch 67, CIFAR-10 Batch 1:  
current_cost :  0.691879
current_accuracy:  0.5502
Epoch 68, CIFAR-10 Batch 1:  
current_cost :  0.668968
current_accuracy:  0.5566
Epoch 69, CIFAR-10 Batch 1:  
current_cost :  0.681108
current_accuracy:  0.5482
Epoch 70, CIFAR-10 Batch 1:  
current_cost :  0.686906
current_accuracy:  0.5534
Epoch 71, CIFAR-10 Batch 1:  
current_cost :  0.695101
current_accuracy:  0.555
Epoch 72, CIFAR-10 Batch 1:  
current_cost :  0.658359
current_accuracy:  0.5538
Epoch 73, CIFAR-10 Batch 1:  
current_cost :  0.699706
current_accuracy:  0.5476
Epoch 74, CIFAR-10 Batch 1:  
current_cost :  0.680389
current_accuracy:  0.5474
Epoch 75, CIFAR-10 Batch 1:  
current_cost :  0.662896
current_accuracy:  0.5526
Epoch 76, CIFAR-10 Batch 1:  
current_cost :  0.655797
current_accuracy:  0.551
Epoch 77, CIFAR-10 Batch 1:  
current_cost :  0.665134
current_accuracy:  0.543
Epoch 78, CIFAR-10 Batch 1:  
current_cost :  0.621958
current_accuracy:  0.549
Epoch 79, CIFAR-10 Batch 1:  
current_cost :  0.6821
current_accuracy:  0.539
Epoch 80, CIFAR-10 Batch 1:  
current_cost :  0.619437
current_accuracy:  0.5534
Epoch 81, CIFAR-10 Batch 1:  
current_cost :  0.610252
current_accuracy:  0.5536
Epoch 82, CIFAR-10 Batch 1:  
current_cost :  0.614726
current_accuracy:  0.5512
Epoch 83, CIFAR-10 Batch 1:  
current_cost :  0.599414
current_accuracy:  0.5504
Epoch 84, CIFAR-10 Batch 1:  
current_cost :  0.605255
current_accuracy:  0.5506
Epoch 85, CIFAR-10 Batch 1:  
current_cost :  0.611195
current_accuracy:  0.5518
Epoch 86, CIFAR-10 Batch 1:  
current_cost :  0.587529
current_accuracy:  0.5484
Epoch 87, CIFAR-10 Batch 1:  
current_cost :  0.641272
current_accuracy:  0.548
Epoch 88, CIFAR-10 Batch 1:  
current_cost :  0.604976
current_accuracy:  0.5514
Epoch 89, CIFAR-10 Batch 1:  
current_cost :  0.572501
current_accuracy:  0.5516
Epoch 90, CIFAR-10 Batch 1:  
current_cost :  0.581467
current_accuracy:  0.5596
Epoch 91, CIFAR-10 Batch 1:  
current_cost :  0.603744
current_accuracy:  0.5538
Epoch 92, CIFAR-10 Batch 1:  
current_cost :  0.635809
current_accuracy:  0.5454
Epoch 93, CIFAR-10 Batch 1:  
current_cost :  0.569743
current_accuracy:  0.5468
Epoch 94, CIFAR-10 Batch 1:  
current_cost :  0.578271
current_accuracy:  0.553
Epoch 95, CIFAR-10 Batch 1:  
current_cost :  0.586151
current_accuracy:  0.5462
Epoch 96, CIFAR-10 Batch 1:  
current_cost :  0.548126
current_accuracy:  0.5572
Epoch 97, CIFAR-10 Batch 1:  
current_cost :  0.556146
current_accuracy:  0.5548
Epoch 98, CIFAR-10 Batch 1:  
current_cost :  0.557261
current_accuracy:  0.5528
Epoch 99, CIFAR-10 Batch 1:  
current_cost :  0.544171
current_accuracy:  0.5566
Epoch 100, CIFAR-10 Batch 1:  
current_cost :  0.572191
current_accuracy:  0.5552
Epoch 101, CIFAR-10 Batch 1:  
current_cost :  0.543097
current_accuracy:  0.5542
Epoch 102, CIFAR-10 Batch 1:  
current_cost :  0.550175
current_accuracy:  0.5488
Epoch 103, CIFAR-10 Batch 1:  
current_cost :  0.567764
current_accuracy:  0.5476
Epoch 104, CIFAR-10 Batch 1:  
current_cost :  0.533498
current_accuracy:  0.5524
Epoch 105, CIFAR-10 Batch 1:  
current_cost :  0.543542
current_accuracy:  0.5494
Epoch 106, CIFAR-10 Batch 1:  
current_cost :  0.534856
current_accuracy:  0.5454
Epoch 107, CIFAR-10 Batch 1:  
current_cost :  0.506176
current_accuracy:  0.5564
Epoch 108, CIFAR-10 Batch 1:  
current_cost :  0.513227
current_accuracy:  0.5558
Epoch 109, CIFAR-10 Batch 1:  
current_cost :  0.475328
current_accuracy:  0.5554
Epoch 110, CIFAR-10 Batch 1:  
current_cost :  0.489377
current_accuracy:  0.555
Epoch 111, CIFAR-10 Batch 1:  
current_cost :  0.507472
current_accuracy:  0.5436
Epoch 112, CIFAR-10 Batch 1:  
current_cost :  0.506323
current_accuracy:  0.5486
Epoch 113, CIFAR-10 Batch 1:  
current_cost :  0.47648
current_accuracy:  0.5506
Epoch 114, CIFAR-10 Batch 1:  
current_cost :  0.520891
current_accuracy:  0.5504
Epoch 115, CIFAR-10 Batch 1:  
current_cost :  0.51763
current_accuracy:  0.5536
Epoch 116, CIFAR-10 Batch 1:  
current_cost :  0.492875
current_accuracy:  0.552
Epoch 117, CIFAR-10 Batch 1:  
current_cost :  0.461681
current_accuracy:  0.5544
Epoch 118, CIFAR-10 Batch 1:  
current_cost :  0.488445
current_accuracy:  0.5442
Epoch 119, CIFAR-10 Batch 1:  
current_cost :  0.453936
current_accuracy:  0.5452
Epoch 120, CIFAR-10 Batch 1:  
current_cost :  0.484841
current_accuracy:  0.5452
Epoch 121, CIFAR-10 Batch 1:  
current_cost :  0.478375
current_accuracy:  0.5458
Epoch 122, CIFAR-10 Batch 1:  
current_cost :  0.521841
current_accuracy:  0.5434
Epoch 123, CIFAR-10 Batch 1:  
current_cost :  0.422452
current_accuracy:  0.5562
Epoch 124, CIFAR-10 Batch 1:  
current_cost :  0.426111
current_accuracy:  0.5568
Epoch 125, CIFAR-10 Batch 1:  
current_cost :  0.465679
current_accuracy:  0.5448
Epoch 126, CIFAR-10 Batch 1:  
current_cost :  0.418472
current_accuracy:  0.5562
Epoch 127, CIFAR-10 Batch 1:  
current_cost :  0.47442
current_accuracy:  0.5516
Epoch 128, CIFAR-10 Batch 1:  
current_cost :  0.432721
current_accuracy:  0.551
Epoch 129, CIFAR-10 Batch 1:  
current_cost :  0.423597
current_accuracy:  0.5476
Epoch 130, CIFAR-10 Batch 1:  
current_cost :  0.415462
current_accuracy:  0.5492
Epoch 131, CIFAR-10 Batch 1:  
current_cost :  0.43646
current_accuracy:  0.538
Epoch 132, CIFAR-10 Batch 1:  
current_cost :  0.42423
current_accuracy:  0.54
Epoch 133, CIFAR-10 Batch 1:  
current_cost :  0.436021
current_accuracy:  0.5332
Epoch 134, CIFAR-10 Batch 1:  
current_cost :  0.426425
current_accuracy:  0.5428
Epoch 135, CIFAR-10 Batch 1:  
current_cost :  0.437202
current_accuracy:  0.551
Epoch 136, CIFAR-10 Batch 1:  
current_cost :  0.449872
current_accuracy:  0.5438
Epoch 137, CIFAR-10 Batch 1:  
current_cost :  0.428792
current_accuracy:  0.5486
Epoch 138, CIFAR-10 Batch 1:  
current_cost :  0.424041
current_accuracy:  0.553
Epoch 139, CIFAR-10 Batch 1:  
current_cost :  0.404262
current_accuracy:  0.552
Epoch 140, CIFAR-10 Batch 1:  
current_cost :  0.427057
current_accuracy:  0.549
Epoch 141, CIFAR-10 Batch 1:  
current_cost :  0.418841
current_accuracy:  0.5506
Epoch 142, CIFAR-10 Batch 1:  
current_cost :  0.394751
current_accuracy:  0.5468
Epoch 143, CIFAR-10 Batch 1:  
current_cost :  0.39436
current_accuracy:  0.5512
Epoch 144, CIFAR-10 Batch 1:  
current_cost :  0.430312
current_accuracy:  0.5468
Epoch 145, CIFAR-10 Batch 1:  
current_cost :  0.378507
current_accuracy:  0.5514
Epoch 146, CIFAR-10 Batch 1:  
current_cost :  0.400123
current_accuracy:  0.5496
Epoch 147, CIFAR-10 Batch 1:  
current_cost :  0.392631
current_accuracy:  0.5414
Epoch 148, CIFAR-10 Batch 1:  
current_cost :  0.390688
current_accuracy:  0.549
Epoch 149, CIFAR-10 Batch 1:  
current_cost :  0.393943
current_accuracy:  0.5438
Epoch 150, CIFAR-10 Batch 1:  
current_cost :  0.400576
current_accuracy:  0.5394
Epoch 151, CIFAR-10 Batch 1:  
current_cost :  0.391374
current_accuracy:  0.549
Epoch 152, CIFAR-10 Batch 1:  
current_cost :  0.384483
current_accuracy:  0.5512
Epoch 153, CIFAR-10 Batch 1:  
current_cost :  0.381629
current_accuracy:  0.541
Epoch 154, CIFAR-10 Batch 1:  
current_cost :  0.397209
current_accuracy:  0.547
Epoch 155, CIFAR-10 Batch 1:  
current_cost :  0.358776
current_accuracy:  0.5452
Epoch 156, CIFAR-10 Batch 1:  
current_cost :  0.393558
current_accuracy:  0.5456
Epoch 157, CIFAR-10 Batch 1:  
current_cost :  0.35901
current_accuracy:  0.5444
Epoch 158, CIFAR-10 Batch 1:  
current_cost :  0.37068
current_accuracy:  0.5468
Epoch 159, CIFAR-10 Batch 1:  
current_cost :  0.352173
current_accuracy:  0.5508
Epoch 160, CIFAR-10 Batch 1:  
current_cost :  0.39371
current_accuracy:  0.542
Epoch 161, CIFAR-10 Batch 1:  
current_cost :  0.359357
current_accuracy:  0.55
Epoch 162, CIFAR-10 Batch 1:  
current_cost :  0.354984
current_accuracy:  0.5528
Epoch 163, CIFAR-10 Batch 1:  
current_cost :  0.351541
current_accuracy:  0.5512
Epoch 164, CIFAR-10 Batch 1:  
current_cost :  0.347484
current_accuracy:  0.551
Epoch 165, CIFAR-10 Batch 1:  
current_cost :  0.376439
current_accuracy:  0.553
Epoch 166, CIFAR-10 Batch 1:  
current_cost :  0.385638
current_accuracy:  0.5452
Epoch 167, CIFAR-10 Batch 1:  
current_cost :  0.342987
current_accuracy:  0.5484
Epoch 168, CIFAR-10 Batch 1:  
current_cost :  0.355877
current_accuracy:  0.5432
Epoch 169, CIFAR-10 Batch 1:  
current_cost :  0.350247
current_accuracy:  0.5498
Epoch 170, CIFAR-10 Batch 1:  
current_cost :  0.353755
current_accuracy:  0.549
Epoch 171, CIFAR-10 Batch 1:  
current_cost :  0.384666
current_accuracy:  0.5426
Epoch 172, CIFAR-10 Batch 1:  
current_cost :  0.373948
current_accuracy:  0.5394
Epoch 173, CIFAR-10 Batch 1:  
current_cost :  0.343173
current_accuracy:  0.5494
Epoch 174, CIFAR-10 Batch 1:  
current_cost :  0.340493
current_accuracy:  0.548
Epoch 175, CIFAR-10 Batch 1:  
current_cost :  0.336068
current_accuracy:  0.5502
Epoch 176, CIFAR-10 Batch 1:  
current_cost :  0.340041
current_accuracy:  0.5476
Epoch 177, CIFAR-10 Batch 1:  
current_cost :  0.360476
current_accuracy:  0.5526
Epoch 178, CIFAR-10 Batch 1:  
current_cost :  0.314163
current_accuracy:  0.5442
Epoch 179, CIFAR-10 Batch 1:  
current_cost :  0.344081
current_accuracy:  0.5496
Epoch 180, CIFAR-10 Batch 1:  
current_cost :  0.351199
current_accuracy:  0.5484
Epoch 181, CIFAR-10 Batch 1:  
current_cost :  0.358431
current_accuracy:  0.5476
Epoch 182, CIFAR-10 Batch 1:  
current_cost :  0.362342
current_accuracy:  0.5418
Epoch 183, CIFAR-10 Batch 1:  
current_cost :  0.359899
current_accuracy:  0.5558
Epoch 184, CIFAR-10 Batch 1:  
current_cost :  0.356691
current_accuracy:  0.55
Epoch 185, CIFAR-10 Batch 1:  
current_cost :  0.346404
current_accuracy:  0.5438
Epoch 186, CIFAR-10 Batch 1:  
current_cost :  0.330655
current_accuracy:  0.5486
Epoch 187, CIFAR-10 Batch 1:  
current_cost :  0.321151
current_accuracy:  0.549
Epoch 188, CIFAR-10 Batch 1:  
current_cost :  0.348875
current_accuracy:  0.5478
Epoch 189, CIFAR-10 Batch 1:  
current_cost :  0.372936
current_accuracy:  0.5372
Epoch 190, CIFAR-10 Batch 1:  
current_cost :  0.349454
current_accuracy:  0.548
Epoch 191, CIFAR-10 Batch 1:  
current_cost :  0.34384
current_accuracy:  0.5438
Epoch 192, CIFAR-10 Batch 1:  
current_cost :  0.370202
current_accuracy:  0.545
Epoch 193, CIFAR-10 Batch 1:  
current_cost :  0.338454
current_accuracy:  0.5396
Epoch 194, CIFAR-10 Batch 1:  
current_cost :  0.35178
current_accuracy:  0.5392
Epoch 195, CIFAR-10 Batch 1:  
current_cost :  0.338924
current_accuracy:  0.5444
Epoch 196, CIFAR-10 Batch 1:  
current_cost :  0.31931
current_accuracy:  0.5444
Epoch 197, CIFAR-10 Batch 1:  
current_cost :  0.346599
current_accuracy:  0.532
Epoch 198, CIFAR-10 Batch 1:  
current_cost :  0.335672
current_accuracy:  0.539
Epoch 199, CIFAR-10 Batch 1:  
current_cost :  0.311372
current_accuracy:  0.5332
Epoch 200, CIFAR-10 Batch 1:  
current_cost :  0.314508
current_accuracy:  0.5444

Fully Train the Model

Now that you've achieved good accuracy on a single CIFAR-10 batch, train the model on all five batches.
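The training loop in the next cell pulls mini-batches from `helper.load_preprocess_training_batch`. The slicing pattern such a helper typically uses can be sketched as follows (a minimal sketch for illustration; the real helper also unpickles the preprocessed batch file from disk before slicing):

```python
def batch_features_labels(features, labels, batch_size):
    """Yield (features, labels) mini-batches of at most batch_size samples."""
    for start in range(0, len(features), batch_size):
        end = start + batch_size
        yield features[start:end], labels[start:end]

# Example: 10 samples split into mini-batches of 4 -> sizes 4, 4, 2
features = list(range(10))
labels = list(range(10))
sizes = [len(bf) for bf, _ in batch_features_labels(features, labels, 4)]
```

Note that the final mini-batch may be smaller than `batch_size`, which is why the network's placeholders are defined with a `None` batch dimension.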


In [22]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
save_model_path = './image_classification'

print('Training...')
with tf.Session() as sess:
    # Initializing the variables
    sess.run(tf.global_variables_initializer())
    
    # Training cycle
    for epoch in range(epochs):
        # Loop over all batches
        n_batches = 5
        for batch_i in range(1, n_batches + 1):
            for batch_features, batch_labels in helper.load_preprocess_training_batch(batch_i, batch_size):
                train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)
            print('Epoch {:>2}, CIFAR-10 Batch {}:  '.format(epoch + 1, batch_i), end='')
            print_stats(sess, batch_features, batch_labels, cost, accuracy)
            
    # Save Model
    saver = tf.train.Saver()
    save_path = saver.save(sess, save_model_path)


Training...
Epoch  1, CIFAR-10 Batch 1:  
current_cost :  2.05214
current_accuracy:  0.3264
Epoch  1, CIFAR-10 Batch 2:  
current_cost :  1.85561
current_accuracy:  0.391
Epoch  1, CIFAR-10 Batch 3:  
current_cost :  1.62818
current_accuracy:  0.4152
Epoch  1, CIFAR-10 Batch 4:  
current_cost :  1.66735
current_accuracy:  0.4608
Epoch  1, CIFAR-10 Batch 5:  
current_cost :  1.61077
current_accuracy:  0.457
Epoch  2, CIFAR-10 Batch 1:  
current_cost :  1.63574
current_accuracy:  0.4886
Epoch  2, CIFAR-10 Batch 2:  
current_cost :  1.52349
current_accuracy:  0.5028
Epoch  2, CIFAR-10 Batch 3:  
current_cost :  1.31348
current_accuracy:  0.4904
Epoch  2, CIFAR-10 Batch 4:  
current_cost :  1.46085
current_accuracy:  0.5302
Epoch  2, CIFAR-10 Batch 5:  
current_cost :  1.46097
current_accuracy:  0.5264
Epoch  3, CIFAR-10 Batch 1:  
current_cost :  1.47797
current_accuracy:  0.5316
Epoch  3, CIFAR-10 Batch 2:  
current_cost :  1.35776
current_accuracy:  0.547
Epoch  3, CIFAR-10 Batch 3:  
current_cost :  1.13741
current_accuracy:  0.5346
Epoch  3, CIFAR-10 Batch 4:  
current_cost :  1.35914
current_accuracy:  0.5396
Epoch  3, CIFAR-10 Batch 5:  
current_cost :  1.3417
current_accuracy:  0.5462
Epoch  4, CIFAR-10 Batch 1:  
current_cost :  1.3655
current_accuracy:  0.5542
Epoch  4, CIFAR-10 Batch 2:  
current_cost :  1.21832
current_accuracy:  0.557
Epoch  4, CIFAR-10 Batch 3:  
current_cost :  1.01516
current_accuracy:  0.5542
Epoch  4, CIFAR-10 Batch 4:  
current_cost :  1.28901
current_accuracy:  0.5628
Epoch  4, CIFAR-10 Batch 5:  
current_cost :  1.2561
current_accuracy:  0.5594
Epoch  5, CIFAR-10 Batch 1:  
current_cost :  1.30023
current_accuracy:  0.561
Epoch  5, CIFAR-10 Batch 2:  
current_cost :  1.19325
current_accuracy:  0.568
Epoch  5, CIFAR-10 Batch 3:  
current_cost :  0.993075
current_accuracy:  0.5542
Epoch  5, CIFAR-10 Batch 4:  
current_cost :  1.22668
current_accuracy:  0.5718
Epoch  5, CIFAR-10 Batch 5:  
current_cost :  1.1994
current_accuracy:  0.5714
Epoch  6, CIFAR-10 Batch 1:  
current_cost :  1.1735
current_accuracy:  0.571
Epoch  6, CIFAR-10 Batch 2:  
current_cost :  1.13312
current_accuracy:  0.5726
Epoch  6, CIFAR-10 Batch 3:  
current_cost :  0.880036
current_accuracy:  0.566
Epoch  6, CIFAR-10 Batch 4:  
current_cost :  1.16491
current_accuracy:  0.5874
Epoch  6, CIFAR-10 Batch 5:  
current_cost :  1.14578
current_accuracy:  0.5842
Epoch  7, CIFAR-10 Batch 1:  
current_cost :  1.14632
current_accuracy:  0.584
Epoch  7, CIFAR-10 Batch 2:  
current_cost :  1.04645
current_accuracy:  0.587
Epoch  7, CIFAR-10 Batch 3:  
current_cost :  0.874936
current_accuracy:  0.5718
Epoch  7, CIFAR-10 Batch 4:  
current_cost :  1.15343
current_accuracy:  0.5806
Epoch  7, CIFAR-10 Batch 5:  
current_cost :  1.0685
current_accuracy:  0.5996
Epoch  8, CIFAR-10 Batch 1:  
current_cost :  1.0778
current_accuracy:  0.5958
Epoch  8, CIFAR-10 Batch 2:  
current_cost :  1.05132
current_accuracy:  0.5964
Epoch  8, CIFAR-10 Batch 3:  
current_cost :  0.817447
current_accuracy:  0.5962
Epoch  8, CIFAR-10 Batch 4:  
current_cost :  1.06525
current_accuracy:  0.5996
Epoch  8, CIFAR-10 Batch 5:  
current_cost :  1.03905
current_accuracy:  0.5998
Epoch  9, CIFAR-10 Batch 1:  
current_cost :  1.07904
current_accuracy:  0.5976
Epoch  9, CIFAR-10 Batch 2:  
current_cost :  0.948894
current_accuracy:  0.6044
Epoch  9, CIFAR-10 Batch 3:  
current_cost :  0.81156
current_accuracy:  0.6028
Epoch  9, CIFAR-10 Batch 4:  
current_cost :  1.02829
current_accuracy:  0.6134
Epoch  9, CIFAR-10 Batch 5:  
current_cost :  1.02369
current_accuracy:  0.5958
Epoch 10, CIFAR-10 Batch 1:  
current_cost :  1.03601
current_accuracy:  0.6004
Epoch 10, CIFAR-10 Batch 2:  
current_cost :  0.968747
current_accuracy:  0.607
Epoch 10, CIFAR-10 Batch 3:  
current_cost :  0.755293
current_accuracy:  0.615
Epoch 10, CIFAR-10 Batch 4:  
current_cost :  1.02905
current_accuracy:  0.6108
Epoch 10, CIFAR-10 Batch 5:  
current_cost :  0.965616
current_accuracy:  0.6114
Epoch 11, CIFAR-10 Batch 1:  
current_cost :  1.01451
current_accuracy:  0.601
Epoch 11, CIFAR-10 Batch 2:  
current_cost :  0.906653
current_accuracy:  0.6148
Epoch 11, CIFAR-10 Batch 3:  
current_cost :  0.737788
current_accuracy:  0.5992
Epoch 11, CIFAR-10 Batch 4:  
current_cost :  1.00325
current_accuracy:  0.6156
Epoch 11, CIFAR-10 Batch 5:  
current_cost :  0.891908
current_accuracy:  0.6158
Epoch 12, CIFAR-10 Batch 1:  
current_cost :  0.982458
current_accuracy:  0.6038
Epoch 12, CIFAR-10 Batch 2:  
current_cost :  0.909366
current_accuracy:  0.6198
Epoch 12, CIFAR-10 Batch 3:  
current_cost :  0.692452
current_accuracy:  0.6046
Epoch 12, CIFAR-10 Batch 4:  
current_cost :  0.977984
current_accuracy:  0.616
Epoch 12, CIFAR-10 Batch 5:  
current_cost :  0.938867
current_accuracy:  0.6122
Epoch 13, CIFAR-10 Batch 1:  
current_cost :  0.987552
current_accuracy:  0.6144
Epoch 13, CIFAR-10 Batch 2:  
current_cost :  0.850388
current_accuracy:  0.6246
Epoch 13, CIFAR-10 Batch 3:  
current_cost :  0.688778
current_accuracy:  0.6138
Epoch 13, CIFAR-10 Batch 4:  
current_cost :  0.910381
current_accuracy:  0.6236
Epoch 13, CIFAR-10 Batch 5:  
current_cost :  0.874153
current_accuracy:  0.6232
Epoch 14, CIFAR-10 Batch 1:  
current_cost :  0.906823
current_accuracy:  0.615
Epoch 14, CIFAR-10 Batch 2:  
current_cost :  0.857179
current_accuracy:  0.6206
Epoch 14, CIFAR-10 Batch 3:  
current_cost :  0.682546
current_accuracy:  0.618
Epoch 14, CIFAR-10 Batch 4:  
current_cost :  0.89231
current_accuracy:  0.6246
Epoch 14, CIFAR-10 Batch 5:  
current_cost :  0.858071
current_accuracy:  0.6204
Epoch 15, CIFAR-10 Batch 1:  
current_cost :  0.873849
current_accuracy:  0.6208
Epoch 15, CIFAR-10 Batch 2:  
current_cost :  0.778933
current_accuracy:  0.628
Epoch 15, CIFAR-10 Batch 3:  
current_cost :  0.645421
current_accuracy:  0.6226
Epoch 15, CIFAR-10 Batch 4:  
current_cost :  0.884233
current_accuracy:  0.6274
Epoch 15, CIFAR-10 Batch 5:  
current_cost :  0.848894
current_accuracy:  0.6212
Epoch 16, CIFAR-10 Batch 1:  
current_cost :  0.861883
current_accuracy:  0.6254
Epoch 16, CIFAR-10 Batch 2:  
current_cost :  0.772284
current_accuracy:  0.6346
Epoch 16, CIFAR-10 Batch 3:  
current_cost :  0.619329
current_accuracy:  0.6234
Epoch 16, CIFAR-10 Batch 4:  
current_cost :  0.848185
current_accuracy:  0.6294
Epoch 16, CIFAR-10 Batch 5:  
current_cost :  0.826209
current_accuracy:  0.6284
Epoch 17, CIFAR-10 Batch 1:  
current_cost :  0.865087
current_accuracy:  0.6144
Epoch 17, CIFAR-10 Batch 2:  
current_cost :  0.762112
current_accuracy:  0.6336
Epoch 17, CIFAR-10 Batch 3:  
current_cost :  0.620812
current_accuracy:  0.63
Epoch 17, CIFAR-10 Batch 4:  
current_cost :  0.847707
current_accuracy:  0.6322
Epoch 17, CIFAR-10 Batch 5:  
current_cost :  0.782372
current_accuracy:  0.6356
Epoch 18, CIFAR-10 Batch 1:  
current_cost :  0.794587
current_accuracy:  0.623
Epoch 18, CIFAR-10 Batch 2:  
current_cost :  0.759554
current_accuracy:  0.6284
Epoch 18, CIFAR-10 Batch 3:  
current_cost :  0.603489
current_accuracy:  0.6288
Epoch 18, CIFAR-10 Batch 4:  
current_cost :  0.798806
current_accuracy:  0.6364
Epoch 18, CIFAR-10 Batch 5:  
current_cost :  0.759736
current_accuracy:  0.633
Epoch 19, CIFAR-10 Batch 1:  
current_cost :  0.80083
current_accuracy:  0.6312
Epoch 19, CIFAR-10 Batch 2:  
current_cost :  0.753476
current_accuracy:  0.637
Epoch 19, CIFAR-10 Batch 3:  
current_cost :  0.631251
current_accuracy:  0.6266
Epoch 19, CIFAR-10 Batch 4:  
current_cost :  0.858175
current_accuracy:  0.6264
Epoch 19, CIFAR-10 Batch 5:  
current_cost :  0.764042
current_accuracy:  0.6326
Epoch 20, CIFAR-10 Batch 1:  
current_cost :  0.802787
current_accuracy:  0.6352
Epoch 20, CIFAR-10 Batch 2:  
current_cost :  0.74485
current_accuracy:  0.6318
Epoch 20, CIFAR-10 Batch 3:  
current_cost :  0.568388
current_accuracy:  0.634
Epoch 20, CIFAR-10 Batch 4:  
current_cost :  0.791921
current_accuracy:  0.6268
Epoch 20, CIFAR-10 Batch 5:  
current_cost :  0.75488
current_accuracy:  0.642
Epoch 21, CIFAR-10 Batch 1:  
current_cost :  0.783937
current_accuracy:  0.633
Epoch 21, CIFAR-10 Batch 2:  
current_cost :  0.71918
current_accuracy:  0.636
Epoch 21, CIFAR-10 Batch 3:  
current_cost :  0.578806
current_accuracy:  0.6352
Epoch 21, CIFAR-10 Batch 4:  
current_cost :  0.822729
current_accuracy:  0.6348
Epoch 21, CIFAR-10 Batch 5:  
current_cost :  0.765705
current_accuracy:  0.6382
Epoch 22, CIFAR-10 Batch 1:  
current_cost :  0.774633
current_accuracy:  0.6246
Epoch 22, CIFAR-10 Batch 2:  
current_cost :  0.692477
current_accuracy:  0.6422
Epoch 22, CIFAR-10 Batch 3:  
current_cost :  0.586415
current_accuracy:  0.6348
Epoch 22, CIFAR-10 Batch 4:  
current_cost :  0.748878
current_accuracy:  0.6358
Epoch 22, CIFAR-10 Batch 5:  
current_cost :  0.687059
current_accuracy:  0.6472
Epoch 23, CIFAR-10 Batch 1:  
current_cost :  0.800009
current_accuracy:  0.6306
Epoch 23, CIFAR-10 Batch 2:  
current_cost :  0.699912
current_accuracy:  0.641
Epoch 23, CIFAR-10 Batch 3:  
current_cost :  0.535438
current_accuracy:  0.6358
Epoch 23, CIFAR-10 Batch 4:  
current_cost :  0.737472
current_accuracy:  0.6392
Epoch 23, CIFAR-10 Batch 5:  
current_cost :  0.7277
current_accuracy:  0.6418
Epoch 24, CIFAR-10 Batch 1:  
current_cost :  0.760413
current_accuracy:  0.6272
Epoch 24, CIFAR-10 Batch 2:  
current_cost :  0.676897
current_accuracy:  0.6396
Epoch 24, CIFAR-10 Batch 3:  
current_cost :  0.561815
current_accuracy:  0.643
Epoch 24, CIFAR-10 Batch 4:  
current_cost :  0.786603
current_accuracy:  0.6406
Epoch 24, CIFAR-10 Batch 5:  
current_cost :  0.661278
current_accuracy:  0.645
Epoch 25, CIFAR-10 Batch 1:  
current_cost :  0.740723
current_accuracy:  0.6452
Epoch 25, CIFAR-10 Batch 2:  
current_cost :  0.712472
current_accuracy:  0.6418
Epoch 25, CIFAR-10 Batch 3:  
current_cost :  0.516226
current_accuracy:  0.638
Epoch 25, CIFAR-10 Batch 4:  
current_cost :  0.703034
current_accuracy:  0.6402
Epoch 25, CIFAR-10 Batch 5:  
current_cost :  0.668344
current_accuracy:  0.641
Epoch 26, CIFAR-10 Batch 1:  
current_cost :  0.722358
current_accuracy:  0.6438
Epoch 26, CIFAR-10 Batch 2:  
current_cost :  0.663965
current_accuracy:  0.6478
Epoch 26, CIFAR-10 Batch 3:  
current_cost :  0.532167
current_accuracy:  0.646
Epoch 26, CIFAR-10 Batch 4:  
current_cost :  0.721041
current_accuracy:  0.6434
Epoch 26, CIFAR-10 Batch 5:  
current_cost :  0.67298
current_accuracy:  0.6414
Epoch 27, CIFAR-10 Batch 1:  
current_cost :  0.752919
current_accuracy:  0.639
Epoch 27, CIFAR-10 Batch 2:  
current_cost :  0.718785
current_accuracy:  0.6376
Epoch 27, CIFAR-10 Batch 3:  
current_cost :  0.520398
current_accuracy:  0.6456
Epoch 27, CIFAR-10 Batch 4:  
current_cost :  0.7168
current_accuracy:  0.6422
Epoch 27, CIFAR-10 Batch 5:  
current_cost :  0.635382
current_accuracy:  0.653
Epoch 28, CIFAR-10 Batch 1:  
current_cost :  0.688323
current_accuracy:  0.6444
Epoch 28, CIFAR-10 Batch 2:  
current_cost :  0.653784
current_accuracy:  0.6474
Epoch 28, CIFAR-10 Batch 3:  
current_cost :  0.523071
current_accuracy:  0.6498
Epoch 28, CIFAR-10 Batch 4:  
current_cost :  0.704962
current_accuracy:  0.6396
Epoch 28, CIFAR-10 Batch 5:  
current_cost :  0.647338
current_accuracy:  0.6494
Epoch 29, CIFAR-10 Batch 1:  
current_cost :  0.687822
current_accuracy:  0.6346
Epoch 29, CIFAR-10 Batch 2:  
current_cost :  0.634375
current_accuracy:  0.6448
Epoch 29, CIFAR-10 Batch 3:  
current_cost :  0.524452
current_accuracy:  0.65
Epoch 29, CIFAR-10 Batch 4:  
current_cost :  0.646026
current_accuracy:  0.6504
Epoch 29, CIFAR-10 Batch 5:  
current_cost :  0.648259
current_accuracy:  0.6462
Epoch 30, CIFAR-10 Batch 1:  
current_cost :  0.669909
current_accuracy:  0.6452
Epoch 30, CIFAR-10 Batch 2:  
current_cost :  0.635321
current_accuracy:  0.6486
Epoch 30, CIFAR-10 Batch 3:  
current_cost :  0.515409
current_accuracy:  0.6458
Epoch 30, CIFAR-10 Batch 4:  
current_cost :  0.647997
current_accuracy:  0.6466
Epoch 30, CIFAR-10 Batch 5:  
current_cost :  0.659281
current_accuracy:  0.6472
Epoch 31, CIFAR-10 Batch 1:  
current_cost :  0.700379
current_accuracy:  0.636
Epoch 31, CIFAR-10 Batch 2:  
current_cost :  0.610493
current_accuracy:  0.6398
Epoch 31, CIFAR-10 Batch 3:  
current_cost :  0.482675
current_accuracy:  0.6468
Epoch 31, CIFAR-10 Batch 4:  
current_cost :  0.675644
current_accuracy:  0.6486
Epoch 31, CIFAR-10 Batch 5:  
current_cost :  0.63472
current_accuracy:  0.6398
Epoch 32, CIFAR-10 Batch 1:  
current_cost :  0.718259
current_accuracy:  0.6372
Epoch 32, CIFAR-10 Batch 2:  
current_cost :  0.612693
current_accuracy:  0.646
Epoch 32, CIFAR-10 Batch 3:  
current_cost :  0.440626
current_accuracy:  0.6528
Epoch 32, CIFAR-10 Batch 4:  
current_cost :  0.631377
current_accuracy:  0.6504
Epoch 32, CIFAR-10 Batch 5:  
current_cost :  0.626941
current_accuracy:  0.65
Epoch 33, CIFAR-10 Batch 1:  
current_cost :  0.656243
current_accuracy:  0.6462
Epoch 33, CIFAR-10 Batch 2:  
current_cost :  0.608631
current_accuracy:  0.649
Epoch 33, CIFAR-10 Batch 3:  
current_cost :  0.485276
current_accuracy:  0.6396
Epoch 33, CIFAR-10 Batch 4:  
current_cost :  0.626196
current_accuracy:  0.6564
Epoch 33, CIFAR-10 Batch 5:  
current_cost :  0.628427
current_accuracy:  0.6466
Epoch 34, CIFAR-10 Batch 1:  
current_cost :  0.660843
current_accuracy:  0.6484
Epoch 34, CIFAR-10 Batch 2:  
current_cost :  0.56503
current_accuracy:  0.6452
Epoch 34, CIFAR-10 Batch 3:  
current_cost :  0.456833
current_accuracy:  0.6496
Epoch 34, CIFAR-10 Batch 4:  
current_cost :  0.623431
current_accuracy:  0.6506
Epoch 34, CIFAR-10 Batch 5:  
current_cost :  0.611878
current_accuracy:  0.6522
Epoch 35, CIFAR-10 Batch 1:  
current_cost :  0.65357
current_accuracy:  0.6472
Epoch 35, CIFAR-10 Batch 2:  
current_cost :  0.65632
current_accuracy:  0.6446
Epoch 35, CIFAR-10 Batch 3:  
current_cost :  0.450064
current_accuracy:  0.6526
Epoch 35, CIFAR-10 Batch 4:  
current_cost :  0.589446
current_accuracy:  0.6492
Epoch 35, CIFAR-10 Batch 5:  
current_cost :  0.592372
current_accuracy:  0.649
Epoch 36, CIFAR-10 Batch 1:  
current_cost :  0.644754
current_accuracy:  0.6424
Epoch 36, CIFAR-10 Batch 2:  
current_cost :  0.583844
current_accuracy:  0.651
Epoch 36, CIFAR-10 Batch 3:  
current_cost :  0.426635
current_accuracy:  0.6486
Epoch 36, CIFAR-10 Batch 4:  
current_cost :  0.605543
current_accuracy:  0.6412
Epoch 36, CIFAR-10 Batch 5:  
current_cost :  0.569255
current_accuracy:  0.6522
Epoch 37, CIFAR-10 Batch 1:  
current_cost :  0.61583
current_accuracy:  0.6512
Epoch 37, CIFAR-10 Batch 2:  
current_cost :  0.584018
current_accuracy:  0.655
Epoch 37, CIFAR-10 Batch 3:  
current_cost :  0.436444
current_accuracy:  0.6432
Epoch 37, CIFAR-10 Batch 4:  
current_cost :  0.590919
current_accuracy:  0.6486
Epoch 37, CIFAR-10 Batch 5:  
current_cost :  0.574483
current_accuracy:  0.6452
Epoch 38, CIFAR-10 Batch 1:  
current_cost :  0.616645
current_accuracy:  0.641
Epoch 38, CIFAR-10 Batch 2:  
current_cost :  0.554783
current_accuracy:  0.6512
Epoch 38, CIFAR-10 Batch 3:  
current_cost :  0.457029
current_accuracy:  0.6526
Epoch 38, CIFAR-10 Batch 4:  
current_cost :  0.620848
current_accuracy:  0.6474
Epoch 38, CIFAR-10 Batch 5:  
current_cost :  0.616763
current_accuracy:  0.634
Epoch 39, CIFAR-10 Batch 1:  
current_cost :  0.621337
current_accuracy:  0.6468
Epoch 39, CIFAR-10 Batch 2:  
current_cost :  0.564807
current_accuracy:  0.645
Epoch 39, CIFAR-10 Batch 3:  
current_cost :  0.418509
current_accuracy:  0.6548
Epoch 39, CIFAR-10 Batch 4:  
current_cost :  0.548557
current_accuracy:  0.652
Epoch 39, CIFAR-10 Batch 5:  
current_cost :  0.633875
current_accuracy:  0.6296
Epoch 40, CIFAR-10 Batch 1:  
current_cost :  0.622132
current_accuracy:  0.6396
Epoch 40, CIFAR-10 Batch 2:  
current_cost :  0.555568
current_accuracy:  0.65
Epoch 40, CIFAR-10 Batch 3:  
current_cost :  0.438921
current_accuracy:  0.6496
Epoch 40, CIFAR-10 Batch 4:  
current_cost :  0.548183
current_accuracy:  0.6526
Epoch 40, CIFAR-10 Batch 5:  
current_cost :  0.571244
current_accuracy:  0.651
Epoch 41, CIFAR-10 Batch 1:  
current_cost :  0.608369
current_accuracy:  0.64
Epoch 41, CIFAR-10 Batch 2:  
current_cost :  0.552734
current_accuracy:  0.6466
Epoch 41, CIFAR-10 Batch 3:  
current_cost :  0.417883
current_accuracy:  0.6484
Epoch 41, CIFAR-10 Batch 4:  
current_cost :  0.543928
current_accuracy:  0.645
Epoch 41, CIFAR-10 Batch 5:  
current_cost :  0.543326
current_accuracy:  0.648
Epoch 42, CIFAR-10 Batch 1:  
current_cost :  0.613289
current_accuracy:  0.6404
Epoch 42, CIFAR-10 Batch 2:  
current_cost :  0.556369
current_accuracy:  0.6486
Epoch 42, CIFAR-10 Batch 3:  
current_cost :  0.441199
current_accuracy:  0.6512
Epoch 42, CIFAR-10 Batch 4:  
current_cost :  0.553578
current_accuracy:  0.6398
Epoch 42, CIFAR-10 Batch 5:  
current_cost :  0.546975
current_accuracy:  0.6376
Epoch 43, CIFAR-10 Batch 1:  
current_cost :  0.558543
current_accuracy:  0.646
Epoch 43, CIFAR-10 Batch 2:  
current_cost :  0.518235
current_accuracy:  0.6492
Epoch 43, CIFAR-10 Batch 3:  
current_cost :  0.431485
current_accuracy:  0.6542
Epoch 43, CIFAR-10 Batch 4:  
current_cost :  0.53318
current_accuracy:  0.6502
Epoch 43, CIFAR-10 Batch 5:  
current_cost :  0.542653
current_accuracy:  0.6452
Epoch 44, CIFAR-10 Batch 1:  
current_cost :  0.58055
current_accuracy:  0.6458
Epoch 44, CIFAR-10 Batch 2:  
current_cost :  0.53559
current_accuracy:  0.6476
Epoch 44, CIFAR-10 Batch 3:  
current_cost :  0.425924
current_accuracy:  0.6488
Epoch 44, CIFAR-10 Batch 4:  
current_cost :  0.576895
current_accuracy:  0.6446
Epoch 44, CIFAR-10 Batch 5:  
current_cost :  0.537346
current_accuracy:  0.6422
Epoch 45, CIFAR-10 Batch 1:  
current_cost :  0.532233
current_accuracy:  0.6452
Epoch 45, CIFAR-10 Batch 2:  
current_cost :  0.517129
current_accuracy:  0.6486
Epoch 45, CIFAR-10 Batch 3:  
current_cost :  0.418052
current_accuracy:  0.649
Epoch 45, CIFAR-10 Batch 4:  
current_cost :  0.513611
current_accuracy:  0.6504
Epoch 45, CIFAR-10 Batch 5:  
current_cost :  0.531381
current_accuracy:  0.6454
Epoch 46, CIFAR-10 Batch 1:  
current_cost :  0.593829
current_accuracy:  0.6502
Epoch 46, CIFAR-10 Batch 2:  
current_cost :  0.503298
current_accuracy:  0.6496
Epoch 46, CIFAR-10 Batch 3:  
current_cost :  0.403824
current_accuracy:  0.655
Epoch 46, CIFAR-10 Batch 4:  
current_cost :  0.530741
current_accuracy:  0.65
Epoch 46, CIFAR-10 Batch 5:  
current_cost :  0.518665
current_accuracy:  0.6548
Epoch 47, CIFAR-10 Batch 1:  
current_cost :  0.568085
current_accuracy:  0.6454
Epoch 47, CIFAR-10 Batch 2:  
current_cost :  0.515321
current_accuracy:  0.6464
Epoch 47, CIFAR-10 Batch 3:  
current_cost :  0.387766
current_accuracy:  0.6502
Epoch 47, CIFAR-10 Batch 4:  
current_cost :  0.510773
current_accuracy:  0.6502
Epoch 47, CIFAR-10 Batch 5:  
current_cost :  0.49171
current_accuracy:  0.6474
Epoch 48, CIFAR-10 Batch 1:  
current_cost :  0.579809
current_accuracy:  0.6458
Epoch 48, CIFAR-10 Batch 2:  
current_cost :  0.499665
current_accuracy:  0.6514
Epoch 48, CIFAR-10 Batch 3:  
current_cost :  0.363053
current_accuracy:  0.6472
Epoch 48, CIFAR-10 Batch 4:  
current_cost :  0.53042
current_accuracy:  0.647
Epoch 48, CIFAR-10 Batch 5:  
current_cost :  0.527523
current_accuracy:  0.6492
Epoch 49, CIFAR-10 Batch 1:  
current_cost :  0.552001
current_accuracy:  0.6466
Epoch 49, CIFAR-10 Batch 2:  
current_cost :  0.49293
current_accuracy:  0.651
Epoch 49, CIFAR-10 Batch 3:  
current_cost :  0.380937
current_accuracy:  0.6544
Epoch 49, CIFAR-10 Batch 4:  
current_cost :  0.466807
current_accuracy:  0.6546
Epoch 49, CIFAR-10 Batch 5:  
current_cost :  0.487355
current_accuracy:  0.6484
Epoch 50, CIFAR-10 Batch 1:  
current_cost :  0.511247
current_accuracy:  0.6442
Epoch 50, CIFAR-10 Batch 2:  
current_cost :  0.499164
current_accuracy:  0.649
Epoch 50, CIFAR-10 Batch 3:  
current_cost :  0.372186
current_accuracy:  0.6476
Epoch 50, CIFAR-10 Batch 4:  
current_cost :  0.529251
current_accuracy:  0.6366
Epoch 50, CIFAR-10 Batch 5:  
current_cost :  0.473123
current_accuracy:  0.6518
Epoch 51, CIFAR-10 Batch 1:  
current_cost :  0.562103
current_accuracy:  0.6338
Epoch 51, CIFAR-10 Batch 2:  
current_cost :  0.485127
current_accuracy:  0.6514
Epoch 51, CIFAR-10 Batch 3:  
current_cost :  0.412703
current_accuracy:  0.6478
Epoch 51, CIFAR-10 Batch 4:  
current_cost :  0.49307
current_accuracy:  0.644
Epoch 51, CIFAR-10 Batch 5:  
current_cost :  0.495441
current_accuracy:  0.6478
Epoch 52, CIFAR-10 Batch 1:  
current_cost :  0.610701
current_accuracy:  0.6378
Epoch 52, CIFAR-10 Batch 2:  
current_cost :  0.515531
current_accuracy:  0.6504
Epoch 52, CIFAR-10 Batch 3:  
current_cost :  0.351182
current_accuracy:  0.6426
Epoch 52, CIFAR-10 Batch 4:  
current_cost :  0.481376
current_accuracy:  0.654
Epoch 52, CIFAR-10 Batch 5:  
current_cost :  0.488416
current_accuracy:  0.6504
Epoch 53, CIFAR-10 Batch 1:  
current_cost :  0.580617
current_accuracy:  0.6454
Epoch 53, CIFAR-10 Batch 2:  
current_cost :  0.498544
current_accuracy:  0.6478
Epoch 53, CIFAR-10 Batch 3:  
current_cost :  0.365247
current_accuracy:  0.6392
Epoch 53, CIFAR-10 Batch 4:  
current_cost :  0.492424
current_accuracy:  0.6544
Epoch 53, CIFAR-10 Batch 5:  
current_cost :  0.485281
current_accuracy:  0.6496
Epoch 54, CIFAR-10 Batch 1:  
current_cost :  0.543065
current_accuracy:  0.649
Epoch 54, CIFAR-10 Batch 2:  
current_cost :  0.497769
current_accuracy:  0.6524
Epoch 54, CIFAR-10 Batch 3:  
current_cost :  0.388272
current_accuracy:  0.6438
Epoch 54, CIFAR-10 Batch 4:  
current_cost :  0.480196
current_accuracy:  0.6522
Epoch 54, CIFAR-10 Batch 5:  
current_cost :  0.461599
current_accuracy:  0.6462
Epoch 55, CIFAR-10 Batch 1:  
current_cost :  0.538046
current_accuracy:  0.6454
Epoch 55, CIFAR-10 Batch 2:  
current_cost :  0.483653
current_accuracy:  0.6478
Epoch 55, CIFAR-10 Batch 3:  
current_cost :  0.382436
current_accuracy:  0.6394
Epoch 55, CIFAR-10 Batch 4:  
current_cost :  0.502726
current_accuracy:  0.6464
Epoch 55, CIFAR-10 Batch 5:  
current_cost :  0.44709
current_accuracy:  0.648
Epoch 56, CIFAR-10 Batch 1:  
current_cost :  0.516505
current_accuracy:  0.6454
Epoch 56, CIFAR-10 Batch 2:  
current_cost :  0.501993
current_accuracy:  0.6466
Epoch 56, CIFAR-10 Batch 3:  
current_cost :  0.363497
current_accuracy:  0.641
Epoch 56, CIFAR-10 Batch 4:  
current_cost :  0.462127
current_accuracy:  0.6558
Epoch 56, CIFAR-10 Batch 5:  
current_cost :  0.456481
current_accuracy:  0.6456
Epoch 57, CIFAR-10 Batch 1:  
current_cost :  0.528256
current_accuracy:  0.6482
Epoch 57, CIFAR-10 Batch 2:  
current_cost :  0.522013
current_accuracy:  0.6384
Epoch 57, CIFAR-10 Batch 3:  
current_cost :  0.347648
current_accuracy:  0.6528
Epoch 57, CIFAR-10 Batch 4:  
current_cost :  0.447799
current_accuracy:  0.6524
Epoch 57, CIFAR-10 Batch 5:  
current_cost :  0.431012
current_accuracy:  0.6506
Epoch 58, CIFAR-10 Batch 1:  
current_cost :  0.525088
current_accuracy:  0.651
Epoch 58, CIFAR-10 Batch 2:  
current_cost :  0.47162
current_accuracy:  0.6468
Epoch 58, CIFAR-10 Batch 3:  
current_cost :  0.341104
current_accuracy:  0.6486
Epoch 58, CIFAR-10 Batch 4:  
current_cost :  0.427437
current_accuracy:  0.6466
Epoch 58, CIFAR-10 Batch 5:  
current_cost :  0.445342
current_accuracy:  0.6468
Epoch 59, CIFAR-10 Batch 1:  
current_cost :  0.51955
current_accuracy:  0.6452
Epoch 59, CIFAR-10 Batch 2:  
current_cost :  0.460348
current_accuracy:  0.6536
Epoch 59, CIFAR-10 Batch 3:  
current_cost :  0.331058
current_accuracy:  0.6446
Epoch 59, CIFAR-10 Batch 4:  
current_cost :  0.454785
current_accuracy:  0.648
Epoch 59, CIFAR-10 Batch 5:  
current_cost :  0.451275
current_accuracy:  0.6494

[... per-batch output for Epochs 60-147 elided: validation accuracy stays in the 0.60-0.65 range throughout, while the reported cost gradually falls from roughly 0.5 to 0.2 depending on the batch ...]

Epoch 148, CIFAR-10 Batch 1:  
current_cost :  0.383021
current_accuracy:  0.6304
Epoch 148, CIFAR-10 Batch 2:  
current_cost :  0.322825
current_accuracy:  0.6012
Epoch 148, CIFAR-10 Batch 3:  
current_cost :  0.19734
current_accuracy:  0.6256
Epoch 148, CIFAR-10 Batch 4:  
current_cost :  0.280089
current_accuracy:  0.6278
Epoch 148, CIFAR-10 Batch 5:  
current_cost :  0.265124
current_accuracy:  0.646
Epoch 149, CIFAR-10 Batch 1:  
current_cost :  0.401717
current_accuracy:  0.6292
Epoch 149, CIFAR-10 Batch 2:  
current_cost :  0.294449
current_accuracy:  0.6288
Epoch 149, CIFAR-10 Batch 3:  
current_cost :  0.20235
current_accuracy:  0.621
Epoch 149, CIFAR-10 Batch 4:  
current_cost :  0.279117
current_accuracy:  0.6402
Epoch 149, CIFAR-10 Batch 5:  
current_cost :  0.258466
current_accuracy:  0.6384
Epoch 150, CIFAR-10 Batch 1:  
current_cost :  0.348588
current_accuracy:  0.6368
Epoch 150, CIFAR-10 Batch 2:  
current_cost :  0.321051
current_accuracy:  0.6204
Epoch 150, CIFAR-10 Batch 3:  
current_cost :  0.189593
current_accuracy:  0.6242
Epoch 150, CIFAR-10 Batch 4:  
current_cost :  0.261465
current_accuracy:  0.635
Epoch 150, CIFAR-10 Batch 5:  
current_cost :  0.247692
current_accuracy:  0.6352
Epoch 151, CIFAR-10 Batch 1:  
current_cost :  0.367733
current_accuracy:  0.6236
Epoch 151, CIFAR-10 Batch 2:  
current_cost :  0.301897
current_accuracy:  0.6216
Epoch 151, CIFAR-10 Batch 3:  
current_cost :  0.193767
current_accuracy:  0.6292
Epoch 151, CIFAR-10 Batch 4:  
current_cost :  0.25345
current_accuracy:  0.6346
Epoch 151, CIFAR-10 Batch 5:  
current_cost :  0.243253
current_accuracy:  0.635
Epoch 152, CIFAR-10 Batch 1:  
current_cost :  0.367326
current_accuracy:  0.6336
Epoch 152, CIFAR-10 Batch 2:  
current_cost :  0.303042
current_accuracy:  0.6224
Epoch 152, CIFAR-10 Batch 3:  
current_cost :  0.205711
current_accuracy:  0.6306
Epoch 152, CIFAR-10 Batch 4:  
current_cost :  0.25378
current_accuracy:  0.6302
Epoch 152, CIFAR-10 Batch 5:  
current_cost :  0.270221
current_accuracy:  0.6348
Epoch 153, CIFAR-10 Batch 1:  
current_cost :  0.35686
current_accuracy:  0.6372
Epoch 153, CIFAR-10 Batch 2:  
current_cost :  0.307993
current_accuracy:  0.6178
Epoch 153, CIFAR-10 Batch 3:  
current_cost :  0.202855
current_accuracy:  0.6208
Epoch 153, CIFAR-10 Batch 4:  
current_cost :  0.263779
current_accuracy:  0.6392
Epoch 153, CIFAR-10 Batch 5:  
current_cost :  0.284905
current_accuracy:  0.6312
Epoch 154, CIFAR-10 Batch 1:  
current_cost :  0.32728
current_accuracy:  0.6332
Epoch 154, CIFAR-10 Batch 2:  
current_cost :  0.314679
current_accuracy:  0.623
Epoch 154, CIFAR-10 Batch 3:  
current_cost :  0.183475
current_accuracy:  0.6338
Epoch 154, CIFAR-10 Batch 4:  
current_cost :  0.241748
current_accuracy:  0.632
Epoch 154, CIFAR-10 Batch 5:  
current_cost :  0.253476
current_accuracy:  0.6358
Epoch 155, CIFAR-10 Batch 1:  
current_cost :  0.33879
current_accuracy:  0.6332
Epoch 155, CIFAR-10 Batch 2:  
current_cost :  0.32279
current_accuracy:  0.6176
Epoch 155, CIFAR-10 Batch 3:  
current_cost :  0.215393
current_accuracy:  0.6362
Epoch 155, CIFAR-10 Batch 4:  
current_cost :  0.238919
current_accuracy:  0.633
Epoch 155, CIFAR-10 Batch 5:  
current_cost :  0.234136
current_accuracy:  0.6422
Epoch 156, CIFAR-10 Batch 1:  
current_cost :  0.356118
current_accuracy:  0.639
Epoch 156, CIFAR-10 Batch 2:  
current_cost :  0.31257
current_accuracy:  0.6214
Epoch 156, CIFAR-10 Batch 3:  
current_cost :  0.213014
current_accuracy:  0.6256
Epoch 156, CIFAR-10 Batch 4:  
current_cost :  0.26407
current_accuracy:  0.633
Epoch 156, CIFAR-10 Batch 5:  
current_cost :  0.271585
current_accuracy:  0.6354
Epoch 157, CIFAR-10 Batch 1:  
current_cost :  0.323553
current_accuracy:  0.6312
Epoch 157, CIFAR-10 Batch 2:  
current_cost :  0.312123
current_accuracy:  0.6278
Epoch 157, CIFAR-10 Batch 3:  
current_cost :  0.216371
current_accuracy:  0.6324
Epoch 157, CIFAR-10 Batch 4:  
current_cost :  0.271461
current_accuracy:  0.6324
Epoch 157, CIFAR-10 Batch 5:  
current_cost :  0.239041
current_accuracy:  0.6346
Epoch 158, CIFAR-10 Batch 1:  
current_cost :  0.351255
current_accuracy:  0.6302
Epoch 158, CIFAR-10 Batch 2:  
current_cost :  0.323767
current_accuracy:  0.61
Epoch 158, CIFAR-10 Batch 3:  
current_cost :  0.19202
current_accuracy:  0.6338
Epoch 158, CIFAR-10 Batch 4:  
current_cost :  0.273968
current_accuracy:  0.631
Epoch 158, CIFAR-10 Batch 5:  
current_cost :  0.248842
current_accuracy:  0.6394
Epoch 159, CIFAR-10 Batch 1:  
current_cost :  0.353018
current_accuracy:  0.6294
Epoch 159, CIFAR-10 Batch 2:  
current_cost :  0.275255
current_accuracy:  0.6158
Epoch 159, CIFAR-10 Batch 3:  
current_cost :  0.195581
current_accuracy:  0.6238
Epoch 159, CIFAR-10 Batch 4:  
current_cost :  0.235948
current_accuracy:  0.632
Epoch 159, CIFAR-10 Batch 5:  
current_cost :  0.243111
current_accuracy:  0.627
Epoch 160, CIFAR-10 Batch 1:  
current_cost :  0.354404
current_accuracy:  0.631
Epoch 160, CIFAR-10 Batch 2:  
current_cost :  0.346126
current_accuracy:  0.6122
Epoch 160, CIFAR-10 Batch 3:  
current_cost :  0.209413
current_accuracy:  0.6168
Epoch 160, CIFAR-10 Batch 4:  
current_cost :  0.238443
current_accuracy:  0.634
Epoch 160, CIFAR-10 Batch 5:  
current_cost :  0.224822
current_accuracy:  0.6404
Epoch 161, CIFAR-10 Batch 1:  
current_cost :  0.333649
current_accuracy:  0.6322
Epoch 161, CIFAR-10 Batch 2:  
current_cost :  0.304671
current_accuracy:  0.6224
Epoch 161, CIFAR-10 Batch 3:  
current_cost :  0.177291
current_accuracy:  0.624
Epoch 161, CIFAR-10 Batch 4:  
current_cost :  0.27828
current_accuracy:  0.6306
Epoch 161, CIFAR-10 Batch 5:  
current_cost :  0.234264
current_accuracy:  0.6412
Epoch 162, CIFAR-10 Batch 1:  
current_cost :  0.322856
current_accuracy:  0.6324
Epoch 162, CIFAR-10 Batch 2:  
current_cost :  0.293454
current_accuracy:  0.6186
Epoch 162, CIFAR-10 Batch 3:  
current_cost :  0.186625
current_accuracy:  0.6236
Epoch 162, CIFAR-10 Batch 4:  
current_cost :  0.303616
current_accuracy:  0.6314
Epoch 162, CIFAR-10 Batch 5:  
current_cost :  0.259704
current_accuracy:  0.6356
Epoch 163, CIFAR-10 Batch 1:  
current_cost :  0.307548
current_accuracy:  0.6314
Epoch 163, CIFAR-10 Batch 2:  
current_cost :  0.264877
current_accuracy:  0.6198
Epoch 163, CIFAR-10 Batch 3:  
current_cost :  0.206459
current_accuracy:  0.6228
Epoch 163, CIFAR-10 Batch 4:  
current_cost :  0.235925
current_accuracy:  0.6322
Epoch 163, CIFAR-10 Batch 5:  
current_cost :  0.256213
current_accuracy:  0.6328
Epoch 164, CIFAR-10 Batch 1:  
current_cost :  0.308553
current_accuracy:  0.6272
Epoch 164, CIFAR-10 Batch 2:  
current_cost :  0.306867
current_accuracy:  0.6196
Epoch 164, CIFAR-10 Batch 3:  
current_cost :  0.203921
current_accuracy:  0.6226
Epoch 164, CIFAR-10 Batch 4:  
current_cost :  0.272709
current_accuracy:  0.6314
Epoch 164, CIFAR-10 Batch 5:  
current_cost :  0.256568
current_accuracy:  0.6372
Epoch 165, CIFAR-10 Batch 1:  
current_cost :  0.309035
current_accuracy:  0.6356
Epoch 165, CIFAR-10 Batch 2:  
current_cost :  0.315656
current_accuracy:  0.621
Epoch 165, CIFAR-10 Batch 3:  
current_cost :  0.197476
current_accuracy:  0.6214
Epoch 165, CIFAR-10 Batch 4:  
current_cost :  0.239289
current_accuracy:  0.6282
Epoch 165, CIFAR-10 Batch 5:  
current_cost :  0.226225
current_accuracy:  0.6396
Epoch 166, CIFAR-10 Batch 1:  
current_cost :  0.344105
current_accuracy:  0.6332
Epoch 166, CIFAR-10 Batch 2:  
current_cost :  0.290087
current_accuracy:  0.6162
Epoch 166, CIFAR-10 Batch 3:  
current_cost :  0.18191
current_accuracy:  0.618
Epoch 166, CIFAR-10 Batch 4:  
current_cost :  0.265246
current_accuracy:  0.6314
Epoch 166, CIFAR-10 Batch 5:  
current_cost :  0.264283
current_accuracy:  0.636
Epoch 167, CIFAR-10 Batch 1:  
current_cost :  0.294993
current_accuracy:  0.6352
Epoch 167, CIFAR-10 Batch 2:  
current_cost :  0.320186
current_accuracy:  0.6276
Epoch 167, CIFAR-10 Batch 3:  
current_cost :  0.208694
current_accuracy:  0.6266
Epoch 167, CIFAR-10 Batch 4:  
current_cost :  0.249908
current_accuracy:  0.63
Epoch 167, CIFAR-10 Batch 5:  
current_cost :  0.239509
current_accuracy:  0.6346
Epoch 168, CIFAR-10 Batch 1:  
current_cost :  0.380543
current_accuracy:  0.6346
Epoch 168, CIFAR-10 Batch 2:  
current_cost :  0.310555
current_accuracy:  0.623
Epoch 168, CIFAR-10 Batch 3:  
current_cost :  0.182108
current_accuracy:  0.6128
Epoch 168, CIFAR-10 Batch 4:  
current_cost :  0.240541
current_accuracy:  0.6272
Epoch 168, CIFAR-10 Batch 5:  
current_cost :  0.266229
current_accuracy:  0.6344
Epoch 169, CIFAR-10 Batch 1:  
current_cost :  0.370797
current_accuracy:  0.628
Epoch 169, CIFAR-10 Batch 2:  
current_cost :  0.301035
current_accuracy:  0.6192
Epoch 169, CIFAR-10 Batch 3:  
current_cost :  0.200823
current_accuracy:  0.6202
Epoch 169, CIFAR-10 Batch 4:  
current_cost :  0.243318
current_accuracy:  0.6342
Epoch 169, CIFAR-10 Batch 5:  
current_cost :  0.263326
current_accuracy:  0.6334
Epoch 170, CIFAR-10 Batch 1:  
current_cost :  0.358615
current_accuracy:  0.6276
Epoch 170, CIFAR-10 Batch 2:  
current_cost :  0.307101
current_accuracy:  0.6116
Epoch 170, CIFAR-10 Batch 3:  
current_cost :  0.189659
current_accuracy:  0.6142
Epoch 170, CIFAR-10 Batch 4:  
current_cost :  0.211579
current_accuracy:  0.6296
Epoch 170, CIFAR-10 Batch 5:  
current_cost :  0.261208
current_accuracy:  0.635
Epoch 171, CIFAR-10 Batch 1:  
current_cost :  0.322238
current_accuracy:  0.6312
Epoch 171, CIFAR-10 Batch 2:  
current_cost :  0.317284
current_accuracy:  0.6176
Epoch 171, CIFAR-10 Batch 3:  
current_cost :  0.178694
current_accuracy:  0.6298
Epoch 171, CIFAR-10 Batch 4:  
current_cost :  0.241134
current_accuracy:  0.636
Epoch 171, CIFAR-10 Batch 5:  
current_cost :  0.24203
current_accuracy:  0.636
Epoch 172, CIFAR-10 Batch 1:  
current_cost :  0.317449
current_accuracy:  0.6286
Epoch 172, CIFAR-10 Batch 2:  
current_cost :  0.298843
current_accuracy:  0.6168
Epoch 172, CIFAR-10 Batch 3:  
current_cost :  0.199208
current_accuracy:  0.6174
Epoch 172, CIFAR-10 Batch 4:  
current_cost :  0.251184
current_accuracy:  0.638
Epoch 172, CIFAR-10 Batch 5:  
current_cost :  0.251356
current_accuracy:  0.6314
Epoch 173, CIFAR-10 Batch 1:  
current_cost :  0.328227
current_accuracy:  0.6292
Epoch 173, CIFAR-10 Batch 2:  
current_cost :  0.286842
current_accuracy:  0.6172
Epoch 173, CIFAR-10 Batch 3:  
current_cost :  0.190384
current_accuracy:  0.6192
Epoch 173, CIFAR-10 Batch 4:  
current_cost :  0.229443
current_accuracy:  0.6306
Epoch 173, CIFAR-10 Batch 5:  
current_cost :  0.231803
current_accuracy:  0.6302
Epoch 174, CIFAR-10 Batch 1:  
current_cost :  0.29456
current_accuracy:  0.6298
Epoch 174, CIFAR-10 Batch 2:  
current_cost :  0.319478
current_accuracy:  0.6106
Epoch 174, CIFAR-10 Batch 3:  
current_cost :  0.173817
current_accuracy:  0.6296
Epoch 174, CIFAR-10 Batch 4:  
current_cost :  0.283285
current_accuracy:  0.6302
Epoch 174, CIFAR-10 Batch 5:  
current_cost :  0.234326
current_accuracy:  0.6346
Epoch 175, CIFAR-10 Batch 1:  
current_cost :  0.325663
current_accuracy:  0.626
Epoch 175, CIFAR-10 Batch 2:  
current_cost :  0.309267
current_accuracy:  0.6102
Epoch 175, CIFAR-10 Batch 3:  
current_cost :  0.172506
current_accuracy:  0.6154
Epoch 175, CIFAR-10 Batch 4:  
current_cost :  0.248648
current_accuracy:  0.6318
Epoch 175, CIFAR-10 Batch 5:  
current_cost :  0.258475
current_accuracy:  0.6402
Epoch 176, CIFAR-10 Batch 1:  
current_cost :  0.315668
current_accuracy:  0.6286
Epoch 176, CIFAR-10 Batch 2:  
current_cost :  0.283596
current_accuracy:  0.6114
Epoch 176, CIFAR-10 Batch 3:  
current_cost :  0.176002
current_accuracy:  0.6106
Epoch 176, CIFAR-10 Batch 4:  
current_cost :  0.254969
current_accuracy:  0.6326
Epoch 176, CIFAR-10 Batch 5:  
current_cost :  0.229435
current_accuracy:  0.636
Epoch 177, CIFAR-10 Batch 1:  
current_cost :  0.307132
current_accuracy:  0.629
Epoch 177, CIFAR-10 Batch 2:  
current_cost :  0.290407
current_accuracy:  0.6162
Epoch 177, CIFAR-10 Batch 3:  
current_cost :  0.175762
current_accuracy:  0.6166
Epoch 177, CIFAR-10 Batch 4:  
current_cost :  0.240632
current_accuracy:  0.6316
Epoch 177, CIFAR-10 Batch 5:  
current_cost :  0.242585
current_accuracy:  0.638
Epoch 178, CIFAR-10 Batch 1:  
current_cost :  0.334613
current_accuracy:  0.6236
Epoch 178, CIFAR-10 Batch 2:  
current_cost :  0.276916
current_accuracy:  0.6158
Epoch 178, CIFAR-10 Batch 3:  
current_cost :  0.179816
current_accuracy:  0.6092
Epoch 178, CIFAR-10 Batch 4:  
current_cost :  0.243579
current_accuracy:  0.6262
Epoch 178, CIFAR-10 Batch 5:  
current_cost :  0.249347
current_accuracy:  0.6322
Epoch 179, CIFAR-10 Batch 1:  
current_cost :  0.322149
current_accuracy:  0.6364
Epoch 179, CIFAR-10 Batch 2:  
current_cost :  0.274022
current_accuracy:  0.6202
Epoch 179, CIFAR-10 Batch 3:  
current_cost :  0.182295
current_accuracy:  0.6194
Epoch 179, CIFAR-10 Batch 4:  
current_cost :  0.235464
current_accuracy:  0.629
Epoch 179, CIFAR-10 Batch 5:  
current_cost :  0.262927
current_accuracy:  0.6338
Epoch 180, CIFAR-10 Batch 1:  
current_cost :  0.30725
current_accuracy:  0.6274
Epoch 180, CIFAR-10 Batch 2:  
current_cost :  0.312349
current_accuracy:  0.6106
Epoch 180, CIFAR-10 Batch 3:  
current_cost :  0.189942
current_accuracy:  0.628
Epoch 180, CIFAR-10 Batch 4:  
current_cost :  0.228219
current_accuracy:  0.6228
Epoch 180, CIFAR-10 Batch 5:  
current_cost :  0.22573
current_accuracy:  0.6342
Epoch 181, CIFAR-10 Batch 1:  
current_cost :  0.337072
current_accuracy:  0.6316
Epoch 181, CIFAR-10 Batch 2:  
current_cost :  0.318679
current_accuracy:  0.6078
Epoch 181, CIFAR-10 Batch 3:  
current_cost :  0.198296
current_accuracy:  0.6194
Epoch 181, CIFAR-10 Batch 4:  
current_cost :  0.258338
current_accuracy:  0.6284
Epoch 181, CIFAR-10 Batch 5:  
current_cost :  0.247724
current_accuracy:  0.6342
Epoch 182, CIFAR-10 Batch 1:  
current_cost :  0.311948
current_accuracy:  0.6314
Epoch 182, CIFAR-10 Batch 2:  
current_cost :  0.294908
current_accuracy:  0.613
Epoch 182, CIFAR-10 Batch 3:  
current_cost :  0.180197
current_accuracy:  0.631
Epoch 182, CIFAR-10 Batch 4:  
current_cost :  0.248519
current_accuracy:  0.6208
Epoch 182, CIFAR-10 Batch 5:  
current_cost :  0.246745
current_accuracy:  0.6354
Epoch 183, CIFAR-10 Batch 1:  
current_cost :  0.301898
current_accuracy:  0.632
Epoch 183, CIFAR-10 Batch 2:  
current_cost :  0.319291
current_accuracy:  0.611
Epoch 183, CIFAR-10 Batch 3:  
current_cost :  0.184539
current_accuracy:  0.6092
Epoch 183, CIFAR-10 Batch 4:  
current_cost :  0.218413
current_accuracy:  0.6326
Epoch 183, CIFAR-10 Batch 5:  
current_cost :  0.225309
current_accuracy:  0.636
Epoch 184, CIFAR-10 Batch 1:  
current_cost :  0.302295
current_accuracy:  0.6304
Epoch 184, CIFAR-10 Batch 2:  
current_cost :  0.32136
current_accuracy:  0.612
Epoch 184, CIFAR-10 Batch 3:  
current_cost :  0.183094
current_accuracy:  0.6172
Epoch 184, CIFAR-10 Batch 4:  
current_cost :  0.244955
current_accuracy:  0.6246
Epoch 184, CIFAR-10 Batch 5:  
current_cost :  0.230719
current_accuracy:  0.6318
Epoch 185, CIFAR-10 Batch 1:  
current_cost :  0.330976
current_accuracy:  0.6252
Epoch 185, CIFAR-10 Batch 2:  
current_cost :  0.273309
current_accuracy:  0.616
Epoch 185, CIFAR-10 Batch 3:  
current_cost :  0.167312
current_accuracy:  0.61
Epoch 185, CIFAR-10 Batch 4:  
current_cost :  0.228529
current_accuracy:  0.6354
Epoch 185, CIFAR-10 Batch 5:  
current_cost :  0.251408
current_accuracy:  0.6362
Epoch 186, CIFAR-10 Batch 1:  
current_cost :  0.33429
current_accuracy:  0.6318
Epoch 186, CIFAR-10 Batch 2:  
current_cost :  0.293194
current_accuracy:  0.61
Epoch 186, CIFAR-10 Batch 3:  
current_cost :  0.196305
current_accuracy:  0.6158
Epoch 186, CIFAR-10 Batch 4:  
current_cost :  0.264491
current_accuracy:  0.6282
Epoch 186, CIFAR-10 Batch 5:  
current_cost :  0.243763
current_accuracy:  0.633
Epoch 187, CIFAR-10 Batch 1:  
current_cost :  0.304216
current_accuracy:  0.6316
Epoch 187, CIFAR-10 Batch 2:  
current_cost :  0.279155
current_accuracy:  0.618
Epoch 187, CIFAR-10 Batch 3:  
current_cost :  0.179318
current_accuracy:  0.6306
Epoch 187, CIFAR-10 Batch 4:  
current_cost :  0.228502
current_accuracy:  0.6306
Epoch 187, CIFAR-10 Batch 5:  
current_cost :  0.248969
current_accuracy:  0.6358
Epoch 188, CIFAR-10 Batch 1:  
current_cost :  0.326126
current_accuracy:  0.6322
Epoch 188, CIFAR-10 Batch 2:  
current_cost :  0.298756
current_accuracy:  0.6182
Epoch 188, CIFAR-10 Batch 3:  
current_cost :  0.189839
current_accuracy:  0.62
Epoch 188, CIFAR-10 Batch 4:  
current_cost :  0.240499
current_accuracy:  0.6278
Epoch 188, CIFAR-10 Batch 5:  
current_cost :  0.252029
current_accuracy:  0.635
Epoch 189, CIFAR-10 Batch 1:  
current_cost :  0.291313
current_accuracy:  0.6326
Epoch 189, CIFAR-10 Batch 2:  
current_cost :  0.280319
current_accuracy:  0.615
Epoch 189, CIFAR-10 Batch 3:  
current_cost :  0.18134
current_accuracy:  0.6132
Epoch 189, CIFAR-10 Batch 4:  
current_cost :  0.226506
current_accuracy:  0.6324
Epoch 189, CIFAR-10 Batch 5:  
current_cost :  0.216625
current_accuracy:  0.6328
Epoch 190, CIFAR-10 Batch 1:  
current_cost :  0.313799
current_accuracy:  0.6294
Epoch 190, CIFAR-10 Batch 2:  
current_cost :  0.341207
current_accuracy:  0.6152
Epoch 190, CIFAR-10 Batch 3:  
current_cost :  0.182394
current_accuracy:  0.6208
Epoch 190, CIFAR-10 Batch 4:  
current_cost :  0.245197
current_accuracy:  0.6248
Epoch 190, CIFAR-10 Batch 5:  
current_cost :  0.247128
current_accuracy:  0.6268
Epoch 191, CIFAR-10 Batch 1:  
current_cost :  0.311498
current_accuracy:  0.6278
Epoch 191, CIFAR-10 Batch 2:  
current_cost :  0.323252
current_accuracy:  0.619
Epoch 191, CIFAR-10 Batch 3:  
current_cost :  0.18719
current_accuracy:  0.6274
Epoch 191, CIFAR-10 Batch 4:  
current_cost :  0.241273
current_accuracy:  0.6242
Epoch 191, CIFAR-10 Batch 5:  
current_cost :  0.264836
current_accuracy:  0.6294
Epoch 192, CIFAR-10 Batch 1:  
current_cost :  0.339119
current_accuracy:  0.6258
Epoch 192, CIFAR-10 Batch 2:  
current_cost :  0.327186
current_accuracy:  0.6164
Epoch 192, CIFAR-10 Batch 3:  
current_cost :  0.253832
current_accuracy:  0.6258
Epoch 192, CIFAR-10 Batch 4:  
current_cost :  0.2252
current_accuracy:  0.6208
Epoch 192, CIFAR-10 Batch 5:  
current_cost :  0.268589
current_accuracy:  0.6286
Epoch 193, CIFAR-10 Batch 1:  
current_cost :  0.324174
current_accuracy:  0.6252
Epoch 193, CIFAR-10 Batch 2:  
current_cost :  0.302837
current_accuracy:  0.6206
Epoch 193, CIFAR-10 Batch 3:  
current_cost :  0.189802
current_accuracy:  0.6222
Epoch 193, CIFAR-10 Batch 4:  
current_cost :  0.259403
current_accuracy:  0.6224
Epoch 193, CIFAR-10 Batch 5:  
current_cost :  0.23061
current_accuracy:  0.6334
Epoch 194, CIFAR-10 Batch 1:  
current_cost :  0.300025
current_accuracy:  0.6302
Epoch 194, CIFAR-10 Batch 2:  
current_cost :  0.306169
current_accuracy:  0.6156
Epoch 194, CIFAR-10 Batch 3:  
current_cost :  0.195116
current_accuracy:  0.6214
Epoch 194, CIFAR-10 Batch 4:  
current_cost :  0.240764
current_accuracy:  0.6218
Epoch 194, CIFAR-10 Batch 5:  
current_cost :  0.249824
current_accuracy:  0.6324
Epoch 195, CIFAR-10 Batch 1:  
current_cost :  0.320643
current_accuracy:  0.6274
Epoch 195, CIFAR-10 Batch 2:  
current_cost :  0.303224
current_accuracy:  0.6106
Epoch 195, CIFAR-10 Batch 3:  
current_cost :  0.177597
current_accuracy:  0.632
Epoch 195, CIFAR-10 Batch 4:  
current_cost :  0.264336
current_accuracy:  0.6204
Epoch 195, CIFAR-10 Batch 5:  
current_cost :  0.239076
current_accuracy:  0.6254
Epoch 196, CIFAR-10 Batch 1:  
current_cost :  0.31934
current_accuracy:  0.627
Epoch 196, CIFAR-10 Batch 2:  
current_cost :  0.30046
current_accuracy:  0.6206
Epoch 196, CIFAR-10 Batch 3:  
current_cost :  0.188533
current_accuracy:  0.6192
Epoch 196, CIFAR-10 Batch 4:  
current_cost :  0.252027
current_accuracy:  0.6244
Epoch 196, CIFAR-10 Batch 5:  
current_cost :  0.25863
current_accuracy:  0.6344
Epoch 197, CIFAR-10 Batch 1:  
current_cost :  0.300579
current_accuracy:  0.6272
Epoch 197, CIFAR-10 Batch 2:  
current_cost :  0.293811
current_accuracy:  0.6188
Epoch 197, CIFAR-10 Batch 3:  
current_cost :  0.187207
current_accuracy:  0.6264
Epoch 197, CIFAR-10 Batch 4:  
current_cost :  0.246699
current_accuracy:  0.6278
Epoch 197, CIFAR-10 Batch 5:  
current_cost :  0.233099
current_accuracy:  0.625
Epoch 198, CIFAR-10 Batch 1:  
current_cost :  0.316159
current_accuracy:  0.6322
Epoch 198, CIFAR-10 Batch 2:  
current_cost :  0.320788
current_accuracy:  0.6092
Epoch 198, CIFAR-10 Batch 3:  
current_cost :  0.19377
current_accuracy:  0.6192
Epoch 198, CIFAR-10 Batch 4:  
current_cost :  0.239522
current_accuracy:  0.6228
Epoch 198, CIFAR-10 Batch 5:  
current_cost :  0.260884
current_accuracy:  0.6308
Epoch 199, CIFAR-10 Batch 1:  
current_cost :  0.358444
current_accuracy:  0.6238
Epoch 199, CIFAR-10 Batch 2:  
current_cost :  0.309712
current_accuracy:  0.6192
Epoch 199, CIFAR-10 Batch 3:  
current_cost :  0.17462
current_accuracy:  0.6226
Epoch 199, CIFAR-10 Batch 4:  
current_cost :  0.233337
current_accuracy:  0.634
Epoch 199, CIFAR-10 Batch 5:  
current_cost :  0.241896
current_accuracy:  0.629
Epoch 200, CIFAR-10 Batch 1:  
current_cost :  0.326089
current_accuracy:  0.6178
Epoch 200, CIFAR-10 Batch 2:  
current_cost :  0.307884
current_accuracy:  0.618
Epoch 200, CIFAR-10 Batch 3:  
current_cost :  0.163723
current_accuracy:  0.6348
Epoch 200, CIFAR-10 Batch 4:  
current_cost :  0.221085
current_accuracy:  0.6288
Epoch 200, CIFAR-10 Batch 5:  
current_cost :  0.240304
current_accuracy:  0.6304

Checkpoint

The model has been saved to disk.

Test Model

Test your model against the test dataset. This will be your final accuracy. You should have an accuracy greater than 50%. If you don't, keep tweaking the model architecture and parameters.


In [23]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
%matplotlib inline
%config InlineBackend.figure_format = 'retina'

import tensorflow as tf
import pickle
import helper
import random

# Set batch size if not already set
try:
    if batch_size:
        pass
except NameError:
    batch_size = 64

save_model_path = './image_classification'
n_samples = 4
top_n_predictions = 3

def test_model():
    """
    Test the saved model against the test dataset
    """

    test_features, test_labels = pickle.load(open('preprocess_test.p', mode='rb'))
    loaded_graph = tf.Graph()

    with tf.Session(graph=loaded_graph) as sess:
        # Load model
        loader = tf.train.import_meta_graph(save_model_path + '.meta')
        loader.restore(sess, save_model_path)

        # Get Tensors from loaded model
        loaded_x = loaded_graph.get_tensor_by_name('x:0')
        loaded_y = loaded_graph.get_tensor_by_name('y:0')
        loaded_keep_prob = loaded_graph.get_tensor_by_name('keep_prob:0')
        loaded_logits = loaded_graph.get_tensor_by_name('logits:0')
        loaded_acc = loaded_graph.get_tensor_by_name('accuracy:0')
        
        # Compute accuracy in batches to avoid running out of memory
        test_batch_acc_total = 0
        test_batch_count = 0
        
        for test_feature_batch, test_label_batch in helper.batch_features_labels(test_features, test_labels, batch_size):
            test_batch_acc_total += sess.run(
                loaded_acc,
                feed_dict={loaded_x: test_feature_batch, loaded_y: test_label_batch, loaded_keep_prob: 1.0})
            test_batch_count += 1

        print('Testing Accuracy: {}\n'.format(test_batch_acc_total/test_batch_count))

        # Print Random Samples
        random_test_features, random_test_labels = tuple(zip(*random.sample(list(zip(test_features, test_labels)), n_samples)))
        random_test_predictions = sess.run(
            tf.nn.top_k(tf.nn.softmax(loaded_logits), top_n_predictions),
            feed_dict={loaded_x: random_test_features, loaded_y: random_test_labels, loaded_keep_prob: 1.0})
        helper.display_image_predictions(random_test_features, random_test_labels, random_test_predictions)


test_model()


Testing Accuracy: 0.6269778481012658
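The sample-prediction step in the cell above runs `tf.nn.top_k(tf.nn.softmax(loaded_logits), top_n_predictions)` to pull out the three most probable classes per image. As a minimal sketch (not the graded code), the same computation in plain NumPy looks like this:

```python
import numpy as np

def top_k_predictions(logits, k=3):
    """Mimic tf.nn.top_k(tf.nn.softmax(logits), k) with NumPy.

    logits: (N, num_classes) array of raw class scores.
    Returns (values, indices), each of shape (N, k), sorted by
    descending probability.
    """
    # Numerically stable softmax over the class axis
    shifted = logits - logits.max(axis=1, keepdims=True)
    probs = np.exp(shifted)
    probs /= probs.sum(axis=1, keepdims=True)

    # Indices of the k largest probabilities per row, descending
    indices = np.argsort(-probs, axis=1)[:, :k]
    values = np.take_along_axis(probs, indices, axis=1)
    return values, indices
```

This is why the displayed predictions always show probabilities in decreasing order: `top_k` sorts within each row before returning.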

Why 50-80% Accuracy?

You might be wondering why you can't push the accuracy any higher. First things first, 50% isn't bad for a simple CNN; pure guessing would only get you 10% accuracy. You might notice, however, that people are reporting scores well above 80%. That's because we haven't yet taught you everything there is to know about neural networks. We still need to cover a few more techniques.
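One of those techniques is data augmentation: generating extra training variety by randomly transforming the input images. As a hedged illustration (this is not part of the project's graded code, and `augment_batch` is a hypothetical helper), a minimal sketch of random horizontal flipping over a normalized CIFAR-10 batch shaped `(N, 32, 32, 3)` might look like:

```python
import numpy as np

def augment_batch(images, rng=None):
    """Randomly flip each image in a batch horizontally.

    A cheap, common augmentation: labels are unchanged because a
    mirrored airplane is still an airplane. Assumes NHWC layout,
    i.e. images has shape (N, height, width, channels).
    """
    rng = rng if rng is not None else np.random
    flipped = images.copy()
    mask = rng.rand(len(images)) < 0.5      # flip ~half the batch
    # Reversing axis 2 (width) mirrors the image left-to-right
    flipped[mask] = flipped[mask][:, :, ::-1, :]
    return flipped
```

Applied to each training batch before `sess.run`, this roughly doubles the effective variety the network sees without storing any extra data. Other commonly cited techniques include batch normalization, deeper architectures, and learning-rate schedules.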

Submitting This Project

When submitting this project, make sure to run all the cells before saving the notebook. Save the notebook file as "dlnd_image_classification.ipynb" and save it as an HTML file under "File" -> "Download as". Include the "helper.py" and "problem_unittests.py" files in your submission.