Image Classification

In this project, you will classify images from the CIFAR-10 dataset. The dataset consists of airplanes, dogs, cats, and other objects. You will preprocess the images, then train a convolutional neural network on all the samples. The images need to be normalized and the labels need to be one-hot encoded. You will apply what you have learned to build convolutional, max pooling, dropout, and fully connected layers. At the end, you will see your neural network's predictions on sample images.

Get the Data

Run the following cell to download the CIFAR-10 dataset for Python.


In [17]:
"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
from urllib.request import urlretrieve
from os.path import isfile, isdir
from tqdm import tqdm
import problem_unittests as tests
import tarfile

cifar10_dataset_folder_path = 'cifar-10-batches-py'

# Use Floyd's cifar-10 dataset if present
floyd_cifar10_location = '/input/cifar-10/python.tar.gz'
if isfile(floyd_cifar10_location):
    tar_gz_path = floyd_cifar10_location
else:
    tar_gz_path = 'cifar-10-python.tar.gz'

class DLProgress(tqdm):
    last_block = 0

    def hook(self, block_num=1, block_size=1, total_size=None):
        self.total = total_size
        self.update((block_num - self.last_block) * block_size)
        self.last_block = block_num

if not isfile(tar_gz_path):
    with DLProgress(unit='B', unit_scale=True, miniters=1, desc='CIFAR-10 Dataset') as pbar:
        urlretrieve(
            'https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz',
            tar_gz_path,
            pbar.hook)

if not isdir(cifar10_dataset_folder_path):
    with tarfile.open(tar_gz_path) as tar:
        tar.extractall()
        tar.close()


tests.test_folder_path(cifar10_dataset_folder_path)


All files found!

Explore the Data

The dataset is broken into batches to prevent your machine from running out of memory. The CIFAR-10 dataset consists of 5 batches, named data_batch_1, data_batch_2, and so on. Each batch contains labels and images belonging to one of the following classes:

  • airplane
  • automobile
  • bird
  • cat
  • deer
  • dog
  • frog
  • horse
  • ship
  • truck

Understanding a dataset is part of making predictions on the data. Play around with the code cell below by changing the batch_id and sample_id. The batch_id is the id for a batch in the dataset (1 to 5). The sample_id is the id for an image and label pair in that batch.

Ask yourself: "What are all the possible labels?", "What is the range of values for the image data?", "Are the labels in order or random?". Questions like these will help you preprocess the data and end up with better predictions.


In [18]:
%matplotlib inline
%config InlineBackend.figure_format = 'retina'

import helper
import numpy as np

# Explore the dataset
batch_id = 1
sample_id = 5
helper.display_stats(cifar10_dataset_folder_path, batch_id, sample_id)


Stats of batch 1:
Samples: 10000
Label Counts: {0: 1005, 1: 974, 2: 1032, 3: 1016, 4: 999, 5: 937, 6: 1030, 7: 1001, 8: 1025, 9: 981}
First 20 Labels: [6, 9, 9, 4, 1, 1, 2, 7, 8, 3, 4, 7, 7, 2, 9, 9, 9, 3, 2, 6]

Example of Image 5:
Image - Min Value: 0 Max Value: 252
Image - Shape: (32, 32, 3)
Label - Label Id: 1 Name: automobile

Implement Preprocessing Functions

Normalize

In the cell below, implement the normalize function to take in image data, x, and return it as a normalized Numpy array. The values should be in the range of 0 to 1, inclusive. The returned object should have the same shape as x.


In [19]:
def normalize(x):
    """
    Normalize a list of sample image data in the range of 0 to 1
    : x: List of image data.  The image shape is (32, 32, 3)
    : return: Numpy array of normalize data
    """
    # TODO: Implement Function
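    # Pixel values are 8-bit intensities in [0, 255]; dividing by 255 maps them to [0, 1].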
    return x / 255


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_normalize(normalize)


Tests Passed

One-hot Encode

Just like the previous code cell, you will implement a preprocessing function. This time, implement the one_hot_encode function. The input, x, is a list of labels. Implement the function to return the list of labels as a one-hot encoded Numpy array. The possible values for the labels are 0 to 9. The one-hot encoding function should return the same encoding for each value every time one_hot_encode is called. Make sure to save the map of encodings outside the function.

Hint: Don't reinvent the wheel.


In [20]:
from sklearn.preprocessing import OneHotEncoder
enc = OneHotEncoder()
each_label = np.array(list(range(10))).reshape(-1,1)
enc.fit(each_label)
print(enc.n_values_)
print(enc.feature_indices_)

def one_hot_encode(x):
    """
    One hot encode a list of sample labels. Return a one-hot encoded vector for each label.
    : x: List of sample Labels
    : return: Numpy array of one-hot encoded labels
    """
    # TODO: Implement Function
    X = np.array(x).reshape(-1, 1)
    return enc.transform(X).toarray()


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_one_hot_encode(one_hot_encode)


[10]
[ 0 10]
Tests Passed
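As a side note, the same encoding can be produced without scikit-learn. A minimal sketch, assuming the labels are integers 0 through 9 as above:

import numpy as np

def one_hot_encode_alt(x, n_classes=10):
    # Row i of the identity matrix is the one-hot vector for label i,
    # so indexing with the label ids returns one row per sample.
    return np.eye(n_classes)[np.array(x)]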

Randomize Data

As you saw while exploring the data above, the order of the samples is already randomized. It would not hurt to randomize them again, but it is not necessary for this dataset.
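If you did want to reshuffle, the key point is to permute the features and labels with the same index order so the pairs stay aligned. A minimal sketch, assuming features and labels are NumPy arrays of equal length:

import numpy as np

def shuffle_together(features, labels):
    # One random permutation applied to both arrays keeps each image paired with its label.
    perm = np.random.permutation(len(features))
    return features[perm], labels[perm]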

Preprocess All the Data and Save It

Running the code cell below will preprocess all the CIFAR-10 data and save it to file. The code below also uses 10% of the training data for validation.


In [21]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
# Preprocess Training, Validation, and Testing Data
helper.preprocess_and_save_data(cifar10_dataset_folder_path, normalize, one_hot_encode)

Checkpoint

This is your first checkpoint. If you ever decide to come back to this notebook or have to restart it, you can start from here. The preprocessed data has been saved to disk.


In [22]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import pickle
import problem_unittests as tests
import helper

# Load the Preprocessed Validation data
valid_features, valid_labels = pickle.load(open('preprocess_validation.p', mode='rb'))

Build the Network

For this neural network, you will build each layer as a function. Most of the code you have seen so far has lived outside of functions. To test your code more thoroughly, we require that you put each layer in a function. This lets us give you better feedback and use our unit tests to catch simple mistakes before you submit the project.

Note: If you are finding it hard to dedicate enough time to this course each week, we have provided a small shortcut for this part of the project. For the next couple of problems, you may use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages to build each layer, except for the layers you build in the "Convolution and Max Pooling Layer" section. TF Layers is similar to Keras's and TFLearn's abstractions, so it is easy to pick up.

However, if you want to get the most out of this course, try to solve all the problems without using anything from the TF Layers packages. You can still use classes from other packages that happen to have the same names as the ones in TF Layers! For example, instead of the TF Layers version of the conv2d class, tf.layers.conv2d, you would use the TF Neural Network version of conv2d, tf.nn.conv2d.

Let's begin!

Input

The neural network needs to read the image data, the one-hot encoded labels, and the dropout keep probability. Implement the following functions:

  • Implement neural_net_image_input
    • Return a TF Placeholder
    • Set the shape using image_shape with a batch size of None
    • Name the TensorFlow placeholder "x" using the TensorFlow name parameter in the TF Placeholder
  • Implement neural_net_label_input
    • Return a TF Placeholder
    • Set the shape using n_classes with a batch size of None
    • Name the TensorFlow placeholder "y" using the TensorFlow name parameter in the TF Placeholder
  • Implement neural_net_keep_prob_input
    • Return a TF Placeholder for the dropout keep probability
    • Name the TensorFlow placeholder "keep_prob" using the TensorFlow name parameter in the TF Placeholder

These names will be used at the end of the project to load your saved model.

Note: None for shapes in TensorFlow allows for a dynamic size.


In [23]:
import tensorflow as tf

def neural_net_image_input(image_shape):
    """
    Return a Tensor for a batch of image input
    : image_shape: Shape of the images
    : return: Tensor for image input.
    """
    # TODO: Implement Function
    return tf.placeholder(tf.float32, shape=[None, image_shape[0], image_shape[1], image_shape[2]], name='x')


def neural_net_label_input(n_classes):
    """
    Return a Tensor for a batch of label input
    : n_classes: Number of classes
    : return: Tensor for label input.
    """
    # TODO: Implement Function
    return tf.placeholder(tf.float32, shape=[None, n_classes], name='y')


def neural_net_keep_prob_input():
    """
    Return a Tensor for keep probability
    : return: Tensor for keep probability.
    """
    # TODO: Implement Function
    return tf.placeholder(tf.float32, shape=None, name='keep_prob')


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tf.reset_default_graph()
tests.test_nn_image_inputs(neural_net_image_input)
tests.test_nn_label_inputs(neural_net_label_input)
tests.test_nn_keep_prob_inputs(neural_net_keep_prob_input)


Image Input Tests Passed.
Label Input Tests Passed.
Keep Prob Tests Passed.

Convolution and Max Pooling Layer

Convolution layers have had a lot of success with images. For this code cell, implement the function conv2d_maxpool to apply convolution and then max pooling:

  • Create the weight and bias using conv_ksize, conv_num_outputs, and the shape of x_tensor.
  • Apply a convolution to x_tensor using the weights and conv_strides.
    • We recommend SAME padding, but you are welcome to use any padding.
  • Add the bias.
  • Add a nonlinear activation to the convolution.
  • Apply max pooling using pool_ksize and pool_strides.
    • We recommend SAME padding, but you are welcome to use any padding.

Note: You cannot use TensorFlow Layers or TensorFlow Layers (contrib) for this layer, but you can still use TensorFlow's Neural Network package. You may still use the shortcut option for all the other layers.
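As a quick sanity check on the padding choices, 'SAME' padding gives an output spatial size of ceil(input / stride), while 'VALID' gives ceil((input - ksize + 1) / stride). A small sketch of that arithmetic for the 32x32 CIFAR-10 images:

import math

def output_size(size, ksize, stride, padding):
    # Spatial output size of tf.nn.conv2d / tf.nn.max_pool along one dimension.
    if padding == 'SAME':
        return math.ceil(size / stride)
    return math.ceil((size - ksize + 1) / stride)

print(output_size(32, 3, 1, 'SAME'))   # 32: a stride-1 'SAME' convolution keeps the size
print(output_size(32, 2, 2, 'VALID'))  # 16: a 2x2, stride-2 'VALID' max pool halves it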


In [24]:
def conv2d_maxpool(x_tensor, conv_num_outputs, conv_ksize, conv_strides, pool_ksize, pool_strides):
    """
    Apply convolution then max pooling to x_tensor
    :param x_tensor: TensorFlow Tensor
    :param conv_num_outputs: Number of outputs for the convolutional layer
    :param conv_ksize: kernal size 2-D Tuple for the convolutional layer
    :param conv_strides: Stride 2-D Tuple for convolution
    :param pool_ksize: kernal size 2-D Tuple for pool
    :param pool_strides: Stride 2-D Tuple for pool
    : return: A tensor that represents convolution and max pooling of x_tensor
    """
    # TODO: Implement Function
    weights = tf.Variable(tf.truncated_normal([conv_ksize[0], conv_ksize[1], x_tensor.get_shape().as_list()[-1], conv_num_outputs], stddev=0.1))
    biases = tf.Variable(tf.zeros([conv_num_outputs]))
    net = tf.nn.conv2d(x_tensor, weights, [1, conv_strides[0], conv_strides[1], 1], 'SAME')
    net = tf.nn.bias_add(net, biases)
    net = tf.nn.relu(net)

    pool_kernel = [1, pool_ksize[0], pool_ksize[1], 1]
    pool_strides = [1, pool_strides[0], pool_strides[1], 1]
    net = tf.nn.max_pool(net, pool_kernel, pool_strides, 'VALID')

    return net


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_con_pool(conv2d_maxpool)


Tests Passed

Flatten Layer

Implement the flatten function to change the dimensions of x_tensor from a 4-D tensor to a 2-D tensor. The output should have the shape (Batch Size, Flattened Image Size). Shortcut option: you can use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages for this layer. For more of a challenge, only use other TensorFlow packages.


In [25]:
import numpy as np
def flatten(x_tensor):
    """
    Flatten x_tensor to (Batch Size, Flattened Image Size)
    : x_tensor: A tensor of size (Batch Size, ...), where ... are the image dimensions.
    : return: A tensor of size (Batch Size, Flattened Image Size).
    """
    # TODO: Implement Function
    shape = x_tensor.get_shape().as_list()
    dim = np.prod(shape[1:])
    x_tensor = tf.reshape(x_tensor, [-1,dim])
    return x_tensor


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_flatten(flatten)


Tests Passed
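The TF Layers (contrib) shortcut mentioned above collapses this into a single call. A one-line sketch, assuming TensorFlow 1.x:

import tensorflow as tf

def flatten_shortcut(x_tensor):
    # tf.contrib.layers.flatten collapses every dimension except the batch dimension.
    return tf.contrib.layers.flatten(x_tensor)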

Fully-Connected Layer

Implement the fully_conn function to apply a fully connected layer to x_tensor with the shape (Batch Size, num_outputs). Shortcut option: you can use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages for this layer. For more of a challenge, only use other TensorFlow packages.


In [26]:
def fully_conn(x_tensor, num_outputs):
    """
    Apply a fully connected layer to x_tensor using weight and bias
    : x_tensor: A 2-D tensor where the first dimension is batch size.
    : num_outputs: The number of output that the new tensor should be.
    : return: A 2-D tensor where the second dimension is num_outputs.
    """
    # TODO: Implement Function
    weights = tf.Variable(tf.truncated_normal(shape=[x_tensor.get_shape().as_list()[-1], num_outputs], mean=0, stddev=1))
    biases = tf.Variable(tf.zeros(shape=[num_outputs]))
    net = tf.nn.bias_add(tf.matmul(x_tensor, weights), biases)
    net = tf.nn.relu(net)
    return net


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_fully_conn(fully_conn)


Tests Passed
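For comparison, the TF Layers shortcut mentioned above does the same in one call. A sketch, assuming TensorFlow 1.x:

import tensorflow as tf

def fully_conn_shortcut(x_tensor, num_outputs):
    # tf.layers.dense creates its own weights and bias and applies the activation.
    return tf.layers.dense(x_tensor, num_outputs, activation=tf.nn.relu)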

Output Layer

Implement the output function to apply a fully connected layer to x_tensor with the shape (Batch Size, num_outputs). Shortcut option: you can use classes from the TensorFlow Layers or TensorFlow Layers (contrib) packages for this layer. For more of a challenge, only use other TensorFlow packages.

Note: Activation, softmax, or cross entropy should not be applied to this layer.


In [27]:
def output(x_tensor, num_outputs):
    """
    Apply a output layer to x_tensor using weight and bias
    : x_tensor: A 2-D tensor where the first dimension is batch size.
    : num_outputs: The number of output that the new tensor should be.
    : return: A 2-D tensor where the second dimension is num_outputs.
    """
    # TODO: Implement Function
    weights = tf.Variable(tf.truncated_normal(shape=[x_tensor.get_shape().as_list()[-1], num_outputs], mean=0, stddev=1))
    biases = tf.Variable(tf.zeros(shape=[num_outputs]))
    net = tf.nn.bias_add(tf.matmul(x_tensor, weights), biases)
    return net


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_output(output)


Tests Passed

Create Convolutional Model

Implement the function conv_net to create a convolutional neural network model. The function takes in a batch of images, x, and outputs logits. Use the layers you created above to build this model:

  • Apply 1, 2, or 3 Convolution and Max Pool layers
  • Apply a Flatten Layer
  • Apply 1, 2, or 3 Fully Connected Layers
  • Apply an Output Layer
  • Return the output
  • Apply TensorFlow's Dropout to one or more layers in the model using keep_prob

In [28]:
def conv_net(x, keep_prob):
    """
    Create a convolutional neural network model
    : x: Placeholder tensor that holds image data.
    : keep_prob: Placeholder tensor that hold dropout keep probability.
    : return: Tensor that represents logits
    """
    # TODO: Apply 1, 2, or 3 Convolution and Max Pool layers
    #    Play around with different number of outputs, kernel size and stride
    # Function Definition from Above:
    #    conv2d_maxpool(x_tensor, conv_num_outputs, conv_ksize, conv_strides, pool_ksize, pool_strides)
    
#     net = conv2d_maxpool(x, 32, (5,5), (1,1), (2,2), (2,2))
#     net = tf.nn.dropout(net, keep_prob)
#     net = conv2d_maxpool(net, 32, (5,5), (1,1), (2,2), (2,2))
#     net = conv2d_maxpool(net, 64, (5,5), (1,1), (2,2), (2,2))

    net = conv2d_maxpool(x, 32, (3,3), (1,1), (2,2), (2,2))
    net = tf.nn.dropout(net, keep_prob)
    net = conv2d_maxpool(net, 64, (3,3), (1,1), (2,2), (2,2))
    net = tf.nn.dropout(net, keep_prob)
    net = conv2d_maxpool(net, 64, (3,3), (1,1), (2,2), (2,2))

    # TODO: Apply a Flatten Layer
    # Function Definition from Above:
    #   flatten(x_tensor)
    
    net = flatten(net)

    # TODO: Apply 1, 2, or 3 Fully Connected Layers
    #    Play around with different number of outputs
    # Function Definition from Above:
    #   fully_conn(x_tensor, num_outputs)
    
    net = fully_conn(net, 64)
    
    # TODO: Apply an Output Layer
    #    Set this to the number of classes
    # Function Definition from Above:
    #   output(x_tensor, num_outputs)
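    # enc.n_values_ comes from the OneHotEncoder fitted above and reflects the 10 CIFAR-10 classes.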
    
    net = output(net, enc.n_values_)
    
    # TODO: return output
    return net


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""

##############################
## Build the Neural Network ##
##############################

# Remove previous weights, bias, inputs, etc..
tf.reset_default_graph()

# Inputs
x = neural_net_image_input((32, 32, 3))
y = neural_net_label_input(10)
keep_prob = neural_net_keep_prob_input()

# Model
logits = conv_net(x, keep_prob)

# Name logits Tensor, so that is can be loaded from disk after training
logits = tf.identity(logits, name='logits')

# Loss and Optimizer
cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=y))
optimizer = tf.train.AdamOptimizer().minimize(cost)

# Accuracy
correct_pred = tf.equal(tf.argmax(logits, 1), tf.argmax(y, 1))
accuracy = tf.reduce_mean(tf.cast(correct_pred, tf.float32), name='accuracy')

tests.test_conv_net(conv_net)


Neural Network Built!

Train the Neural Network

Single Optimization

Implement the function train_neural_network to do a single optimization. The optimization should use optimizer to optimize in session with a feed_dict containing the following:

  • x for image input
  • y for labels
  • keep_prob for the dropout keep probability

This function will be called for each batch, so tf.global_variables_initializer() has already been called.

Note: Nothing needs to be returned. This function is only optimizing the neural network.


In [29]:
def train_neural_network(session, optimizer, keep_probability, feature_batch, label_batch):
    """
    Optimize the session on a batch of images and labels
    : session: Current TensorFlow session
    : optimizer: TensorFlow optimizer function
    : keep_probability: keep probability
    : feature_batch: Batch of Numpy image data
    : label_batch: Batch of Numpy label data
    """
    # TODO: Implement Function
    _ = session.run([optimizer, cost, accuracy], feed_dict={x: feature_batch, y: label_batch, keep_prob: keep_probability})


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_train_nn(train_neural_network)


Tests Passed

Show Stats

Implement the function print_stats to print the loss and validation accuracy. Use the global variables valid_features and valid_labels to calculate the validation accuracy. Use a keep probability of 1.0 to calculate the loss and validation accuracy.


In [30]:
def print_stats(session, feature_batch, label_batch, cost, accuracy):
    """
    Print information about loss and validation accuracy
    : session: Current TensorFlow session
    : feature_batch: Batch of Numpy image data
    : label_batch: Batch of Numpy label data
    : cost: TensorFlow cost function
    : accuracy: TensorFlow accuracy function
    """
    # TODO: Implement Function
    valid_loss, valid_accuracy = session.run([cost, accuracy], feed_dict={x: valid_features, y: valid_labels, keep_prob: 1})
    print("valid loss {:.3f}, accuracy {:.3f}".format(valid_loss, valid_accuracy))

Hyperparameters

Tune the following parameters:

  • Set epochs to the number of iterations until the network stops learning or starts overfitting
  • Set batch_size to the highest number that your machine has memory for. Most people set it to a common memory size:

    • 64
    • 128
    • 256
    • ...
  • Set keep_probability to the probability of keeping a node when using dropout

In [31]:
# TODO: Tune Parameters
epochs = 100
batch_size = 256
keep_probability = 0.8

Train on a Single CIFAR-10 Batch

Instead of training the neural network on all the CIFAR-10 batches, let's start with a single batch. This saves time while you iterate on the model to improve its accuracy. Once the final validation accuracy reaches 50% or greater, run the model on all the data in the next section.


In [32]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
print('Checking the Training on a Single Batch...')
with tf.Session() as sess:
    # Initializing the variables
    sess.run(tf.global_variables_initializer())
    
    # Training cycle
    for epoch in range(epochs):
        batch_i = 1
        for batch_features, batch_labels in helper.load_preprocess_training_batch(batch_i, batch_size):
            train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)
        print('Epoch {:>2}, CIFAR-10 Batch {}:  '.format(epoch + 1, batch_i), end='')
        print_stats(sess, batch_features, batch_labels, cost, accuracy)


Checking the Training on a Single Batch...
Epoch  1, CIFAR-10 Batch 1:  valid loss 3.076, accuracy 0.140
Epoch  2, CIFAR-10 Batch 1:  valid loss 2.299, accuracy 0.147
Epoch  3, CIFAR-10 Batch 1:  valid loss 2.281, accuracy 0.141
Epoch  4, CIFAR-10 Batch 1:  valid loss 2.264, accuracy 0.151
Epoch  5, CIFAR-10 Batch 1:  valid loss 2.257, accuracy 0.159
Epoch  6, CIFAR-10 Batch 1:  valid loss 2.241, accuracy 0.176
Epoch  7, CIFAR-10 Batch 1:  valid loss 2.221, accuracy 0.182
Epoch  8, CIFAR-10 Batch 1:  valid loss 2.197, accuracy 0.187
Epoch  9, CIFAR-10 Batch 1:  valid loss 2.160, accuracy 0.209
Epoch 10, CIFAR-10 Batch 1:  valid loss 2.116, accuracy 0.219
Epoch 11, CIFAR-10 Batch 1:  valid loss 2.067, accuracy 0.237
Epoch 12, CIFAR-10 Batch 1:  valid loss 1.998, accuracy 0.264
Epoch 13, CIFAR-10 Batch 1:  valid loss 2.000, accuracy 0.268
Epoch 14, CIFAR-10 Batch 1:  valid loss 1.930, accuracy 0.292
Epoch 15, CIFAR-10 Batch 1:  valid loss 1.890, accuracy 0.297
Epoch 16, CIFAR-10 Batch 1:  valid loss 1.841, accuracy 0.318
Epoch 17, CIFAR-10 Batch 1:  valid loss 1.789, accuracy 0.354
Epoch 18, CIFAR-10 Batch 1:  valid loss 1.773, accuracy 0.344
Epoch 19, CIFAR-10 Batch 1:  valid loss 1.754, accuracy 0.352
Epoch 20, CIFAR-10 Batch 1:  valid loss 1.719, accuracy 0.372
Epoch 21, CIFAR-10 Batch 1:  valid loss 1.695, accuracy 0.383
Epoch 22, CIFAR-10 Batch 1:  valid loss 1.705, accuracy 0.377
Epoch 23, CIFAR-10 Batch 1:  valid loss 1.664, accuracy 0.395
Epoch 24, CIFAR-10 Batch 1:  valid loss 1.618, accuracy 0.413
Epoch 25, CIFAR-10 Batch 1:  valid loss 1.606, accuracy 0.410
Epoch 26, CIFAR-10 Batch 1:  valid loss 1.607, accuracy 0.411
Epoch 27, CIFAR-10 Batch 1:  valid loss 1.575, accuracy 0.426
Epoch 28, CIFAR-10 Batch 1:  valid loss 1.553, accuracy 0.435
Epoch 29, CIFAR-10 Batch 1:  valid loss 1.545, accuracy 0.439
Epoch 30, CIFAR-10 Batch 1:  valid loss 1.544, accuracy 0.438
Epoch 31, CIFAR-10 Batch 1:  valid loss 1.501, accuracy 0.460
Epoch 32, CIFAR-10 Batch 1:  valid loss 1.534, accuracy 0.444
Epoch 33, CIFAR-10 Batch 1:  valid loss 1.478, accuracy 0.464
Epoch 34, CIFAR-10 Batch 1:  valid loss 1.482, accuracy 0.459
Epoch 35, CIFAR-10 Batch 1:  valid loss 1.525, accuracy 0.444
Epoch 36, CIFAR-10 Batch 1:  valid loss 1.438, accuracy 0.481
Epoch 37, CIFAR-10 Batch 1:  valid loss 1.451, accuracy 0.482
Epoch 38, CIFAR-10 Batch 1:  valid loss 1.453, accuracy 0.472
Epoch 39, CIFAR-10 Batch 1:  valid loss 1.420, accuracy 0.489
Epoch 40, CIFAR-10 Batch 1:  valid loss 1.410, accuracy 0.490
Epoch 41, CIFAR-10 Batch 1:  valid loss 1.417, accuracy 0.480
Epoch 42, CIFAR-10 Batch 1:  valid loss 1.415, accuracy 0.484
Epoch 43, CIFAR-10 Batch 1:  valid loss 1.391, accuracy 0.495
Epoch 44, CIFAR-10 Batch 1:  valid loss 1.386, accuracy 0.502
Epoch 45, CIFAR-10 Batch 1:  valid loss 1.433, accuracy 0.480
Epoch 46, CIFAR-10 Batch 1:  valid loss 1.370, accuracy 0.503
Epoch 47, CIFAR-10 Batch 1:  valid loss 1.369, accuracy 0.509
Epoch 48, CIFAR-10 Batch 1:  valid loss 1.384, accuracy 0.502
Epoch 49, CIFAR-10 Batch 1:  valid loss 1.366, accuracy 0.507
Epoch 50, CIFAR-10 Batch 1:  valid loss 1.362, accuracy 0.508
Epoch 51, CIFAR-10 Batch 1:  valid loss 1.335, accuracy 0.518
Epoch 52, CIFAR-10 Batch 1:  valid loss 1.340, accuracy 0.516
Epoch 53, CIFAR-10 Batch 1:  valid loss 1.370, accuracy 0.509
Epoch 54, CIFAR-10 Batch 1:  valid loss 1.340, accuracy 0.518
Epoch 55, CIFAR-10 Batch 1:  valid loss 1.341, accuracy 0.522
Epoch 56, CIFAR-10 Batch 1:  valid loss 1.320, accuracy 0.525
Epoch 57, CIFAR-10 Batch 1:  valid loss 1.327, accuracy 0.524
Epoch 58, CIFAR-10 Batch 1:  valid loss 1.325, accuracy 0.524
Epoch 59, CIFAR-10 Batch 1:  valid loss 1.402, accuracy 0.513
Epoch 60, CIFAR-10 Batch 1:  valid loss 1.334, accuracy 0.522
Epoch 61, CIFAR-10 Batch 1:  valid loss 1.301, accuracy 0.534
Epoch 62, CIFAR-10 Batch 1:  valid loss 1.320, accuracy 0.528
Epoch 63, CIFAR-10 Batch 1:  valid loss 1.288, accuracy 0.540
Epoch 64, CIFAR-10 Batch 1:  valid loss 1.290, accuracy 0.546
Epoch 65, CIFAR-10 Batch 1:  valid loss 1.268, accuracy 0.546
Epoch 66, CIFAR-10 Batch 1:  valid loss 1.295, accuracy 0.549
Epoch 67, CIFAR-10 Batch 1:  valid loss 1.297, accuracy 0.530
Epoch 68, CIFAR-10 Batch 1:  valid loss 1.255, accuracy 0.545
Epoch 69, CIFAR-10 Batch 1:  valid loss 1.318, accuracy 0.542
Epoch 70, CIFAR-10 Batch 1:  valid loss 1.274, accuracy 0.544
Epoch 71, CIFAR-10 Batch 1:  valid loss 1.281, accuracy 0.545
Epoch 72, CIFAR-10 Batch 1:  valid loss 1.266, accuracy 0.548
Epoch 73, CIFAR-10 Batch 1:  valid loss 1.239, accuracy 0.559
Epoch 74, CIFAR-10 Batch 1:  valid loss 1.236, accuracy 0.560
Epoch 75, CIFAR-10 Batch 1:  valid loss 1.223, accuracy 0.561
Epoch 76, CIFAR-10 Batch 1:  valid loss 1.248, accuracy 0.558
Epoch 77, CIFAR-10 Batch 1:  valid loss 1.234, accuracy 0.564
Epoch 78, CIFAR-10 Batch 1:  valid loss 1.296, accuracy 0.546
Epoch 79, CIFAR-10 Batch 1:  valid loss 1.263, accuracy 0.556
Epoch 80, CIFAR-10 Batch 1:  valid loss 1.275, accuracy 0.554
Epoch 81, CIFAR-10 Batch 1:  valid loss 1.264, accuracy 0.554
Epoch 82, CIFAR-10 Batch 1:  valid loss 1.215, accuracy 0.570
Epoch 83, CIFAR-10 Batch 1:  valid loss 1.235, accuracy 0.564
Epoch 84, CIFAR-10 Batch 1:  valid loss 1.238, accuracy 0.567
Epoch 85, CIFAR-10 Batch 1:  valid loss 1.244, accuracy 0.562
Epoch 86, CIFAR-10 Batch 1:  valid loss 1.261, accuracy 0.563
Epoch 87, CIFAR-10 Batch 1:  valid loss 1.234, accuracy 0.570
Epoch 88, CIFAR-10 Batch 1:  valid loss 1.227, accuracy 0.573
Epoch 89, CIFAR-10 Batch 1:  valid loss 1.265, accuracy 0.563
Epoch 90, CIFAR-10 Batch 1:  valid loss 1.236, accuracy 0.563
Epoch 91, CIFAR-10 Batch 1:  valid loss 1.257, accuracy 0.562
Epoch 92, CIFAR-10 Batch 1:  valid loss 1.266, accuracy 0.560
Epoch 93, CIFAR-10 Batch 1:  valid loss 1.236, accuracy 0.572
Epoch 94, CIFAR-10 Batch 1:  valid loss 1.188, accuracy 0.588
Epoch 95, CIFAR-10 Batch 1:  valid loss 1.209, accuracy 0.581
Epoch 96, CIFAR-10 Batch 1:  valid loss 1.198, accuracy 0.583
Epoch 97, CIFAR-10 Batch 1:  valid loss 1.213, accuracy 0.576
Epoch 98, CIFAR-10 Batch 1:  valid loss 1.211, accuracy 0.579
Epoch 99, CIFAR-10 Batch 1:  valid loss 1.188, accuracy 0.585
Epoch 100, CIFAR-10 Batch 1:  valid loss 1.224, accuracy 0.579

Fully Train the Model

Now that you have a good accuracy on a single CIFAR-10 batch, try it with all five batches.


In [33]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
save_model_path = './image_classification'

print('Training...')
with tf.Session() as sess:
    # Initializing the variables
    sess.run(tf.global_variables_initializer())
    
    # Training cycle
    for epoch in range(epochs):
        # Loop over all batches
        n_batches = 5
        for batch_i in range(1, n_batches + 1):
            for batch_features, batch_labels in helper.load_preprocess_training_batch(batch_i, batch_size):
                train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)
            print('Epoch {:>2}, CIFAR-10 Batch {}:  '.format(epoch + 1, batch_i), end='')
            print_stats(sess, batch_features, batch_labels, cost, accuracy)
            
    # Save Model
    saver = tf.train.Saver()
    save_path = saver.save(sess, save_model_path)


Training...
Epoch  1, CIFAR-10 Batch 1:  valid loss 2.709, accuracy 0.121
Epoch  1, CIFAR-10 Batch 2:  valid loss 2.329, accuracy 0.136
Epoch  1, CIFAR-10 Batch 3:  valid loss 2.276, accuracy 0.139
Epoch  1, CIFAR-10 Batch 4:  valid loss 2.255, accuracy 0.157
Epoch  1, CIFAR-10 Batch 5:  valid loss 2.228, accuracy 0.170
Epoch  2, CIFAR-10 Batch 1:  valid loss 2.177, accuracy 0.202
Epoch  2, CIFAR-10 Batch 2:  valid loss 2.132, accuracy 0.215
Epoch  2, CIFAR-10 Batch 3:  valid loss 2.080, accuracy 0.235
Epoch  2, CIFAR-10 Batch 4:  valid loss 2.015, accuracy 0.271
Epoch  2, CIFAR-10 Batch 5:  valid loss 1.972, accuracy 0.287
Epoch  3, CIFAR-10 Batch 1:  valid loss 1.882, accuracy 0.327
Epoch  3, CIFAR-10 Batch 2:  valid loss 1.883, accuracy 0.321
Epoch  3, CIFAR-10 Batch 3:  valid loss 1.829, accuracy 0.335
Epoch  3, CIFAR-10 Batch 4:  valid loss 1.784, accuracy 0.357
Epoch  3, CIFAR-10 Batch 5:  valid loss 1.785, accuracy 0.355
Epoch  4, CIFAR-10 Batch 1:  valid loss 1.720, accuracy 0.386
Epoch  4, CIFAR-10 Batch 2:  valid loss 1.771, accuracy 0.355
Epoch  4, CIFAR-10 Batch 3:  valid loss 1.690, accuracy 0.380
Epoch  4, CIFAR-10 Batch 4:  valid loss 1.648, accuracy 0.408
Epoch  4, CIFAR-10 Batch 5:  valid loss 1.659, accuracy 0.398
Epoch  5, CIFAR-10 Batch 1:  valid loss 1.630, accuracy 0.409
Epoch  5, CIFAR-10 Batch 2:  valid loss 1.692, accuracy 0.375
Epoch  5, CIFAR-10 Batch 3:  valid loss 1.601, accuracy 0.417
Epoch  5, CIFAR-10 Batch 4:  valid loss 1.591, accuracy 0.413
Epoch  5, CIFAR-10 Batch 5:  valid loss 1.591, accuracy 0.424
Epoch  6, CIFAR-10 Batch 1:  valid loss 1.562, accuracy 0.432
Epoch  6, CIFAR-10 Batch 2:  valid loss 1.565, accuracy 0.433
Epoch  6, CIFAR-10 Batch 3:  valid loss 1.544, accuracy 0.439
Epoch  6, CIFAR-10 Batch 4:  valid loss 1.518, accuracy 0.447
Epoch  6, CIFAR-10 Batch 5:  valid loss 1.540, accuracy 0.439
Epoch  7, CIFAR-10 Batch 1:  valid loss 1.508, accuracy 0.453
Epoch  7, CIFAR-10 Batch 2:  valid loss 1.507, accuracy 0.455
Epoch  7, CIFAR-10 Batch 3:  valid loss 1.526, accuracy 0.444
Epoch  7, CIFAR-10 Batch 4:  valid loss 1.487, accuracy 0.454
Epoch  7, CIFAR-10 Batch 5:  valid loss 1.548, accuracy 0.432
Epoch  8, CIFAR-10 Batch 1:  valid loss 1.477, accuracy 0.460
Epoch  8, CIFAR-10 Batch 2:  valid loss 1.479, accuracy 0.457
Epoch  8, CIFAR-10 Batch 3:  valid loss 1.474, accuracy 0.461
Epoch  8, CIFAR-10 Batch 4:  valid loss 1.421, accuracy 0.481
Epoch  8, CIFAR-10 Batch 5:  valid loss 1.459, accuracy 0.465
Epoch  9, CIFAR-10 Batch 1:  valid loss 1.464, accuracy 0.472
Epoch  9, CIFAR-10 Batch 2:  valid loss 1.438, accuracy 0.478
Epoch  9, CIFAR-10 Batch 3:  valid loss 1.454, accuracy 0.465
Epoch  9, CIFAR-10 Batch 4:  valid loss 1.422, accuracy 0.478
Epoch  9, CIFAR-10 Batch 5:  valid loss 1.453, accuracy 0.472
Epoch 10, CIFAR-10 Batch 1:  valid loss 1.393, accuracy 0.499
Epoch 10, CIFAR-10 Batch 2:  valid loss 1.414, accuracy 0.501
Epoch 10, CIFAR-10 Batch 3:  valid loss 1.431, accuracy 0.468
Epoch 10, CIFAR-10 Batch 4:  valid loss 1.387, accuracy 0.488
Epoch 10, CIFAR-10 Batch 5:  valid loss 1.372, accuracy 0.496
Epoch 11, CIFAR-10 Batch 1:  valid loss 1.398, accuracy 0.493
Epoch 11, CIFAR-10 Batch 2:  valid loss 1.390, accuracy 0.496
Epoch 11, CIFAR-10 Batch 3:  valid loss 1.352, accuracy 0.506
Epoch 11, CIFAR-10 Batch 4:  valid loss 1.361, accuracy 0.501
Epoch 11, CIFAR-10 Batch 5:  valid loss 1.349, accuracy 0.499
Epoch 12, CIFAR-10 Batch 1:  valid loss 1.344, accuracy 0.511
Epoch 12, CIFAR-10 Batch 2:  valid loss 1.326, accuracy 0.521
Epoch 12, CIFAR-10 Batch 3:  valid loss 1.323, accuracy 0.520
Epoch 12, CIFAR-10 Batch 4:  valid loss 1.353, accuracy 0.505
Epoch 12, CIFAR-10 Batch 5:  valid loss 1.307, accuracy 0.526
Epoch 13, CIFAR-10 Batch 1:  valid loss 1.309, accuracy 0.529
Epoch 13, CIFAR-10 Batch 2:  valid loss 1.296, accuracy 0.532
Epoch 13, CIFAR-10 Batch 3:  valid loss 1.306, accuracy 0.528
Epoch 13, CIFAR-10 Batch 4:  valid loss 1.328, accuracy 0.517
Epoch 13, CIFAR-10 Batch 5:  valid loss 1.283, accuracy 0.539
Epoch 14, CIFAR-10 Batch 1:  valid loss 1.244, accuracy 0.555
Epoch 14, CIFAR-10 Batch 2:  valid loss 1.350, accuracy 0.515
Epoch 14, CIFAR-10 Batch 3:  valid loss 1.244, accuracy 0.548
Epoch 14, CIFAR-10 Batch 4:  valid loss 1.279, accuracy 0.533
Epoch 14, CIFAR-10 Batch 5:  valid loss 1.246, accuracy 0.549
Epoch 15, CIFAR-10 Batch 1:  valid loss 1.253, accuracy 0.555
Epoch 15, CIFAR-10 Batch 2:  valid loss 1.265, accuracy 0.539
Epoch 15, CIFAR-10 Batch 3:  valid loss 1.215, accuracy 0.568
Epoch 15, CIFAR-10 Batch 4:  valid loss 1.338, accuracy 0.511
Epoch 15, CIFAR-10 Batch 5:  valid loss 1.202, accuracy 0.568
Epoch 16, CIFAR-10 Batch 1:  valid loss 1.200, accuracy 0.572
Epoch 16, CIFAR-10 Batch 2:  valid loss 1.243, accuracy 0.554
Epoch 16, CIFAR-10 Batch 3:  valid loss 1.222, accuracy 0.561
Epoch 16, CIFAR-10 Batch 4:  valid loss 1.253, accuracy 0.548
Epoch 16, CIFAR-10 Batch 5:  valid loss 1.222, accuracy 0.557
Epoch 17, CIFAR-10 Batch 1:  valid loss 1.202, accuracy 0.573
Epoch 17, CIFAR-10 Batch 2:  valid loss 1.198, accuracy 0.569
Epoch 17, CIFAR-10 Batch 3:  valid loss 1.188, accuracy 0.575
Epoch 17, CIFAR-10 Batch 4:  valid loss 1.230, accuracy 0.561
Epoch 17, CIFAR-10 Batch 5:  valid loss 1.161, accuracy 0.586
Epoch 18, CIFAR-10 Batch 1:  valid loss 1.154, accuracy 0.595
Epoch 18, CIFAR-10 Batch 2:  valid loss 1.211, accuracy 0.568
Epoch 18, CIFAR-10 Batch 3:  valid loss 1.182, accuracy 0.578
Epoch 18, CIFAR-10 Batch 4:  valid loss 1.192, accuracy 0.575
Epoch 18, CIFAR-10 Batch 5:  valid loss 1.160, accuracy 0.588
Epoch 19, CIFAR-10 Batch 1:  valid loss 1.212, accuracy 0.577
Epoch 19, CIFAR-10 Batch 2:  valid loss 1.194, accuracy 0.574
Epoch 19, CIFAR-10 Batch 3:  valid loss 1.171, accuracy 0.586
Epoch 19, CIFAR-10 Batch 4:  valid loss 1.113, accuracy 0.611
Epoch 19, CIFAR-10 Batch 5:  valid loss 1.148, accuracy 0.591
Epoch 20, CIFAR-10 Batch 1:  valid loss 1.146, accuracy 0.592
Epoch 20, CIFAR-10 Batch 2:  valid loss 1.157, accuracy 0.592
Epoch 20, CIFAR-10 Batch 3:  valid loss 1.109, accuracy 0.606
Epoch 20, CIFAR-10 Batch 4:  valid loss 1.135, accuracy 0.601
Epoch 20, CIFAR-10 Batch 5:  valid loss 1.126, accuracy 0.598
Epoch 21, CIFAR-10 Batch 1:  valid loss 1.141, accuracy 0.595
Epoch 21, CIFAR-10 Batch 2:  valid loss 1.158, accuracy 0.584
Epoch 21, CIFAR-10 Batch 3:  valid loss 1.113, accuracy 0.606
Epoch 21, CIFAR-10 Batch 4:  valid loss 1.135, accuracy 0.597
Epoch 21, CIFAR-10 Batch 5:  valid loss 1.084, accuracy 0.612
Epoch 22, CIFAR-10 Batch 1:  valid loss 1.136, accuracy 0.608
Epoch 22, CIFAR-10 Batch 2:  valid loss 1.130, accuracy 0.601
Epoch 22, CIFAR-10 Batch 3:  valid loss 1.105, accuracy 0.613
Epoch 22, CIFAR-10 Batch 4:  valid loss 1.169, accuracy 0.585
Epoch 22, CIFAR-10 Batch 5:  valid loss 1.132, accuracy 0.594
Epoch 23, CIFAR-10 Batch 1:  valid loss 1.122, accuracy 0.606
Epoch 23, CIFAR-10 Batch 2:  valid loss 1.137, accuracy 0.596
Epoch 23, CIFAR-10 Batch 3:  valid loss 1.041, accuracy 0.632
Epoch 23, CIFAR-10 Batch 4:  valid loss 1.136, accuracy 0.596
Epoch 23, CIFAR-10 Batch 5:  valid loss 1.056, accuracy 0.632
Epoch 24, CIFAR-10 Batch 1:  valid loss 1.082, accuracy 0.622
Epoch 24, CIFAR-10 Batch 2:  valid loss 1.082, accuracy 0.615
Epoch 24, CIFAR-10 Batch 3:  valid loss 1.058, accuracy 0.630
Epoch 24, CIFAR-10 Batch 4:  valid loss 1.096, accuracy 0.613
Epoch 24, CIFAR-10 Batch 5:  valid loss 1.043, accuracy 0.634
Epoch 25, CIFAR-10 Batch 1:  valid loss 1.047, accuracy 0.631
Epoch 25, CIFAR-10 Batch 2:  valid loss 1.062, accuracy 0.628
Epoch 25, CIFAR-10 Batch 3:  valid loss 1.048, accuracy 0.631
Epoch 25, CIFAR-10 Batch 4:  valid loss 1.036, accuracy 0.630
Epoch 25, CIFAR-10 Batch 5:  valid loss 1.049, accuracy 0.634
Epoch 26, CIFAR-10 Batch 1:  valid loss 1.068, accuracy 0.625
Epoch 26, CIFAR-10 Batch 2:  valid loss 1.087, accuracy 0.623
Epoch 26, CIFAR-10 Batch 3:  valid loss 1.040, accuracy 0.637
Epoch 26, CIFAR-10 Batch 4:  valid loss 1.025, accuracy 0.646
Epoch 26, CIFAR-10 Batch 5:  valid loss 1.000, accuracy 0.645
Epoch 27, CIFAR-10 Batch 1:  valid loss 1.013, accuracy 0.641
Epoch 27, CIFAR-10 Batch 2:  valid loss 1.048, accuracy 0.635
Epoch 27, CIFAR-10 Batch 3:  valid loss 0.991, accuracy 0.651
Epoch 27, CIFAR-10 Batch 4:  valid loss 1.009, accuracy 0.643
Epoch 27, CIFAR-10 Batch 5:  valid loss 0.985, accuracy 0.650
Epoch 28, CIFAR-10 Batch 1:  valid loss 1.023, accuracy 0.646
Epoch 28, CIFAR-10 Batch 2:  valid loss 1.033, accuracy 0.644
Epoch 28, CIFAR-10 Batch 3:  valid loss 0.970, accuracy 0.652
Epoch 28, CIFAR-10 Batch 4:  valid loss 0.988, accuracy 0.653
Epoch 28, CIFAR-10 Batch 5:  valid loss 0.980, accuracy 0.656
Epoch 29, CIFAR-10 Batch 1:  valid loss 1.015, accuracy 0.650
Epoch 29, CIFAR-10 Batch 2:  valid loss 1.072, accuracy 0.623
Epoch 29, CIFAR-10 Batch 3:  valid loss 0.986, accuracy 0.650
Epoch 29, CIFAR-10 Batch 4:  valid loss 1.004, accuracy 0.650
Epoch 29, CIFAR-10 Batch 5:  valid loss 0.984, accuracy 0.659
Epoch 30, CIFAR-10 Batch 1:  valid loss 0.983, accuracy 0.654
Epoch 30, CIFAR-10 Batch 2:  valid loss 1.016, accuracy 0.645
Epoch 30, CIFAR-10 Batch 3:  valid loss 0.952, accuracy 0.666
Epoch 30, CIFAR-10 Batch 4:  valid loss 0.998, accuracy 0.646
Epoch 30, CIFAR-10 Batch 5:  valid loss 0.983, accuracy 0.651
Epoch 31, CIFAR-10 Batch 1:  valid loss 0.983, accuracy 0.657
Epoch 31, CIFAR-10 Batch 2:  valid loss 1.055, accuracy 0.634
Epoch 31, CIFAR-10 Batch 3:  valid loss 0.958, accuracy 0.661
Epoch 31, CIFAR-10 Batch 4:  valid loss 0.965, accuracy 0.656
Epoch 31, CIFAR-10 Batch 5:  valid loss 0.956, accuracy 0.664
Epoch 32, CIFAR-10 Batch 1:  valid loss 0.969, accuracy 0.656
Epoch 32, CIFAR-10 Batch 2:  valid loss 1.011, accuracy 0.649
Epoch 32, CIFAR-10 Batch 3:  valid loss 0.939, accuracy 0.668
Epoch 32, CIFAR-10 Batch 4:  valid loss 0.974, accuracy 0.656
Epoch 32, CIFAR-10 Batch 5:  valid loss 0.938, accuracy 0.673
Epoch 33, CIFAR-10 Batch 1:  valid loss 0.943, accuracy 0.669
Epoch 33, CIFAR-10 Batch 2:  valid loss 0.969, accuracy 0.660
Epoch 33, CIFAR-10 Batch 3:  valid loss 0.929, accuracy 0.669
Epoch 33, CIFAR-10 Batch 4:  valid loss 0.957, accuracy 0.662
Epoch 33, CIFAR-10 Batch 5:  valid loss 0.970, accuracy 0.658
Epoch 34, CIFAR-10 Batch 1:  valid loss 0.931, accuracy 0.677
Epoch 34, CIFAR-10 Batch 2:  valid loss 1.003, accuracy 0.648
Epoch 34, CIFAR-10 Batch 3:  valid loss 0.934, accuracy 0.671
Epoch 34, CIFAR-10 Batch 4:  valid loss 0.965, accuracy 0.659
Epoch 34, CIFAR-10 Batch 5:  valid loss 0.927, accuracy 0.675
Epoch 35, CIFAR-10 Batch 1:  valid loss 0.935, accuracy 0.673
Epoch 35, CIFAR-10 Batch 2:  valid loss 0.958, accuracy 0.664
Epoch 35, CIFAR-10 Batch 3:  valid loss 0.909, accuracy 0.679
Epoch 35, CIFAR-10 Batch 4:  valid loss 0.939, accuracy 0.666
Epoch 35, CIFAR-10 Batch 5:  valid loss 0.920, accuracy 0.679
Epoch 36, CIFAR-10 Batch 1:  valid loss 0.925, accuracy 0.676
Epoch 36, CIFAR-10 Batch 2:  valid loss 1.027, accuracy 0.639
Epoch 36, CIFAR-10 Batch 3:  valid loss 0.932, accuracy 0.675
Epoch 36, CIFAR-10 Batch 4:  valid loss 0.927, accuracy 0.671
Epoch 36, CIFAR-10 Batch 5:  valid loss 0.960, accuracy 0.664
Epoch 37, CIFAR-10 Batch 1:  valid loss 0.907, accuracy 0.687
Epoch 37, CIFAR-10 Batch 2:  valid loss 1.035, accuracy 0.636
Epoch 37, CIFAR-10 Batch 3:  valid loss 0.896, accuracy 0.688
Epoch 37, CIFAR-10 Batch 4:  valid loss 0.932, accuracy 0.671
Epoch 37, CIFAR-10 Batch 5:  valid loss 0.934, accuracy 0.670
Epoch 38, CIFAR-10 Batch 1:  valid loss 0.930, accuracy 0.672
Epoch 38, CIFAR-10 Batch 2:  valid loss 0.968, accuracy 0.661
Epoch 38, CIFAR-10 Batch 3:  valid loss 0.890, accuracy 0.684
Epoch 38, CIFAR-10 Batch 4:  valid loss 0.930, accuracy 0.677
Epoch 38, CIFAR-10 Batch 5:  valid loss 0.983, accuracy 0.653
Epoch 39, CIFAR-10 Batch 1:  valid loss 0.898, accuracy 0.686
Epoch 39, CIFAR-10 Batch 2:  valid loss 0.965, accuracy 0.658
Epoch 39, CIFAR-10 Batch 3:  valid loss 0.879, accuracy 0.692
Epoch 39, CIFAR-10 Batch 4:  valid loss 0.935, accuracy 0.673
Epoch 39, CIFAR-10 Batch 5:  valid loss 0.959, accuracy 0.669
Epoch 40, CIFAR-10 Batch 1:  valid loss 0.880, accuracy 0.694
Epoch 40, CIFAR-10 Batch 2:  valid loss 0.961, accuracy 0.668
Epoch 40, CIFAR-10 Batch 3:  valid loss 0.877, accuracy 0.693
Epoch 40, CIFAR-10 Batch 4:  valid loss 0.919, accuracy 0.683
Epoch 40, CIFAR-10 Batch 5:  valid loss 0.976, accuracy 0.661
Epoch 41, CIFAR-10 Batch 1:  valid loss 0.876, accuracy 0.699
Epoch 41, CIFAR-10 Batch 2:  valid loss 0.995, accuracy 0.656
Epoch 41, CIFAR-10 Batch 3:  valid loss 0.933, accuracy 0.679
Epoch 41, CIFAR-10 Batch 4:  valid loss 0.939, accuracy 0.670
Epoch 41, CIFAR-10 Batch 5:  valid loss 0.930, accuracy 0.675
Epoch 42, CIFAR-10 Batch 1:  valid loss 0.891, accuracy 0.695
Epoch 42, CIFAR-10 Batch 2:  valid loss 0.993, accuracy 0.660
Epoch 42, CIFAR-10 Batch 3:  valid loss 0.853, accuracy 0.700
Epoch 42, CIFAR-10 Batch 4:  valid loss 0.927, accuracy 0.682
Epoch 42, CIFAR-10 Batch 5:  valid loss 0.873, accuracy 0.692
Epoch 43, CIFAR-10 Batch 1:  valid loss 0.880, accuracy 0.696
Epoch 43, CIFAR-10 Batch 2:  valid loss 0.941, accuracy 0.671
Epoch 43, CIFAR-10 Batch 3:  valid loss 0.893, accuracy 0.691
Epoch 43, CIFAR-10 Batch 4:  valid loss 0.896, accuracy 0.694
Epoch 43, CIFAR-10 Batch 5:  valid loss 0.872, accuracy 0.700
Epoch 44, CIFAR-10 Batch 1:  valid loss 0.870, accuracy 0.700
Epoch 44, CIFAR-10 Batch 2:  valid loss 0.924, accuracy 0.685
Epoch 44, CIFAR-10 Batch 3:  valid loss 0.876, accuracy 0.697
Epoch 44, CIFAR-10 Batch 4:  valid loss 0.887, accuracy 0.692
Epoch 44, CIFAR-10 Batch 5:  valid loss 0.917, accuracy 0.688
Epoch 45, CIFAR-10 Batch 1:  valid loss 0.888, accuracy 0.692
Epoch 45, CIFAR-10 Batch 2:  valid loss 0.945, accuracy 0.680
Epoch 45, CIFAR-10 Batch 3:  valid loss 0.864, accuracy 0.701
Epoch 45, CIFAR-10 Batch 4:  valid loss 0.947, accuracy 0.672
Epoch 45, CIFAR-10 Batch 5:  valid loss 0.872, accuracy 0.698
Epoch 46, CIFAR-10 Batch 1:  valid loss 0.845, accuracy 0.708
Epoch 46, CIFAR-10 Batch 2:  valid loss 0.884, accuracy 0.692
Epoch 46, CIFAR-10 Batch 3:  valid loss 0.854, accuracy 0.701
Epoch 46, CIFAR-10 Batch 4:  valid loss 0.857, accuracy 0.701
Epoch 46, CIFAR-10 Batch 5:  valid loss 0.905, accuracy 0.687
Epoch 47, CIFAR-10 Batch 1:  valid loss 0.858, accuracy 0.710
Epoch 47, CIFAR-10 Batch 2:  valid loss 0.932, accuracy 0.678
Epoch 47, CIFAR-10 Batch 3:  valid loss 0.851, accuracy 0.706
Epoch 47, CIFAR-10 Batch 4:  valid loss 0.935, accuracy 0.682
Epoch 47, CIFAR-10 Batch 5:  valid loss 0.921, accuracy 0.683
Epoch 48, CIFAR-10 Batch 1:  valid loss 0.850, accuracy 0.707
Epoch 48, CIFAR-10 Batch 2:  valid loss 0.957, accuracy 0.667
Epoch 48, CIFAR-10 Batch 3:  valid loss 0.851, accuracy 0.707
Epoch 48, CIFAR-10 Batch 4:  valid loss 0.950, accuracy 0.673
Epoch 48, CIFAR-10 Batch 5:  valid loss 0.906, accuracy 0.689
Epoch 49, CIFAR-10 Batch 1:  valid loss 0.820, accuracy 0.716
Epoch 49, CIFAR-10 Batch 2:  valid loss 0.938, accuracy 0.672
Epoch 49, CIFAR-10 Batch 3:  valid loss 0.902, accuracy 0.694
Epoch 49, CIFAR-10 Batch 4:  valid loss 0.921, accuracy 0.681
Epoch 49, CIFAR-10 Batch 5:  valid loss 0.883, accuracy 0.696
Epoch 50, CIFAR-10 Batch 1:  valid loss 0.847, accuracy 0.712
Epoch 50, CIFAR-10 Batch 2:  valid loss 0.932, accuracy 0.678
Epoch 50, CIFAR-10 Batch 3:  valid loss 0.895, accuracy 0.700
Epoch 50, CIFAR-10 Batch 4:  valid loss 0.979, accuracy 0.665
Epoch 50, CIFAR-10 Batch 5:  valid loss 0.978, accuracy 0.664
Epoch 51, CIFAR-10 Batch 1:  valid loss 0.850, accuracy 0.706
Epoch 51, CIFAR-10 Batch 2:  valid loss 0.909, accuracy 0.685
Epoch 51, CIFAR-10 Batch 3:  valid loss 0.879, accuracy 0.702
Epoch 51, CIFAR-10 Batch 4:  valid loss 0.893, accuracy 0.690
Epoch 51, CIFAR-10 Batch 5:  valid loss 0.869, accuracy 0.697
Epoch 52, CIFAR-10 Batch 1:  valid loss 0.839, accuracy 0.710
Epoch 52, CIFAR-10 Batch 2:  valid loss 0.871, accuracy 0.701
Epoch 52, CIFAR-10 Batch 3:  valid loss 0.843, accuracy 0.711
Epoch 52, CIFAR-10 Batch 4:  valid loss 0.880, accuracy 0.693
Epoch 52, CIFAR-10 Batch 5:  valid loss 0.849, accuracy 0.707
Epoch 53, CIFAR-10 Batch 1:  valid loss 0.842, accuracy 0.709
Epoch 53, CIFAR-10 Batch 2:  valid loss 0.898, accuracy 0.693
Epoch 53, CIFAR-10 Batch 3:  valid loss 0.852, accuracy 0.713
Epoch 53, CIFAR-10 Batch 4:  valid loss 0.935, accuracy 0.674
Epoch 53, CIFAR-10 Batch 5:  valid loss 0.856, accuracy 0.706
Epoch 54, CIFAR-10 Batch 1:  valid loss 0.813, accuracy 0.720
Epoch 54, CIFAR-10 Batch 2:  valid loss 0.841, accuracy 0.710
Epoch 54, CIFAR-10 Batch 3:  valid loss 0.874, accuracy 0.697
Epoch 54, CIFAR-10 Batch 4:  valid loss 0.843, accuracy 0.706
Epoch 54, CIFAR-10 Batch 5:  valid loss 0.888, accuracy 0.694
Epoch 55, CIFAR-10 Batch 1:  valid loss 0.823, accuracy 0.713
Epoch 55, CIFAR-10 Batch 2:  valid loss 0.835, accuracy 0.715
Epoch 55, CIFAR-10 Batch 3:  valid loss 0.873, accuracy 0.702
Epoch 55, CIFAR-10 Batch 4:  valid loss 0.856, accuracy 0.700
Epoch 55, CIFAR-10 Batch 5:  valid loss 0.826, accuracy 0.713
Epoch 56, CIFAR-10 Batch 1:  valid loss 0.809, accuracy 0.723
Epoch 56, CIFAR-10 Batch 2:  valid loss 0.840, accuracy 0.708
Epoch 56, CIFAR-10 Batch 3:  valid loss 0.853, accuracy 0.710
Epoch 56, CIFAR-10 Batch 4:  valid loss 0.857, accuracy 0.696
Epoch 56, CIFAR-10 Batch 5:  valid loss 0.859, accuracy 0.708
Epoch 57, CIFAR-10 Batch 1:  valid loss 0.803, accuracy 0.723
Epoch 57, CIFAR-10 Batch 2:  valid loss 0.848, accuracy 0.709
Epoch 57, CIFAR-10 Batch 3:  valid loss 0.862, accuracy 0.704
Epoch 57, CIFAR-10 Batch 4:  valid loss 0.856, accuracy 0.705
Epoch 57, CIFAR-10 Batch 5:  valid loss 0.862, accuracy 0.704
Epoch 58, CIFAR-10 Batch 1:  valid loss 0.813, accuracy 0.718
Epoch 58, CIFAR-10 Batch 2:  valid loss 0.838, accuracy 0.714
Epoch 58, CIFAR-10 Batch 3:  valid loss 0.808, accuracy 0.718
Epoch 58, CIFAR-10 Batch 4:  valid loss 0.879, accuracy 0.698
Epoch 58, CIFAR-10 Batch 5:  valid loss 0.838, accuracy 0.712
Epoch 59, CIFAR-10 Batch 1:  valid loss 0.810, accuracy 0.719
Epoch 59, CIFAR-10 Batch 2:  valid loss 0.834, accuracy 0.713
Epoch 59, CIFAR-10 Batch 3:  valid loss 0.831, accuracy 0.706
Epoch 59, CIFAR-10 Batch 4:  valid loss 0.832, accuracy 0.709
Epoch 59, CIFAR-10 Batch 5:  valid loss 0.872, accuracy 0.701
Epoch 60, CIFAR-10 Batch 1:  valid loss 0.797, accuracy 0.728
Epoch 60, CIFAR-10 Batch 2:  valid loss 0.836, accuracy 0.713
Epoch 60, CIFAR-10 Batch 3:  valid loss 0.860, accuracy 0.702
Epoch 60, CIFAR-10 Batch 4:  valid loss 0.885, accuracy 0.700
Epoch 60, CIFAR-10 Batch 5:  valid loss 0.854, accuracy 0.706
Epoch 61, CIFAR-10 Batch 1:  valid loss 0.806, accuracy 0.724
Epoch 61, CIFAR-10 Batch 2:  valid loss 0.875, accuracy 0.700
Epoch 61, CIFAR-10 Batch 3:  valid loss 0.836, accuracy 0.715
Epoch 61, CIFAR-10 Batch 4:  valid loss 0.827, accuracy 0.720
Epoch 61, CIFAR-10 Batch 5:  valid loss 0.849, accuracy 0.712
Epoch 62, CIFAR-10 Batch 1:  valid loss 0.808, accuracy 0.725
Epoch 62, CIFAR-10 Batch 2:  valid loss 0.830, accuracy 0.712
Epoch 62, CIFAR-10 Batch 3:  valid loss 0.853, accuracy 0.706
Epoch 62, CIFAR-10 Batch 4:  valid loss 0.827, accuracy 0.712
Epoch 62, CIFAR-10 Batch 5:  valid loss 0.847, accuracy 0.715
Epoch 63, CIFAR-10 Batch 1:  valid loss 0.814, accuracy 0.722
Epoch 63, CIFAR-10 Batch 2:  valid loss 0.873, accuracy 0.699
Epoch 63, CIFAR-10 Batch 3:  valid loss 0.825, accuracy 0.716
Epoch 63, CIFAR-10 Batch 4:  valid loss 0.844, accuracy 0.707
Epoch 63, CIFAR-10 Batch 5:  valid loss 0.829, accuracy 0.716
Epoch 64, CIFAR-10 Batch 1:  valid loss 0.852, accuracy 0.706
Epoch 64, CIFAR-10 Batch 2:  valid loss 0.830, accuracy 0.708
Epoch 64, CIFAR-10 Batch 3:  valid loss 0.815, accuracy 0.714
Epoch 64, CIFAR-10 Batch 4:  valid loss 0.826, accuracy 0.713
Epoch 64, CIFAR-10 Batch 5:  valid loss 0.797, accuracy 0.727
Epoch 65, CIFAR-10 Batch 1:  valid loss 0.800, accuracy 0.724
Epoch 65, CIFAR-10 Batch 2:  valid loss 0.833, accuracy 0.707
Epoch 65, CIFAR-10 Batch 3:  valid loss 0.853, accuracy 0.705
Epoch 65, CIFAR-10 Batch 4:  valid loss 0.846, accuracy 0.706
Epoch 65, CIFAR-10 Batch 5:  valid loss 0.872, accuracy 0.704
Epoch 66, CIFAR-10 Batch 1:  valid loss 0.797, accuracy 0.724
Epoch 66, CIFAR-10 Batch 2:  valid loss 0.889, accuracy 0.692
Epoch 66, CIFAR-10 Batch 3:  valid loss 0.819, accuracy 0.719
Epoch 66, CIFAR-10 Batch 4:  valid loss 0.829, accuracy 0.716
Epoch 66, CIFAR-10 Batch 5:  valid loss 0.840, accuracy 0.718
Epoch 67, CIFAR-10 Batch 1:  valid loss 0.783, accuracy 0.732
Epoch 67, CIFAR-10 Batch 2:  valid loss 0.835, accuracy 0.713
Epoch 67, CIFAR-10 Batch 3:  valid loss 0.809, accuracy 0.726
Epoch 67, CIFAR-10 Batch 4:  valid loss 0.863, accuracy 0.701
Epoch 67, CIFAR-10 Batch 5:  valid loss 0.829, accuracy 0.712
Epoch 68, CIFAR-10 Batch 1:  valid loss 0.789, accuracy 0.727
Epoch 68, CIFAR-10 Batch 2:  valid loss 0.861, accuracy 0.706
Epoch 68, CIFAR-10 Batch 3:  valid loss 0.822, accuracy 0.721
Epoch 68, CIFAR-10 Batch 4:  valid loss 0.868, accuracy 0.699
Epoch 68, CIFAR-10 Batch 5:  valid loss 0.821, accuracy 0.715
Epoch 69, CIFAR-10 Batch 1:  valid loss 0.815, accuracy 0.718
Epoch 69, CIFAR-10 Batch 2:  valid loss 0.858, accuracy 0.704
Epoch 69, CIFAR-10 Batch 3:  valid loss 0.823, accuracy 0.719
Epoch 69, CIFAR-10 Batch 4:  valid loss 0.857, accuracy 0.710
Epoch 69, CIFAR-10 Batch 5:  valid loss 0.856, accuracy 0.712
Epoch 70, CIFAR-10 Batch 1:  valid loss 0.817, accuracy 0.721
Epoch 70, CIFAR-10 Batch 2:  valid loss 0.841, accuracy 0.712
Epoch 70, CIFAR-10 Batch 3:  valid loss 0.844, accuracy 0.715
Epoch 70, CIFAR-10 Batch 4:  valid loss 0.833, accuracy 0.713
Epoch 70, CIFAR-10 Batch 5:  valid loss 0.848, accuracy 0.709
Epoch 71, CIFAR-10 Batch 1:  valid loss 0.795, accuracy 0.732
Epoch 71, CIFAR-10 Batch 2:  valid loss 0.860, accuracy 0.702
Epoch 71, CIFAR-10 Batch 3:  valid loss 0.960, accuracy 0.680
Epoch 71, CIFAR-10 Batch 4:  valid loss 0.823, accuracy 0.718
Epoch 71, CIFAR-10 Batch 5:  valid loss 0.812, accuracy 0.724
Epoch 72, CIFAR-10 Batch 1:  valid loss 0.784, accuracy 0.736
Epoch 72, CIFAR-10 Batch 2:  valid loss 0.860, accuracy 0.704
Epoch 72, CIFAR-10 Batch 3:  valid loss 0.848, accuracy 0.713
Epoch 72, CIFAR-10 Batch 4:  valid loss 0.857, accuracy 0.706
Epoch 72, CIFAR-10 Batch 5:  valid loss 0.812, accuracy 0.724
Epoch 73, CIFAR-10 Batch 1:  valid loss 0.814, accuracy 0.727
Epoch 73, CIFAR-10 Batch 2:  valid loss 0.844, accuracy 0.715
Epoch 73, CIFAR-10 Batch 3:  valid loss 0.849, accuracy 0.717
Epoch 73, CIFAR-10 Batch 4:  valid loss 0.816, accuracy 0.722
Epoch 73, CIFAR-10 Batch 5:  valid loss 0.815, accuracy 0.724
Epoch 74, CIFAR-10 Batch 1:  valid loss 0.797, accuracy 0.729
Epoch 74, CIFAR-10 Batch 2:  valid loss 0.825, accuracy 0.716
Epoch 74, CIFAR-10 Batch 3:  valid loss 0.869, accuracy 0.706
Epoch 74, CIFAR-10 Batch 4:  valid loss 0.827, accuracy 0.715
Epoch 74, CIFAR-10 Batch 5:  valid loss 0.832, accuracy 0.721
Epoch 75, CIFAR-10 Batch 1:  valid loss 0.777, accuracy 0.738
Epoch 75, CIFAR-10 Batch 2:  valid loss 0.883, accuracy 0.700
Epoch 75, CIFAR-10 Batch 3:  valid loss 0.862, accuracy 0.711
Epoch 75, CIFAR-10 Batch 4:  valid loss 0.803, accuracy 0.720
Epoch 75, CIFAR-10 Batch 5:  valid loss 0.815, accuracy 0.722
Epoch 76, CIFAR-10 Batch 1:  valid loss 0.799, accuracy 0.730
Epoch 76, CIFAR-10 Batch 2:  valid loss 0.839, accuracy 0.715
Epoch 76, CIFAR-10 Batch 3:  valid loss 0.819, accuracy 0.725
Epoch 76, CIFAR-10 Batch 4:  valid loss 0.851, accuracy 0.710
Epoch 76, CIFAR-10 Batch 5:  valid loss 0.823, accuracy 0.724
Epoch 77, CIFAR-10 Batch 1:  valid loss 0.804, accuracy 0.724
Epoch 77, CIFAR-10 Batch 2:  valid loss 0.903, accuracy 0.698
Epoch 77, CIFAR-10 Batch 3:  valid loss 0.837, accuracy 0.716
Epoch 77, CIFAR-10 Batch 4:  valid loss 0.898, accuracy 0.696
Epoch 77, CIFAR-10 Batch 5:  valid loss 0.792, accuracy 0.729
Epoch 78, CIFAR-10 Batch 1:  valid loss 0.775, accuracy 0.734
Epoch 78, CIFAR-10 Batch 2:  valid loss 0.870, accuracy 0.706
Epoch 78, CIFAR-10 Batch 3:  valid loss 0.829, accuracy 0.716
Epoch 78, CIFAR-10 Batch 4:  valid loss 0.834, accuracy 0.721
Epoch 78, CIFAR-10 Batch 5:  valid loss 0.804, accuracy 0.732
Epoch 79, CIFAR-10 Batch 1:  valid loss 0.784, accuracy 0.739
Epoch 79, CIFAR-10 Batch 2:  valid loss 0.842, accuracy 0.715
Epoch 79, CIFAR-10 Batch 3:  valid loss 0.845, accuracy 0.722
Epoch 79, CIFAR-10 Batch 4:  valid loss 0.807, accuracy 0.726
Epoch 79, CIFAR-10 Batch 5:  valid loss 0.785, accuracy 0.735
Epoch 80, CIFAR-10 Batch 1:  valid loss 0.783, accuracy 0.740
Epoch 80, CIFAR-10 Batch 2:  valid loss 0.833, accuracy 0.716
Epoch 80, CIFAR-10 Batch 3:  valid loss 0.847, accuracy 0.720
Epoch 80, CIFAR-10 Batch 4:  valid loss 0.796, accuracy 0.727
Epoch 80, CIFAR-10 Batch 5:  valid loss 0.785, accuracy 0.732
Epoch 81, CIFAR-10 Batch 1:  valid loss 0.774, accuracy 0.745
Epoch 81, CIFAR-10 Batch 2:  valid loss 0.816, accuracy 0.725
Epoch 81, CIFAR-10 Batch 3:  valid loss 0.839, accuracy 0.723
Epoch 81, CIFAR-10 Batch 4:  valid loss 0.788, accuracy 0.734
Epoch 81, CIFAR-10 Batch 5:  valid loss 0.821, accuracy 0.725
Epoch 82, CIFAR-10 Batch 1:  valid loss 0.795, accuracy 0.735
Epoch 82, CIFAR-10 Batch 2:  valid loss 0.871, accuracy 0.705
Epoch 82, CIFAR-10 Batch 3:  valid loss 0.850, accuracy 0.719
Epoch 82, CIFAR-10 Batch 4:  valid loss 0.838, accuracy 0.719
Epoch 82, CIFAR-10 Batch 5:  valid loss 0.828, accuracy 0.721
Epoch 83, CIFAR-10 Batch 1:  valid loss 0.792, accuracy 0.737
Epoch 83, CIFAR-10 Batch 2:  valid loss 0.847, accuracy 0.713
Epoch 83, CIFAR-10 Batch 3:  valid loss 0.842, accuracy 0.721
Epoch 83, CIFAR-10 Batch 4:  valid loss 0.788, accuracy 0.733
Epoch 83, CIFAR-10 Batch 5:  valid loss 0.806, accuracy 0.728
Epoch 84, CIFAR-10 Batch 1:  valid loss 0.796, accuracy 0.733
Epoch 84, CIFAR-10 Batch 2:  valid loss 0.823, accuracy 0.724
Epoch 84, CIFAR-10 Batch 3:  valid loss 0.825, accuracy 0.727
Epoch 84, CIFAR-10 Batch 4:  valid loss 0.796, accuracy 0.733
Epoch 84, CIFAR-10 Batch 5:  valid loss 0.844, accuracy 0.718
Epoch 85, CIFAR-10 Batch 1:  valid loss 0.776, accuracy 0.744
Epoch 85, CIFAR-10 Batch 2:  valid loss 0.837, accuracy 0.716
Epoch 85, CIFAR-10 Batch 3:  valid loss 0.859, accuracy 0.720
Epoch 85, CIFAR-10 Batch 4:  valid loss 0.799, accuracy 0.730
Epoch 85, CIFAR-10 Batch 5:  valid loss 0.848, accuracy 0.719
Epoch 86, CIFAR-10 Batch 1:  valid loss 0.784, accuracy 0.737
Epoch 86, CIFAR-10 Batch 2:  valid loss 0.866, accuracy 0.712
Epoch 86, CIFAR-10 Batch 3:  valid loss 0.851, accuracy 0.721
Epoch 86, CIFAR-10 Batch 4:  valid loss 0.834, accuracy 0.717
Epoch 86, CIFAR-10 Batch 5:  valid loss 0.822, accuracy 0.727
Epoch 87, CIFAR-10 Batch 1:  valid loss 0.784, accuracy 0.736
Epoch 87, CIFAR-10 Batch 2:  valid loss 0.852, accuracy 0.717
Epoch 87, CIFAR-10 Batch 3:  valid loss 0.818, accuracy 0.731
Epoch 87, CIFAR-10 Batch 4:  valid loss 0.812, accuracy 0.726
Epoch 87, CIFAR-10 Batch 5:  valid loss 0.818, accuracy 0.728
Epoch 88, CIFAR-10 Batch 1:  valid loss 0.781, accuracy 0.746
Epoch 88, CIFAR-10 Batch 2:  valid loss 0.813, accuracy 0.728
Epoch 88, CIFAR-10 Batch 3:  valid loss 0.875, accuracy 0.716
Epoch 88, CIFAR-10 Batch 4:  valid loss 0.826, accuracy 0.723
Epoch 88, CIFAR-10 Batch 5:  valid loss 0.808, accuracy 0.734
Epoch 89, CIFAR-10 Batch 1:  valid loss 0.803, accuracy 0.732
Epoch 89, CIFAR-10 Batch 2:  valid loss 0.860, accuracy 0.716
Epoch 89, CIFAR-10 Batch 3:  valid loss 0.830, accuracy 0.725
Epoch 89, CIFAR-10 Batch 4:  valid loss 0.795, accuracy 0.735
Epoch 89, CIFAR-10 Batch 5:  valid loss 0.855, accuracy 0.719
Epoch 90, CIFAR-10 Batch 1:  valid loss 0.778, accuracy 0.742
Epoch 90, CIFAR-10 Batch 2:  valid loss 0.849, accuracy 0.720
Epoch 90, CIFAR-10 Batch 3:  valid loss 0.893, accuracy 0.708
Epoch 90, CIFAR-10 Batch 4:  valid loss 0.834, accuracy 0.723
Epoch 90, CIFAR-10 Batch 5:  valid loss 0.794, accuracy 0.740
Epoch 91, CIFAR-10 Batch 1:  valid loss 0.775, accuracy 0.744
Epoch 91, CIFAR-10 Batch 2:  valid loss 0.839, accuracy 0.724
Epoch 91, CIFAR-10 Batch 3:  valid loss 0.882, accuracy 0.709
Epoch 91, CIFAR-10 Batch 4:  valid loss 0.814, accuracy 0.726
Epoch 91, CIFAR-10 Batch 5:  valid loss 0.821, accuracy 0.732
Epoch 92, CIFAR-10 Batch 1:  valid loss 0.779, accuracy 0.742
Epoch 92, CIFAR-10 Batch 2:  valid loss 0.889, accuracy 0.703
Epoch 92, CIFAR-10 Batch 3:  valid loss 0.838, accuracy 0.727
Epoch 92, CIFAR-10 Batch 4:  valid loss 0.781, accuracy 0.739
Epoch 92, CIFAR-10 Batch 5:  valid loss 0.829, accuracy 0.729
Epoch 93, CIFAR-10 Batch 1:  valid loss 0.786, accuracy 0.743
Epoch 93, CIFAR-10 Batch 2:  valid loss 0.905, accuracy 0.703
Epoch 93, CIFAR-10 Batch 3:  valid loss 0.868, accuracy 0.717
Epoch 93, CIFAR-10 Batch 4:  valid loss 0.781, accuracy 0.735
Epoch 93, CIFAR-10 Batch 5:  valid loss 0.825, accuracy 0.730
Epoch 94, CIFAR-10 Batch 1:  valid loss 0.787, accuracy 0.742
Epoch 94, CIFAR-10 Batch 2:  valid loss 0.944, accuracy 0.694
Epoch 94, CIFAR-10 Batch 3:  valid loss 0.882, accuracy 0.717
Epoch 94, CIFAR-10 Batch 4:  valid loss 0.803, accuracy 0.734
Epoch 94, CIFAR-10 Batch 5:  valid loss 0.810, accuracy 0.734
Epoch 95, CIFAR-10 Batch 1:  valid loss 0.788, accuracy 0.741
Epoch 95, CIFAR-10 Batch 2:  valid loss 0.937, accuracy 0.690
Epoch 95, CIFAR-10 Batch 3:  valid loss 0.868, accuracy 0.719
Epoch 95, CIFAR-10 Batch 4:  valid loss 0.806, accuracy 0.739
Epoch 95, CIFAR-10 Batch 5:  valid loss 0.823, accuracy 0.733
Epoch 96, CIFAR-10 Batch 1:  valid loss 0.796, accuracy 0.736
Epoch 96, CIFAR-10 Batch 2:  valid loss 0.840, accuracy 0.722
Epoch 96, CIFAR-10 Batch 3:  valid loss 0.866, accuracy 0.719
Epoch 96, CIFAR-10 Batch 4:  valid loss 0.811, accuracy 0.728
Epoch 96, CIFAR-10 Batch 5:  valid loss 0.805, accuracy 0.735
Epoch 97, CIFAR-10 Batch 1:  valid loss 0.797, accuracy 0.739
Epoch 97, CIFAR-10 Batch 2:  valid loss 0.884, accuracy 0.713
Epoch 97, CIFAR-10 Batch 3:  valid loss 0.817, accuracy 0.733
Epoch 97, CIFAR-10 Batch 4:  valid loss 0.817, accuracy 0.729
Epoch 97, CIFAR-10 Batch 5:  valid loss 0.794, accuracy 0.743
Epoch 98, CIFAR-10 Batch 1:  valid loss 0.774, accuracy 0.746
Epoch 98, CIFAR-10 Batch 2:  valid loss 0.845, accuracy 0.724
Epoch 98, CIFAR-10 Batch 3:  valid loss 0.838, accuracy 0.730
Epoch 98, CIFAR-10 Batch 4:  valid loss 0.789, accuracy 0.735
Epoch 98, CIFAR-10 Batch 5:  valid loss 0.830, accuracy 0.731
Epoch 99, CIFAR-10 Batch 1:  valid loss 0.790, accuracy 0.743
Epoch 99, CIFAR-10 Batch 2:  valid loss 0.875, accuracy 0.714
Epoch 99, CIFAR-10 Batch 3:  valid loss 0.811, accuracy 0.734
Epoch 99, CIFAR-10 Batch 4:  valid loss 0.792, accuracy 0.737
Epoch 99, CIFAR-10 Batch 5:  valid loss 0.808, accuracy 0.734
Epoch 100, CIFAR-10 Batch 1:  valid loss 0.813, accuracy 0.736
Epoch 100, CIFAR-10 Batch 2:  valid loss 0.859, accuracy 0.720
Epoch 100, CIFAR-10 Batch 3:  valid loss 0.777, accuracy 0.743
Epoch 100, CIFAR-10 Batch 4:  valid loss 0.799, accuracy 0.738
Epoch 100, CIFAR-10 Batch 5:  valid loss 0.819, accuracy 0.735

Checkpoint

The model has been saved to disk.

Test Model

Test your model against the test dataset. This will be your final accuracy. Your accuracy should be greater than 50%. If it isn't, keep tweaking the model architecture and parameters.


In [34]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
%matplotlib inline
%config InlineBackend.figure_format = 'retina'

import tensorflow as tf
import pickle
import helper
import random

# Set batch size if not already set
try:
    if batch_size:
        pass
except NameError:
    batch_size = 64

save_model_path = './image_classification'
n_samples = 4
top_n_predictions = 3

def test_model():
    """
    Test the saved model against the test dataset
    """

    test_features, test_labels = pickle.load(open('preprocess_test.p', mode='rb'))
    loaded_graph = tf.Graph()

    with tf.Session(graph=loaded_graph) as sess:
        # Load model
        loader = tf.train.import_meta_graph(save_model_path + '.meta')
        loader.restore(sess, save_model_path)

        # Get Tensors from loaded model
        loaded_x = loaded_graph.get_tensor_by_name('x:0')
        loaded_y = loaded_graph.get_tensor_by_name('y:0')
        loaded_keep_prob = loaded_graph.get_tensor_by_name('keep_prob:0')
        loaded_logits = loaded_graph.get_tensor_by_name('logits:0')
        loaded_acc = loaded_graph.get_tensor_by_name('accuracy:0')
        
        # Get accuracy in batches for memory limitations
        test_batch_acc_total = 0
        test_batch_count = 0
        
        for test_feature_batch, test_label_batch in helper.batch_features_labels(test_features, test_labels, batch_size):
            test_batch_acc_total += sess.run(
                loaded_acc,
                feed_dict={loaded_x: test_feature_batch, loaded_y: test_label_batch, loaded_keep_prob: 1.0})
            test_batch_count += 1

        print('Testing Accuracy: {}\n'.format(test_batch_acc_total/test_batch_count))

        # Print Random Samples
        random_test_features, random_test_labels = tuple(zip(*random.sample(list(zip(test_features, test_labels)), n_samples)))
        random_test_predictions = sess.run(
            tf.nn.top_k(tf.nn.softmax(loaded_logits), top_n_predictions),
            feed_dict={loaded_x: random_test_features, loaded_y: random_test_labels, loaded_keep_prob: 1.0})
        helper.display_image_predictions(random_test_features, random_test_labels, random_test_predictions)


test_model()


INFO:tensorflow:Restoring parameters from ./image_classification
Testing Accuracy: 0.72919921875

Why 50-80% Accuracy?

You might be wondering why the accuracy can't go any higher. First of all, 50% isn't bad for a simple CNN; pure guessing would get you only 10%. However, you may notice that some people score well above 80%. That's because we haven't covered everything there is to know about neural networks yet. There are still a few more techniques to learn.

Submitting This Project

When submitting this project, make sure to run all the cells before saving the notebook. Save the notebook file as "dlnd_image_classification.ipynb" and also save it as an HTML file under "File" -> "Download as". Include the "helper.py" and "problem_unittests.py" files with your submission.