Convolutional Autoencoder

Sticking with the MNIST dataset, let's improve our autoencoder's performance using convolutional layers. As before, we start by loading the modules and the data.


In [2]:
%matplotlib inline

import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt

In [3]:
from tensorflow.examples.tutorials.mnist import input_data
mnist = input_data.read_data_sets('MNIST_data', validation_size=0)


Successfully downloaded train-images-idx3-ubyte.gz 9912422 bytes.
Extracting MNIST_data/train-images-idx3-ubyte.gz
Successfully downloaded train-labels-idx1-ubyte.gz 28881 bytes.
Extracting MNIST_data/train-labels-idx1-ubyte.gz
Successfully downloaded t10k-images-idx3-ubyte.gz 1648877 bytes.
Extracting MNIST_data/t10k-images-idx3-ubyte.gz
Successfully downloaded t10k-labels-idx1-ubyte.gz 4542 bytes.
Extracting MNIST_data/t10k-labels-idx1-ubyte.gz

In [6]:
img = mnist.train.images[300]
plt.imshow(img.reshape((28, 28)), cmap='Greys_r')


Out[6]:
<matplotlib.image.AxesImage at 0x12acc7400>

In [7]:
len(mnist.train.images)


Out[7]:
60000

Network Architecture

The encoder part of the network will be a typical convolutional pyramid. Each convolutional layer will be followed by a max-pooling layer to reduce the dimensions of the layers. The decoder, though, might be something new to you. The decoder needs to convert from a narrow representation to a wide reconstructed image. For example, the representation could be a 4x4x8 max-pool layer. This is the output of the encoder, but also the input to the decoder. We want to get a 28x28x1 image out of the decoder, so we need to work our way back up from the narrow decoder input layer. A schematic of the network is shown below.

Here our final encoder layer has size 4x4x8 = 128. The original images have size 28x28 = 784, so the encoded vector is roughly 16% of the size of the original image. These are just suggested sizes for each of the layers. Feel free to change the depths and sizes, but remember our goal here is to find a small representation of the input data.
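With these suggested sizes, the layer shapes work out as follows:

Encoder: 28x28x1 input -> conv (28x28x16) -> max-pool (14x14x16) -> conv (14x14x8) -> max-pool (7x7x8) -> conv (7x7x8) -> max-pool (4x4x8, the encoded representation)

Decoder: upsample (7x7x8) -> conv (7x7x8) -> upsample (14x14x8) -> conv (14x14x8) -> upsample (28x28x8) -> conv (28x28x16) -> conv (28x28x1 logits)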

What's going on with the decoder

Okay, so the decoder has these "Upsample" layers that you might not have seen before. First off, I'll discuss a bit what these layers aren't. Usually, you'll see deconvolutional layers used to increase the width and height of the layers. They work almost exactly the same as convolutional layers, but in reverse. A stride in the input layer results in a larger stride in the deconvolutional layer. For example, if you have a 3x3 kernel, a 3x3 patch in the input layer will be reduced to one unit in a convolutional layer. Comparatively, one unit in the input layer will be expanded to a 3x3 patch in a deconvolutional layer. Deconvolution is often called "transpose convolution," which is what you'll find in the TensorFlow API as tf.nn.conv2d_transpose.
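For reference, a minimal sketch of a transpose convolution might look like the following (this isn't used in our solution; the names x and filter_t are just for illustration). Note that the filter shape is [height, width, output_channels, input_channels], the reverse of a normal convolution:

import tensorflow as tf

# Upsample 14x14x8 feature maps to 28x28x8 with a stride of 2
x = tf.placeholder(tf.float32, [None, 14, 14, 8])
filter_t = tf.Variable(tf.random_normal([3, 3, 8, 8], stddev=0.1))
deconv = tf.nn.conv2d_transpose(x, filter_t,
                                output_shape=tf.stack([tf.shape(x)[0], 28, 28, 8]),
                                strides=[1, 2, 2, 1], padding='SAME')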

However, deconvolutional layers can lead to artifacts in the final images, such as checkerboard patterns. This is due to overlap in the kernels, which can be avoided by setting the stride and kernel size equal. In this Distill article from Augustus Odena, et al., the authors show that these checkerboard artifacts can be avoided by resizing the layers using nearest-neighbor or bilinear interpolation (upsampling) followed by a convolutional layer. In TensorFlow, this is easily done with tf.image.resize_images, followed by a convolution. Be sure to read the Distill article to get a better understanding of deconvolutional layers and why we're using upsampling.
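For example, either of these lines upsamples a tensor x from 7x7 to 14x14 with nearest-neighbor interpolation (x stands in for any batch of feature maps):

upsampled = tf.image.resize_images(x, size=(14, 14), method=tf.image.ResizeMethod.NEAREST_NEIGHBOR)
upsampled = tf.image.resize_nearest_neighbor(x, size=(14, 14))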

Exercise: Build the network shown above. Remember that a convolutional layer with strides of 1 and 'same' padding won't reduce the height and width. That is, if the input is 28x28 and the convolutional layer has stride = 1 and 'same' padding, the output will also be 28x28. The max-pool layers are used to reduce the width and height. A stride of 2 will reduce the size by a factor of 2. Odena et al. claim that nearest-neighbor interpolation works best for the upsampling, so make sure to include that as a parameter in tf.image.resize_images, or use tf.image.resize_nearest_neighbor.


In [11]:
learning_rate = 0.001
image_size = 28
inputs_ = tf.placeholder(shape=[None, image_size, image_size, 1], dtype=tf.float32)
targets_ = tf.placeholder(shape=[None, image_size, image_size, 1], dtype=tf.float32)

### Encoder
filter1 = tf.Variable(tf.random_normal([7, 7, 1, 16], stddev=0.1))
conv1 = tf.nn.conv2d(inputs_, filter1, strides=[1, 1, 1, 1], padding='SAME')
# Now 28x28x16
maxpool1 = tf.nn.max_pool(conv1, ksize=[1, 2, 2, 1], strides=[1, 2, 2, 1], padding='SAME')
# Now 14x14x16
filter2 = tf.Variable(tf.random_normal([2, 2, 16, 8], stddev=0.1))
conv2 = tf.nn.conv2d(maxpool1, filter2, strides=[1, 1, 1, 1], padding='SAME')
# Now 14x14x8
maxpool2 = tf.nn.max_pool(conv2, ksize=[1, 2, 2, 1], strides=[1, 2, 2, 1], padding='SAME')
# Now 7x7x8
filter3 = tf.Variable(tf.random_normal([2, 2, 8, 8], stddev=0.1))
conv3 = tf.nn.conv2d(maxpool2, filter3, strides=[1, 1, 1, 1], padding='SAME')
# Now 7x7x8
encoded = tf.nn.max_pool(conv3, ksize=[1, 2, 2, 1], strides=[1, 2, 2, 1], padding='SAME')
# Now 4x4x8

### Decoder
upsample1 = tf.image.resize_nearest_neighbor(encoded, size=[7, 7])
# Now 7x7x8
filter4 = tf.Variable(tf.random_normal([2, 2, 8, 8], stddev=0.1))
conv4 = tf.nn.conv2d(upsample1, filter4, strides=[1, 1, 1, 1], padding='SAME')
# Now 7x7x8
upsample2 = tf.image.resize_nearest_neighbor(conv4, size=[14, 14])
# Now 14x14x8
# Each conv layer gets its own filter variable (reusing filter4 here would share weights)
filter5 = tf.Variable(tf.random_normal([2, 2, 8, 8], stddev=0.1))
conv5 = tf.nn.conv2d(upsample2, filter5, strides=[1, 1, 1, 1], padding='SAME')
# Now 14x14x8
upsample3 = tf.image.resize_nearest_neighbor(conv5, size=[28, 28])
# Now 28x28x8
filter6 = tf.Variable(tf.random_normal([2, 2, 8, 16], stddev=0.1))
conv6 = tf.nn.conv2d(upsample3, filter6, strides=[1, 1, 1, 1], padding='SAME')
# Now 28x28x16
filter_logits = tf.Variable(tf.random_normal([2, 2, 16, 1], stddev=0.1))
logits = tf.nn.conv2d(conv6, filter_logits, strides=[1, 1, 1, 1], padding='SAME')
# Now 28x28x1

# Pass logits through sigmoid to get reconstructed image
decoded = tf.nn.sigmoid(logits)

# Pass logits through sigmoid and calculate the cross-entropy loss
loss = tf.nn.sigmoid_cross_entropy_with_logits(labels=targets_, logits=logits)

# Get cost and define the optimizer
cost = tf.reduce_mean(loss)
opt = tf.train.AdamOptimizer(learning_rate).minimize(cost)
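
# Optional sanity check (a sketch, not part of the exercise): print the
# static shapes to confirm the sizes annotated in the comments above.
for name, tensor in [('conv1', conv1), ('encoded', encoded),
                     ('upsample1', upsample1), ('conv6', conv6), ('logits', logits)]:
    print(name, tensor.get_shape().as_list())
# encoded should be [None, 4, 4, 8]; logits should be [None, 28, 28, 1]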

Training

As before, we'll train the network. Instead of flattening the images, though, we can pass them in as 28x28x1 arrays.


In [12]:
sess = tf.Session()

In [14]:
epochs = 5
batch_size = 200
sess.run(tf.global_variables_initializer())
for e in range(epochs):
    for ii in range(mnist.train.num_examples//batch_size):
        batch = mnist.train.next_batch(batch_size)
        imgs = batch[0].reshape((-1, 28, 28, 1))
        batch_cost, _ = sess.run([cost, opt], feed_dict={inputs_: imgs,
                                                         targets_: imgs})

        print("Epoch: {}/{}...".format(e+1, epochs),
              "Training loss: {:.4f}".format(batch_cost))


Epoch: 1/5... Training loss: 0.6933
Epoch: 1/5... Training loss: 0.6914
Epoch: 1/5... Training loss: 0.6895
Epoch: 1/5... Training loss: 0.6876
Epoch: 1/5... Training loss: 0.6851
Epoch: 1/5... Training loss: 0.6824
Epoch: 1/5... Training loss: 0.6790
Epoch: 1/5... Training loss: 0.6751
Epoch: 1/5... Training loss: 0.6702
Epoch: 1/5... Training loss: 0.6643
Epoch: 1/5... Training loss: 0.6574
Epoch: 1/5... Training loss: 0.6496
Epoch: 1/5... Training loss: 0.6406
Epoch: 1/5... Training loss: 0.6299
Epoch: 1/5... Training loss: 0.6190
Epoch: 1/5... Training loss: 0.6064
Epoch: 1/5... Training loss: 0.5964
Epoch: 1/5... Training loss: 0.5838
Epoch: 1/5... Training loss: 0.5745
Epoch: 1/5... Training loss: 0.5664
Epoch: 1/5... Training loss: 0.5638
Epoch: 1/5... Training loss: 0.5649
Epoch: 1/5... Training loss: 0.5661
Epoch: 1/5... Training loss: 0.5935
Epoch: 1/5... Training loss: 0.5701
Epoch: 1/5... Training loss: 0.5789
Epoch: 1/5... Training loss: 0.5729
Epoch: 1/5... Training loss: 0.5733
Epoch: 1/5... Training loss: 0.5675
Epoch: 1/5... Training loss: 0.5605
Epoch: 1/5... Training loss: 0.5594
Epoch: 1/5... Training loss: 0.5521
Epoch: 1/5... Training loss: 0.5528
Epoch: 1/5... Training loss: 0.5501
Epoch: 1/5... Training loss: 0.5521
Epoch: 1/5... Training loss: 0.5528
Epoch: 1/5... Training loss: 0.5514
Epoch: 1/5... Training loss: 0.5484
Epoch: 1/5... Training loss: 0.5525
Epoch: 1/5... Training loss: 0.5533
Epoch: 1/5... Training loss: 0.5497
Epoch: 1/5... Training loss: 0.5479
Epoch: 1/5... Training loss: 0.5462
Epoch: 1/5... Training loss: 0.5406
Epoch: 1/5... Training loss: 0.5455
Epoch: 1/5... Training loss: 0.5455
Epoch: 1/5... Training loss: 0.5441
Epoch: 1/5... Training loss: 0.5337
Epoch: 1/5... Training loss: 0.5409
Epoch: 1/5... Training loss: 0.5400
Epoch: 1/5... Training loss: 0.5460
Epoch: 1/5... Training loss: 0.5360
Epoch: 1/5... Training loss: 0.5369
Epoch: 1/5... Training loss: 0.5381
Epoch: 1/5... Training loss: 0.5369
Epoch: 1/5... Training loss: 0.5397
Epoch: 1/5... Training loss: 0.5345
Epoch: 1/5... Training loss: 0.5359
Epoch: 1/5... Training loss: 0.5323
Epoch: 1/5... Training loss: 0.5391
Epoch: 1/5... Training loss: 0.5319
Epoch: 1/5... Training loss: 0.5343
Epoch: 1/5... Training loss: 0.5299
Epoch: 1/5... Training loss: 0.5330
Epoch: 1/5... Training loss: 0.5292
Epoch: 1/5... Training loss: 0.5303
Epoch: 1/5... Training loss: 0.5273
Epoch: 1/5... Training loss: 0.5306
Epoch: 1/5... Training loss: 0.5270
Epoch: 1/5... Training loss: 0.5305
Epoch: 1/5... Training loss: 0.5219
Epoch: 1/5... Training loss: 0.5165
Epoch: 1/5... Training loss: 0.5207
Epoch: 1/5... Training loss: 0.5184
Epoch: 1/5... Training loss: 0.5212
Epoch: 1/5... Training loss: 0.5199
Epoch: 1/5... Training loss: 0.5225
Epoch: 1/5... Training loss: 0.5210
Epoch: 1/5... Training loss: 0.5178
Epoch: 1/5... Training loss: 0.5185
Epoch: 1/5... Training loss: 0.5113
Epoch: 1/5... Training loss: 0.5102
Epoch: 1/5... Training loss: 0.5129
Epoch: 1/5... Training loss: 0.5125
Epoch: 1/5... Training loss: 0.5109
Epoch: 1/5... Training loss: 0.5100
Epoch: 1/5... Training loss: 0.5075
Epoch: 1/5... Training loss: 0.5041
Epoch: 1/5... Training loss: 0.5041
Epoch: 1/5... Training loss: 0.5024
Epoch: 1/5... Training loss: 0.4979
Epoch: 1/5... Training loss: 0.4969
Epoch: 1/5... Training loss: 0.4915
Epoch: 1/5... Training loss: 0.4899
Epoch: 1/5... Training loss: 0.4922
Epoch: 1/5... Training loss: 0.4938
Epoch: 1/5... Training loss: 0.4844
Epoch: 1/5... Training loss: 0.4829
Epoch: 1/5... Training loss: 0.4809
Epoch: 1/5... Training loss: 0.4783
Epoch: 1/5... Training loss: 0.4823
Epoch: 1/5... Training loss: 0.4753
Epoch: 1/5... Training loss: 0.4751
Epoch: 1/5... Training loss: 0.4717
Epoch: 1/5... Training loss: 0.4759
Epoch: 1/5... Training loss: 0.4697
Epoch: 1/5... Training loss: 0.4669
Epoch: 1/5... Training loss: 0.4661
Epoch: 1/5... Training loss: 0.4689
Epoch: 1/5... Training loss: 0.4603
Epoch: 1/5... Training loss: 0.4613
Epoch: 1/5... Training loss: 0.4593
Epoch: 1/5... Training loss: 0.4600
Epoch: 1/5... Training loss: 0.4641
Epoch: 1/5... Training loss: 0.4603
Epoch: 1/5... Training loss: 0.4607
Epoch: 1/5... Training loss: 0.4555
Epoch: 1/5... Training loss: 0.4560
Epoch: 1/5... Training loss: 0.4527
Epoch: 1/5... Training loss: 0.4543
Epoch: 1/5... Training loss: 0.4506
Epoch: 1/5... Training loss: 0.4460
Epoch: 1/5... Training loss: 0.4445
Epoch: 1/5... Training loss: 0.4432
Epoch: 1/5... Training loss: 0.4465
Epoch: 1/5... Training loss: 0.4418
Epoch: 1/5... Training loss: 0.4411
Epoch: 1/5... Training loss: 0.4312
Epoch: 1/5... Training loss: 0.4424
Epoch: 1/5... Training loss: 0.4384
Epoch: 1/5... Training loss: 0.4387
Epoch: 1/5... Training loss: 0.4360
Epoch: 1/5... Training loss: 0.4332
Epoch: 1/5... Training loss: 0.4293
Epoch: 1/5... Training loss: 0.4283
Epoch: 1/5... Training loss: 0.4274
Epoch: 1/5... Training loss: 0.4233
Epoch: 1/5... Training loss: 0.4252
Epoch: 1/5... Training loss: 0.4151
Epoch: 1/5... Training loss: 0.4259
Epoch: 1/5... Training loss: 0.4202
Epoch: 1/5... Training loss: 0.4125
Epoch: 1/5... Training loss: 0.4150
Epoch: 1/5... Training loss: 0.4164
Epoch: 1/5... Training loss: 0.4165
Epoch: 1/5... Training loss: 0.4106
Epoch: 1/5... Training loss: 0.4058
Epoch: 1/5... Training loss: 0.4089
Epoch: 1/5... Training loss: 0.4062
Epoch: 1/5... Training loss: 0.4049
Epoch: 1/5... Training loss: 0.4038
Epoch: 1/5... Training loss: 0.4015
Epoch: 1/5... Training loss: 0.4001
Epoch: 1/5... Training loss: 0.4001
Epoch: 1/5... Training loss: 0.3985
Epoch: 1/5... Training loss: 0.3929
Epoch: 1/5... Training loss: 0.3915
Epoch: 1/5... Training loss: 0.3855
Epoch: 1/5... Training loss: 0.3963
Epoch: 1/5... Training loss: 0.3835
Epoch: 1/5... Training loss: 0.3894
Epoch: 1/5... Training loss: 0.3845
Epoch: 1/5... Training loss: 0.3812
Epoch: 1/5... Training loss: 0.3752
Epoch: 1/5... Training loss: 0.3807
Epoch: 1/5... Training loss: 0.3748
Epoch: 1/5... Training loss: 0.3769
Epoch: 1/5... Training loss: 0.3740
Epoch: 1/5... Training loss: 0.3735
Epoch: 1/5... Training loss: 0.3713
Epoch: 1/5... Training loss: 0.3654
Epoch: 1/5... Training loss: 0.3676
Epoch: 1/5... Training loss: 0.3648
Epoch: 1/5... Training loss: 0.3614
Epoch: 1/5... Training loss: 0.3582
Epoch: 1/5... Training loss: 0.3624
Epoch: 1/5... Training loss: 0.3646
Epoch: 1/5... Training loss: 0.3568
Epoch: 1/5... Training loss: 0.3542
Epoch: 1/5... Training loss: 0.3549
Epoch: 1/5... Training loss: 0.3486
Epoch: 1/5... Training loss: 0.3487
Epoch: 1/5... Training loss: 0.3462
Epoch: 1/5... Training loss: 0.3443
Epoch: 1/5... Training loss: 0.3454
Epoch: 1/5... Training loss: 0.3434
Epoch: 1/5... Training loss: 0.3464
Epoch: 1/5... Training loss: 0.3391
Epoch: 1/5... Training loss: 0.3363
Epoch: 1/5... Training loss: 0.3364
Epoch: 1/5... Training loss: 0.3366
Epoch: 1/5... Training loss: 0.3341
Epoch: 1/5... Training loss: 0.3347
Epoch: 1/5... Training loss: 0.3303
Epoch: 1/5... Training loss: 0.3270
Epoch: 1/5... Training loss: 0.3263
Epoch: 1/5... Training loss: 0.3290
Epoch: 1/5... Training loss: 0.3272
Epoch: 1/5... Training loss: 0.3225
Epoch: 1/5... Training loss: 0.3204
Epoch: 1/5... Training loss: 0.3255
Epoch: 1/5... Training loss: 0.3227
Epoch: 1/5... Training loss: 0.3242
Epoch: 1/5... Training loss: 0.3208
Epoch: 1/5... Training loss: 0.3204
Epoch: 1/5... Training loss: 0.3115
Epoch: 1/5... Training loss: 0.3131
Epoch: 1/5... Training loss: 0.3131
Epoch: 1/5... Training loss: 0.3122
Epoch: 1/5... Training loss: 0.3131
Epoch: 1/5... Training loss: 0.3110
Epoch: 1/5... Training loss: 0.3090
Epoch: 1/5... Training loss: 0.3109
Epoch: 1/5... Training loss: 0.3066
Epoch: 1/5... Training loss: 0.3067
Epoch: 1/5... Training loss: 0.3063
Epoch: 1/5... Training loss: 0.3065
Epoch: 1/5... Training loss: 0.3025
Epoch: 1/5... Training loss: 0.3047
Epoch: 1/5... Training loss: 0.3036
Epoch: 1/5... Training loss: 0.2999
Epoch: 1/5... Training loss: 0.3033
Epoch: 1/5... Training loss: 0.3010
Epoch: 1/5... Training loss: 0.3024
Epoch: 1/5... Training loss: 0.3023
Epoch: 1/5... Training loss: 0.2955
Epoch: 1/5... Training loss: 0.3013
Epoch: 1/5... Training loss: 0.2990
Epoch: 1/5... Training loss: 0.2990
Epoch: 1/5... Training loss: 0.2997
Epoch: 1/5... Training loss: 0.2963
Epoch: 1/5... Training loss: 0.2926
Epoch: 1/5... Training loss: 0.2940
Epoch: 1/5... Training loss: 0.2957
Epoch: 1/5... Training loss: 0.2907
Epoch: 1/5... Training loss: 0.2959
Epoch: 1/5... Training loss: 0.2981
Epoch: 1/5... Training loss: 0.2918
Epoch: 1/5... Training loss: 0.2947
Epoch: 1/5... Training loss: 0.2887
Epoch: 1/5... Training loss: 0.2880
Epoch: 1/5... Training loss: 0.2943
Epoch: 1/5... Training loss: 0.2895
Epoch: 1/5... Training loss: 0.2912
Epoch: 1/5... Training loss: 0.2881
Epoch: 1/5... Training loss: 0.2936
Epoch: 1/5... Training loss: 0.2877
Epoch: 1/5... Training loss: 0.2850
Epoch: 1/5... Training loss: 0.2888
Epoch: 1/5... Training loss: 0.2868
Epoch: 1/5... Training loss: 0.2890
Epoch: 1/5... Training loss: 0.2839
Epoch: 1/5... Training loss: 0.2868
Epoch: 1/5... Training loss: 0.2859
Epoch: 1/5... Training loss: 0.2813
Epoch: 1/5... Training loss: 0.2809
Epoch: 1/5... Training loss: 0.2857
Epoch: 1/5... Training loss: 0.2813
Epoch: 1/5... Training loss: 0.2869
Epoch: 1/5... Training loss: 0.2831
Epoch: 1/5... Training loss: 0.2840
Epoch: 1/5... Training loss: 0.2835
Epoch: 1/5... Training loss: 0.2870
Epoch: 1/5... Training loss: 0.2885
Epoch: 1/5... Training loss: 0.2845
Epoch: 1/5... Training loss: 0.2828
Epoch: 1/5... Training loss: 0.2830
Epoch: 1/5... Training loss: 0.2821
Epoch: 1/5... Training loss: 0.2803
Epoch: 1/5... Training loss: 0.2838
Epoch: 1/5... Training loss: 0.2849
Epoch: 1/5... Training loss: 0.2854
Epoch: 1/5... Training loss: 0.2863
Epoch: 1/5... Training loss: 0.2811
Epoch: 1/5... Training loss: 0.2847
Epoch: 1/5... Training loss: 0.2714
Epoch: 1/5... Training loss: 0.2799
Epoch: 1/5... Training loss: 0.2771
Epoch: 1/5... Training loss: 0.2806
Epoch: 1/5... Training loss: 0.2778
Epoch: 1/5... Training loss: 0.2784
Epoch: 1/5... Training loss: 0.2762
Epoch: 1/5... Training loss: 0.2749
Epoch: 1/5... Training loss: 0.2754
Epoch: 1/5... Training loss: 0.2737
Epoch: 1/5... Training loss: 0.2770
Epoch: 1/5... Training loss: 0.2820
Epoch: 1/5... Training loss: 0.2773
Epoch: 1/5... Training loss: 0.2722
Epoch: 1/5... Training loss: 0.2804
Epoch: 1/5... Training loss: 0.2788
Epoch: 1/5... Training loss: 0.2765
Epoch: 1/5... Training loss: 0.2735
Epoch: 1/5... Training loss: 0.2765
Epoch: 1/5... Training loss: 0.2724
Epoch: 1/5... Training loss: 0.2789
Epoch: 1/5... Training loss: 0.2755
Epoch: 1/5... Training loss: 0.2767
Epoch: 1/5... Training loss: 0.2729
Epoch: 1/5... Training loss: 0.2753
Epoch: 2/5... Training loss: 0.2710
Epoch: 2/5... Training loss: 0.2729
Epoch: 2/5... Training loss: 0.2751
Epoch: 2/5... Training loss: 0.2726
Epoch: 2/5... Training loss: 0.2733
Epoch: 2/5... Training loss: 0.2718
Epoch: 2/5... Training loss: 0.2751
Epoch: 2/5... Training loss: 0.2718
Epoch: 2/5... Training loss: 0.2720
Epoch: 2/5... Training loss: 0.2681
Epoch: 2/5... Training loss: 0.2743
Epoch: 2/5... Training loss: 0.2726
Epoch: 2/5... Training loss: 0.2746
Epoch: 2/5... Training loss: 0.2738
Epoch: 2/5... Training loss: 0.2700
Epoch: 2/5... Training loss: 0.2724
Epoch: 2/5... Training loss: 0.2704
Epoch: 2/5... Training loss: 0.2733
Epoch: 2/5... Training loss: 0.2696
Epoch: 2/5... Training loss: 0.2782
Epoch: 2/5... Training loss: 0.2724
Epoch: 2/5... Training loss: 0.2738
Epoch: 2/5... Training loss: 0.2734
Epoch: 2/5... Training loss: 0.2709
Epoch: 2/5... Training loss: 0.2724
Epoch: 2/5... Training loss: 0.2734
Epoch: 2/5... Training loss: 0.2751
Epoch: 2/5... Training loss: 0.2711
Epoch: 2/5... Training loss: 0.2774
Epoch: 2/5... Training loss: 0.2718
Epoch: 2/5... Training loss: 0.2700
Epoch: 2/5... Training loss: 0.2736
Epoch: 2/5... Training loss: 0.2740
Epoch: 2/5... Training loss: 0.2745
Epoch: 2/5... Training loss: 0.2748
Epoch: 2/5... Training loss: 0.2735
Epoch: 2/5... Training loss: 0.2774
Epoch: 2/5... Training loss: 0.2702
Epoch: 2/5... Training loss: 0.2683
Epoch: 2/5... Training loss: 0.2663
Epoch: 2/5... Training loss: 0.2759
Epoch: 2/5... Training loss: 0.2727
Epoch: 2/5... Training loss: 0.2692
Epoch: 2/5... Training loss: 0.2670
Epoch: 2/5... Training loss: 0.2671
Epoch: 2/5... Training loss: 0.2712
Epoch: 2/5... Training loss: 0.2709
Epoch: 2/5... Training loss: 0.2699
Epoch: 2/5... Training loss: 0.2718
Epoch: 2/5... Training loss: 0.2658
Epoch: 2/5... Training loss: 0.2692
Epoch: 2/5... Training loss: 0.2706
Epoch: 2/5... Training loss: 0.2672
Epoch: 2/5... Training loss: 0.2662
Epoch: 2/5... Training loss: 0.2669
Epoch: 2/5... Training loss: 0.2696
Epoch: 2/5... Training loss: 0.2711
Epoch: 2/5... Training loss: 0.2760
Epoch: 2/5... Training loss: 0.2681
Epoch: 2/5... Training loss: 0.2707
Epoch: 2/5... Training loss: 0.2704
Epoch: 2/5... Training loss: 0.2678
Epoch: 2/5... Training loss: 0.2684
Epoch: 2/5... Training loss: 0.2741
Epoch: 2/5... Training loss: 0.2668
Epoch: 2/5... Training loss: 0.2739
Epoch: 2/5... Training loss: 0.2711
Epoch: 2/5... Training loss: 0.2710
Epoch: 2/5... Training loss: 0.2712
Epoch: 2/5... Training loss: 0.2665
Epoch: 2/5... Training loss: 0.2715
Epoch: 2/5... Training loss: 0.2700
Epoch: 2/5... Training loss: 0.2682
Epoch: 2/5... Training loss: 0.2637
Epoch: 2/5... Training loss: 0.2677
Epoch: 2/5... Training loss: 0.2684
Epoch: 2/5... Training loss: 0.2699
Epoch: 2/5... Training loss: 0.2669
Epoch: 2/5... Training loss: 0.2707
Epoch: 2/5... Training loss: 0.2683
Epoch: 2/5... Training loss: 0.2660
Epoch: 2/5... Training loss: 0.2695
Epoch: 2/5... Training loss: 0.2721
Epoch: 2/5... Training loss: 0.2615
Epoch: 2/5... Training loss: 0.2694
Epoch: 2/5... Training loss: 0.2650
Epoch: 2/5... Training loss: 0.2639
Epoch: 2/5... Training loss: 0.2677
Epoch: 2/5... Training loss: 0.2705
Epoch: 2/5... Training loss: 0.2647
Epoch: 2/5... Training loss: 0.2703
Epoch: 2/5... Training loss: 0.2646
Epoch: 2/5... Training loss: 0.2637
Epoch: 2/5... Training loss: 0.2643
Epoch: 2/5... Training loss: 0.2679
Epoch: 2/5... Training loss: 0.2683
Epoch: 2/5... Training loss: 0.2617
Epoch: 2/5... Training loss: 0.2633
Epoch: 2/5... Training loss: 0.2632
Epoch: 2/5... Training loss: 0.2686
Epoch: 2/5... Training loss: 0.2696
Epoch: 2/5... Training loss: 0.2653
Epoch: 2/5... Training loss: 0.2629
Epoch: 2/5... Training loss: 0.2644
Epoch: 2/5... Training loss: 0.2661
Epoch: 2/5... Training loss: 0.2706
Epoch: 2/5... Training loss: 0.2669
Epoch: 2/5... Training loss: 0.2663
Epoch: 2/5... Training loss: 0.2649
Epoch: 2/5... Training loss: 0.2652
Epoch: 2/5... Training loss: 0.2628
Epoch: 2/5... Training loss: 0.2700
Epoch: 2/5... Training loss: 0.2645
Epoch: 2/5... Training loss: 0.2664
Epoch: 2/5... Training loss: 0.2557
Epoch: 2/5... Training loss: 0.2577
Epoch: 2/5... Training loss: 0.2649
Epoch: 2/5... Training loss: 0.2677
Epoch: 2/5... Training loss: 0.2648
Epoch: 2/5... Training loss: 0.2606
Epoch: 2/5... Training loss: 0.2672
Epoch: 2/5... Training loss: 0.2633
Epoch: 2/5... Training loss: 0.2607
Epoch: 2/5... Training loss: 0.2598
Epoch: 2/5... Training loss: 0.2603
Epoch: 2/5... Training loss: 0.2638
Epoch: 2/5... Training loss: 0.2646
Epoch: 2/5... Training loss: 0.2603
Epoch: 2/5... Training loss: 0.2698
Epoch: 2/5... Training loss: 0.2596
Epoch: 2/5... Training loss: 0.2625
Epoch: 2/5... Training loss: 0.2678
Epoch: 2/5... Training loss: 0.2598
Epoch: 2/5... Training loss: 0.2650
Epoch: 2/5... Training loss: 0.2633
Epoch: 2/5... Training loss: 0.2635
Epoch: 2/5... Training loss: 0.2676
Epoch: 2/5... Training loss: 0.2627
Epoch: 2/5... Training loss: 0.2630
Epoch: 2/5... Training loss: 0.2635
Epoch: 2/5... Training loss: 0.2639
Epoch: 2/5... Training loss: 0.2638
Epoch: 2/5... Training loss: 0.2636
Epoch: 2/5... Training loss: 0.2671
Epoch: 2/5... Training loss: 0.2595
Epoch: 2/5... Training loss: 0.2566
Epoch: 2/5... Training loss: 0.2629
Epoch: 2/5... Training loss: 0.2623
Epoch: 2/5... Training loss: 0.2671
Epoch: 2/5... Training loss: 0.2587
Epoch: 2/5... Training loss: 0.2644
Epoch: 2/5... Training loss: 0.2604
Epoch: 2/5... Training loss: 0.2640
Epoch: 2/5... Training loss: 0.2618
Epoch: 2/5... Training loss: 0.2604
Epoch: 2/5... Training loss: 0.2647
Epoch: 2/5... Training loss: 0.2659
Epoch: 2/5... Training loss: 0.2680
Epoch: 2/5... Training loss: 0.2645
Epoch: 2/5... Training loss: 0.2637
Epoch: 2/5... Training loss: 0.2553
Epoch: 2/5... Training loss: 0.2620
Epoch: 2/5... Training loss: 0.2650
Epoch: 2/5... Training loss: 0.2604
Epoch: 2/5... Training loss: 0.2652
Epoch: 2/5... Training loss: 0.2619
Epoch: 2/5... Training loss: 0.2581
Epoch: 2/5... Training loss: 0.2614
Epoch: 2/5... Training loss: 0.2621
Epoch: 2/5... Training loss: 0.2647
Epoch: 2/5... Training loss: 0.2594
Epoch: 2/5... Training loss: 0.2604
Epoch: 2/5... Training loss: 0.2608
Epoch: 2/5... Training loss: 0.2636
Epoch: 2/5... Training loss: 0.2591
Epoch: 2/5... Training loss: 0.2590
Epoch: 2/5... Training loss: 0.2647
Epoch: 2/5... Training loss: 0.2635
Epoch: 2/5... Training loss: 0.2660
Epoch: 2/5... Training loss: 0.2637
Epoch: 2/5... Training loss: 0.2619
Epoch: 2/5... Training loss: 0.2599
Epoch: 2/5... Training loss: 0.2631
Epoch: 2/5... Training loss: 0.2647
Epoch: 2/5... Training loss: 0.2587
Epoch: 2/5... Training loss: 0.2641
Epoch: 2/5... Training loss: 0.2600
Epoch: 2/5... Training loss: 0.2610
Epoch: 2/5... Training loss: 0.2596
Epoch: 2/5... Training loss: 0.2572
Epoch: 2/5... Training loss: 0.2654
Epoch: 2/5... Training loss: 0.2567
Epoch: 2/5... Training loss: 0.2608
Epoch: 2/5... Training loss: 0.2664
Epoch: 2/5... Training loss: 0.2587
Epoch: 2/5... Training loss: 0.2561
Epoch: 2/5... Training loss: 0.2612
Epoch: 2/5... Training loss: 0.2605
Epoch: 2/5... Training loss: 0.2582
Epoch: 2/5... Training loss: 0.2596
Epoch: 2/5... Training loss: 0.2612
Epoch: 2/5... Training loss: 0.2574
Epoch: 2/5... Training loss: 0.2576
Epoch: 2/5... Training loss: 0.2595
Epoch: 2/5... Training loss: 0.2574
Epoch: 2/5... Training loss: 0.2581
Epoch: 2/5... Training loss: 0.2602
Epoch: 2/5... Training loss: 0.2603
Epoch: 2/5... Training loss: 0.2598
Epoch: 2/5... Training loss: 0.2605
Epoch: 2/5... Training loss: 0.2614
Epoch: 2/5... Training loss: 0.2574
Epoch: 2/5... Training loss: 0.2557
Epoch: 2/5... Training loss: 0.2584
Epoch: 2/5... Training loss: 0.2591
Epoch: 2/5... Training loss: 0.2574
Epoch: 2/5... Training loss: 0.2561
Epoch: 2/5... Training loss: 0.2563
Epoch: 2/5... Training loss: 0.2579
Epoch: 2/5... Training loss: 0.2560
Epoch: 2/5... Training loss: 0.2610
Epoch: 2/5... Training loss: 0.2624
Epoch: 2/5... Training loss: 0.2588
Epoch: 2/5... Training loss: 0.2597
Epoch: 2/5... Training loss: 0.2610
Epoch: 2/5... Training loss: 0.2630
Epoch: 2/5... Training loss: 0.2565
Epoch: 2/5... Training loss: 0.2639
Epoch: 2/5... Training loss: 0.2591
Epoch: 2/5... Training loss: 0.2598
Epoch: 2/5... Training loss: 0.2611
Epoch: 2/5... Training loss: 0.2614
Epoch: 2/5... Training loss: 0.2576
Epoch: 2/5... Training loss: 0.2576
Epoch: 2/5... Training loss: 0.2592
Epoch: 2/5... Training loss: 0.2593
Epoch: 2/5... Training loss: 0.2562
Epoch: 2/5... Training loss: 0.2522
Epoch: 2/5... Training loss: 0.2568
Epoch: 2/5... Training loss: 0.2574
Epoch: 2/5... Training loss: 0.2558
Epoch: 2/5... Training loss: 0.2615
Epoch: 2/5... Training loss: 0.2634
Epoch: 2/5... Training loss: 0.2578
Epoch: 2/5... Training loss: 0.2611
Epoch: 2/5... Training loss: 0.2552
Epoch: 2/5... Training loss: 0.2591
Epoch: 2/5... Training loss: 0.2634
Epoch: 2/5... Training loss: 0.2544
Epoch: 2/5... Training loss: 0.2641
Epoch: 2/5... Training loss: 0.2588
Epoch: 2/5... Training loss: 0.2556
Epoch: 2/5... Training loss: 0.2596
Epoch: 2/5... Training loss: 0.2572
Epoch: 2/5... Training loss: 0.2589
Epoch: 2/5... Training loss: 0.2563
Epoch: 2/5... Training loss: 0.2591
Epoch: 2/5... Training loss: 0.2567
Epoch: 2/5... Training loss: 0.2576
Epoch: 2/5... Training loss: 0.2563
Epoch: 2/5... Training loss: 0.2570
Epoch: 2/5... Training loss: 0.2532
Epoch: 2/5... Training loss: 0.2576
Epoch: 2/5... Training loss: 0.2600
Epoch: 2/5... Training loss: 0.2588
Epoch: 2/5... Training loss: 0.2603
Epoch: 2/5... Training loss: 0.2568
Epoch: 2/5... Training loss: 0.2533
Epoch: 2/5... Training loss: 0.2547
Epoch: 2/5... Training loss: 0.2612
Epoch: 2/5... Training loss: 0.2581
Epoch: 2/5... Training loss: 0.2554
Epoch: 2/5... Training loss: 0.2569
Epoch: 2/5... Training loss: 0.2562
Epoch: 2/5... Training loss: 0.2549
Epoch: 2/5... Training loss: 0.2576
Epoch: 2/5... Training loss: 0.2557
Epoch: 2/5... Training loss: 0.2533
Epoch: 2/5... Training loss: 0.2535
Epoch: 2/5... Training loss: 0.2537
Epoch: 2/5... Training loss: 0.2556
Epoch: 2/5... Training loss: 0.2573
Epoch: 2/5... Training loss: 0.2506
Epoch: 2/5... Training loss: 0.2560
Epoch: 2/5... Training loss: 0.2602
Epoch: 2/5... Training loss: 0.2613
Epoch: 2/5... Training loss: 0.2574
Epoch: 2/5... Training loss: 0.2531
Epoch: 2/5... Training loss: 0.2595
Epoch: 2/5... Training loss: 0.2598
Epoch: 2/5... Training loss: 0.2534
Epoch: 2/5... Training loss: 0.2565
Epoch: 2/5... Training loss: 0.2528
Epoch: 2/5... Training loss: 0.2585
Epoch: 2/5... Training loss: 0.2573
Epoch: 2/5... Training loss: 0.2564
Epoch: 2/5... Training loss: 0.2551
Epoch: 2/5... Training loss: 0.2531
Epoch: 2/5... Training loss: 0.2512
Epoch: 2/5... Training loss: 0.2530
Epoch: 3/5... Training loss: 0.2542
Epoch: 3/5... Training loss: 0.2566
Epoch: 3/5... Training loss: 0.2556
Epoch: 3/5... Training loss: 0.2524
Epoch: 3/5... Training loss: 0.2593
Epoch: 3/5... Training loss: 0.2551
Epoch: 3/5... Training loss: 0.2534
Epoch: 3/5... Training loss: 0.2559
Epoch: 3/5... Training loss: 0.2527
Epoch: 3/5... Training loss: 0.2486
Epoch: 3/5... Training loss: 0.2542
Epoch: 3/5... Training loss: 0.2538
Epoch: 3/5... Training loss: 0.2549
Epoch: 3/5... Training loss: 0.2523
Epoch: 3/5... Training loss: 0.2552
Epoch: 3/5... Training loss: 0.2553
Epoch: 3/5... Training loss: 0.2535
Epoch: 3/5... Training loss: 0.2566
Epoch: 3/5... Training loss: 0.2544
Epoch: 3/5... Training loss: 0.2526
Epoch: 3/5... Training loss: 0.2502
Epoch: 3/5... Training loss: 0.2565
Epoch: 3/5... Training loss: 0.2578
Epoch: 3/5... Training loss: 0.2502
Epoch: 3/5... Training loss: 0.2531
Epoch: 3/5... Training loss: 0.2594
Epoch: 3/5... Training loss: 0.2577
Epoch: 3/5... Training loss: 0.2583
Epoch: 3/5... Training loss: 0.2542
Epoch: 3/5... Training loss: 0.2518
Epoch: 3/5... Training loss: 0.2573
Epoch: 3/5... Training loss: 0.2539
Epoch: 3/5... Training loss: 0.2545
Epoch: 3/5... Training loss: 0.2551
Epoch: 3/5... Training loss: 0.2575
Epoch: 3/5... Training loss: 0.2536
Epoch: 3/5... Training loss: 0.2504
Epoch: 3/5... Training loss: 0.2563
Epoch: 3/5... Training loss: 0.2559
Epoch: 3/5... Training loss: 0.2578
Epoch: 3/5... Training loss: 0.2561
Epoch: 3/5... Training loss: 0.2511
Epoch: 3/5... Training loss: 0.2554
Epoch: 3/5... Training loss: 0.2525
Epoch: 3/5... Training loss: 0.2530
Epoch: 3/5... Training loss: 0.2483
Epoch: 3/5... Training loss: 0.2545
Epoch: 3/5... Training loss: 0.2569
Epoch: 3/5... Training loss: 0.2528
Epoch: 3/5... Training loss: 0.2530
Epoch: 3/5... Training loss: 0.2537
Epoch: 3/5... Training loss: 0.2558
Epoch: 3/5... Training loss: 0.2527
Epoch: 3/5... Training loss: 0.2601
Epoch: 3/5... Training loss: 0.2516
Epoch: 3/5... Training loss: 0.2523
Epoch: 3/5... Training loss: 0.2552
Epoch: 3/5... Training loss: 0.2511
Epoch: 3/5... Training loss: 0.2558
Epoch: 3/5... Training loss: 0.2559
Epoch: 3/5... Training loss: 0.2508
Epoch: 3/5... Training loss: 0.2511
Epoch: 3/5... Training loss: 0.2499
Epoch: 3/5... Training loss: 0.2534
Epoch: 3/5... Training loss: 0.2514
Epoch: 3/5... Training loss: 0.2576
Epoch: 3/5... Training loss: 0.2532
Epoch: 3/5... Training loss: 0.2502
Epoch: 3/5... Training loss: 0.2521
Epoch: 3/5... Training loss: 0.2539
Epoch: 3/5... Training loss: 0.2518
Epoch: 3/5... Training loss: 0.2485
Epoch: 3/5... Training loss: 0.2476
Epoch: 3/5... Training loss: 0.2538
Epoch: 3/5... Training loss: 0.2525
Epoch: 3/5... Training loss: 0.2581
Epoch: 3/5... Training loss: 0.2536
Epoch: 3/5... Training loss: 0.2545
Epoch: 3/5... Training loss: 0.2462
Epoch: 3/5... Training loss: 0.2541
Epoch: 3/5... Training loss: 0.2508
Epoch: 3/5... Training loss: 0.2532
Epoch: 3/5... Training loss: 0.2537
Epoch: 3/5... Training loss: 0.2553
Epoch: 3/5... Training loss: 0.2500
Epoch: 3/5... Training loss: 0.2469
Epoch: 3/5... Training loss: 0.2535
Epoch: 3/5... Training loss: 0.2504
Epoch: 3/5... Training loss: 0.2500
Epoch: 3/5... Training loss: 0.2536
Epoch: 3/5... Training loss: 0.2538
Epoch: 3/5... Training loss: 0.2520
Epoch: 3/5... Training loss: 0.2575
Epoch: 3/5... Training loss: 0.2503
Epoch: 3/5... Training loss: 0.2549
Epoch: 3/5... Training loss: 0.2562
Epoch: 3/5... Training loss: 0.2504
Epoch: 3/5... Training loss: 0.2501
Epoch: 3/5... Training loss: 0.2528
Epoch: 3/5... Training loss: 0.2524
Epoch: 3/5... Training loss: 0.2534
Epoch: 3/5... Training loss: 0.2562
Epoch: 3/5... Training loss: 0.2526
Epoch: 3/5... Training loss: 0.2535
Epoch: 3/5... Training loss: 0.2537
Epoch: 3/5... Training loss: 0.2551
Epoch: 3/5... Training loss: 0.2537
Epoch: 3/5... Training loss: 0.2520
Epoch: 3/5... Training loss: 0.2554
Epoch: 3/5... Training loss: 0.2504
Epoch: 3/5... Training loss: 0.2526
Epoch: 3/5... Training loss: 0.2520
Epoch: 3/5... Training loss: 0.2521
Epoch: 3/5... Training loss: 0.2506
Epoch: 3/5... Training loss: 0.2545
Epoch: 3/5... Training loss: 0.2469
Epoch: 3/5... Training loss: 0.2536
Epoch: 3/5... Training loss: 0.2543
Epoch: 3/5... Training loss: 0.2559
Epoch: 3/5... Training loss: 0.2605
Epoch: 3/5... Training loss: 0.2514
Epoch: 3/5... Training loss: 0.2549
Epoch: 3/5... Training loss: 0.2544
Epoch: 3/5... Training loss: 0.2491
Epoch: 3/5... Training loss: 0.2500
Epoch: 3/5... Training loss: 0.2532
Epoch: 3/5... Training loss: 0.2510
Epoch: 3/5... Training loss: 0.2566
Epoch: 3/5... Training loss: 0.2533
Epoch: 3/5... Training loss: 0.2504
Epoch: 3/5... Training loss: 0.2500
Epoch: 3/5... Training loss: 0.2531
Epoch: 3/5... Training loss: 0.2479
Epoch: 3/5... Training loss: 0.2492
Epoch: 3/5... Training loss: 0.2508
Epoch: 3/5... Training loss: 0.2584
Epoch: 3/5... Training loss: 0.2533
Epoch: 3/5... Training loss: 0.2473
Epoch: 3/5... Training loss: 0.2468
Epoch: 3/5... Training loss: 0.2594
Epoch: 3/5... Training loss: 0.2501
Epoch: 3/5... Training loss: 0.2527
Epoch: 3/5... Training loss: 0.2494
Epoch: 3/5... Training loss: 0.2522
Epoch: 3/5... Training loss: 0.2535
Epoch: 3/5... Training loss: 0.2495
Epoch: 3/5... Training loss: 0.2529
Epoch: 3/5... Training loss: 0.2457
Epoch: 3/5... Training loss: 0.2481
Epoch: 3/5... Training loss: 0.2521
Epoch: 3/5... Training loss: 0.2463
Epoch: 3/5... Training loss: 0.2501
Epoch: 3/5... Training loss: 0.2511
Epoch: 3/5... Training loss: 0.2459
Epoch: 3/5... Training loss: 0.2515
Epoch: 3/5... Training loss: 0.2465
Epoch: 3/5... Training loss: 0.2562
Epoch: 3/5... Training loss: 0.2526
Epoch: 3/5... Training loss: 0.2485
Epoch: 3/5... Training loss: 0.2504
Epoch: 3/5... Training loss: 0.2468
Epoch: 3/5... Training loss: 0.2484
Epoch: 3/5... Training loss: 0.2503
Epoch: 3/5... Training loss: 0.2541
Epoch: 3/5... Training loss: 0.2509
Epoch: 3/5... Training loss: 0.2552
Epoch: 3/5... Training loss: 0.2508
Epoch: 3/5... Training loss: 0.2543
Epoch: 3/5... Training loss: 0.2527
Epoch: 3/5... Training loss: 0.2488
Epoch: 3/5... Training loss: 0.2471
Epoch: 3/5... Training loss: 0.2503
Epoch: 3/5... Training loss: 0.2493
Epoch: 3/5... Training loss: 0.2502
Epoch: 3/5... Training loss: 0.2484
Epoch: 3/5... Training loss: 0.2442
Epoch: 3/5... Training loss: 0.2482
Epoch: 3/5... Training loss: 0.2458
Epoch: 3/5... Training loss: 0.2509
Epoch: 3/5... Training loss: 0.2512
Epoch: 3/5... Training loss: 0.2474
Epoch: 3/5... Training loss: 0.2533
Epoch: 3/5... Training loss: 0.2497
Epoch: 3/5... Training loss: 0.2469
Epoch: 3/5... Training loss: 0.2497
Epoch: 3/5... Training loss: 0.2507
Epoch: 3/5... Training loss: 0.2482
Epoch: 3/5... Training loss: 0.2454
Epoch: 3/5... Training loss: 0.2499
Epoch: 3/5... Training loss: 0.2513
Epoch: 3/5... Training loss: 0.2475
Epoch: 3/5... Training loss: 0.2474
Epoch: 3/5... Training loss: 0.2498
Epoch: 3/5... Training loss: 0.2535
Epoch: 3/5... Training loss: 0.2508
Epoch: 3/5... Training loss: 0.2470
Epoch: 3/5... Training loss: 0.2578
Epoch: 3/5... Training loss: 0.2519
Epoch: 3/5... Training loss: 0.2558
Epoch: 3/5... Training loss: 0.2471
Epoch: 3/5... Training loss: 0.2515
Epoch: 3/5... Training loss: 0.2518
Epoch: 3/5... Training loss: 0.2461
Epoch: 3/5... Training loss: 0.2548
Epoch: 3/5... Training loss: 0.2505
Epoch: 3/5... Training loss: 0.2483
Epoch: 3/5... Training loss: 0.2513
Epoch: 3/5... Training loss: 0.2456
Epoch: 3/5... Training loss: 0.2471
Epoch: 3/5... Training loss: 0.2501
Epoch: 3/5... Training loss: 0.2514
Epoch: 3/5... Training loss: 0.2507
Epoch: 3/5... Training loss: 0.2506
Epoch: 3/5... Training loss: 0.2458
Epoch: 3/5... Training loss: 0.2440
Epoch: 3/5... Training loss: 0.2520
Epoch: 3/5... Training loss: 0.2523
Epoch: 3/5... Training loss: 0.2523
Epoch: 3/5... Training loss: 0.2507
Epoch: 3/5... Training loss: 0.2513
Epoch: 3/5... Training loss: 0.2529
Epoch: 3/5... Training loss: 0.2475
Epoch: 3/5... Training loss: 0.2463
Epoch: 3/5... Training loss: 0.2462
Epoch: 3/5... Training loss: 0.2473
Epoch: 3/5... Training loss: 0.2488
Epoch: 3/5... Training loss: 0.2447
Epoch: 3/5... Training loss: 0.2476
Epoch: 3/5... Training loss: 0.2466
Epoch: 3/5... Training loss: 0.2436
Epoch: 3/5... Training loss: 0.2438
Epoch: 3/5... Training loss: 0.2462
Epoch: 3/5... Training loss: 0.2580
Epoch: 3/5... Training loss: 0.2506
Epoch: 3/5... Training loss: 0.2474
Epoch: 3/5... Training loss: 0.2470
Epoch: 3/5... Training loss: 0.2461
Epoch: 3/5... Training loss: 0.2514
Epoch: 3/5... Training loss: 0.2468
Epoch: 3/5... Training loss: 0.2494
Epoch: 3/5... Training loss: 0.2498
Epoch: 3/5... Training loss: 0.2496
Epoch: 3/5... Training loss: 0.2429
Epoch: 3/5... Training loss: 0.2497
Epoch: 3/5... Training loss: 0.2511
Epoch: 3/5... Training loss: 0.2472
Epoch: 3/5... Training loss: 0.2485
Epoch: 3/5... Training loss: 0.2479
Epoch: 3/5... Training loss: 0.2508
Epoch: 3/5... Training loss: 0.2475
Epoch: 3/5... Training loss: 0.2461
Epoch: 3/5... Training loss: 0.2474
Epoch: 3/5... Training loss: 0.2496
Epoch: 3/5... Training loss: 0.2472
Epoch: 3/5... Training loss: 0.2491
Epoch: 3/5... Training loss: 0.2489
Epoch: 3/5... Training loss: 0.2511
Epoch: 3/5... Training loss: 0.2503
Epoch: 3/5... Training loss: 0.2472
Epoch: 3/5... Training loss: 0.2492
Epoch: 3/5... Training loss: 0.2482
Epoch: 3/5... Training loss: 0.2454
Epoch: 3/5... Training loss: 0.2476
Epoch: 3/5... Training loss: 0.2493
Epoch: 3/5... Training loss: 0.2461
Epoch: 3/5... Training loss: 0.2462
Epoch: 3/5... Training loss: 0.2517
Epoch: 3/5... Training loss: 0.2484
Epoch: 3/5... Training loss: 0.2483
Epoch: 3/5... Training loss: 0.2503
Epoch: 3/5... Training loss: 0.2482
Epoch: 3/5... Training loss: 0.2501
Epoch: 3/5... Training loss: 0.2461
Epoch: 3/5... Training loss: 0.2517
Epoch: 3/5... Training loss: 0.2461
Epoch: 3/5... Training loss: 0.2533
Epoch: 3/5... Training loss: 0.2448
Epoch: 3/5... Training loss: 0.2475
Epoch: 3/5... Training loss: 0.2501
Epoch: 3/5... Training loss: 0.2488
Epoch: 3/5... Training loss: 0.2471
Epoch: 3/5... Training loss: 0.2491
Epoch: 3/5... Training loss: 0.2434
Epoch: 3/5... Training loss: 0.2465
Epoch: 3/5... Training loss: 0.2461
Epoch: 3/5... Training loss: 0.2500
Epoch: 3/5... Training loss: 0.2463
Epoch: 3/5... Training loss: 0.2448
Epoch: 3/5... Training loss: 0.2457
Epoch: 3/5... Training loss: 0.2454
Epoch: 3/5... Training loss: 0.2501
Epoch: 3/5... Training loss: 0.2453
Epoch: 3/5... Training loss: 0.2460
Epoch: 3/5... Training loss: 0.2435
Epoch: 3/5... Training loss: 0.2444
Epoch: 3/5... Training loss: 0.2492
Epoch: 3/5... Training loss: 0.2459
Epoch: 3/5... Training loss: 0.2514
Epoch: 3/5... Training loss: 0.2431
Epoch: 3/5... Training loss: 0.2499
Epoch: 4/5... Training loss: 0.2460
Epoch: 4/5... Training loss: 0.2513
Epoch: 4/5... Training loss: 0.2465
Epoch: 4/5... Training loss: 0.2507
Epoch: 4/5... Training loss: 0.2437
Epoch: 4/5... Training loss: 0.2500
Epoch: 4/5... Training loss: 0.2478
Epoch: 4/5... Training loss: 0.2444
Epoch: 4/5... Training loss: 0.2501
Epoch: 4/5... Training loss: 0.2476
Epoch: 4/5... Training loss: 0.2488
Epoch: 4/5... Training loss: 0.2491
Epoch: 4/5... Training loss: 0.2425
Epoch: 4/5... Training loss: 0.2447
Epoch: 4/5... Training loss: 0.2413
Epoch: 4/5... Training loss: 0.2449
Epoch: 4/5... Training loss: 0.2515
Epoch: 4/5... Training loss: 0.2514
Epoch: 4/5... Training loss: 0.2445
Epoch: 4/5... Training loss: 0.2455
Epoch: 4/5... Training loss: 0.2469
Epoch: 4/5... Training loss: 0.2497
Epoch: 4/5... Training loss: 0.2513
Epoch: 4/5... Training loss: 0.2463
Epoch: 4/5... Training loss: 0.2503
Epoch: 4/5... Training loss: 0.2476
Epoch: 4/5... Training loss: 0.2391
Epoch: 4/5... Training loss: 0.2486
Epoch: 4/5... Training loss: 0.2459
Epoch: 4/5... Training loss: 0.2523
Epoch: 4/5... Training loss: 0.2463
Epoch: 4/5... Training loss: 0.2444
Epoch: 4/5... Training loss: 0.2444
Epoch: 4/5... Training loss: 0.2510
Epoch: 4/5... Training loss: 0.2479
Epoch: 4/5... Training loss: 0.2522
Epoch: 4/5... Training loss: 0.2506
Epoch: 4/5... Training loss: 0.2474
Epoch: 4/5... Training loss: 0.2486
Epoch: 4/5... Training loss: 0.2462
Epoch: 4/5... Training loss: 0.2470
Epoch: 4/5... Training loss: 0.2491
Epoch: 4/5... Training loss: 0.2476
Epoch: 4/5... Training loss: 0.2473
Epoch: 4/5... Training loss: 0.2454
Epoch: 4/5... Training loss: 0.2475
Epoch: 4/5... Training loss: 0.2511
Epoch: 4/5... Training loss: 0.2498
Epoch: 4/5... Training loss: 0.2433
Epoch: 4/5... Training loss: 0.2462
Epoch: 4/5... Training loss: 0.2491
Epoch: 4/5... Training loss: 0.2462
Epoch: 4/5... Training loss: 0.2478
Epoch: 4/5... Training loss: 0.2438
Epoch: 4/5... Training loss: 0.2515
Epoch: 4/5... Training loss: 0.2506
Epoch: 4/5... Training loss: 0.2494
Epoch: 4/5... Training loss: 0.2452
Epoch: 4/5... Training loss: 0.2475
Epoch: 4/5... Training loss: 0.2468
Epoch: 4/5... Training loss: 0.2481
Epoch: 4/5... Training loss: 0.2535
Epoch: 4/5... Training loss: 0.2478
Epoch: 4/5... Training loss: 0.2489
Epoch: 4/5... Training loss: 0.2471
Epoch: 4/5... Training loss: 0.2448
Epoch: 4/5... Training loss: 0.2431
Epoch: 4/5... Training loss: 0.2476
Epoch: 4/5... Training loss: 0.2481
Epoch: 4/5... Training loss: 0.2472
Epoch: 4/5... Training loss: 0.2514
Epoch: 4/5... Training loss: 0.2421
Epoch: 4/5... Training loss: 0.2482
Epoch: 4/5... Training loss: 0.2476
Epoch: 4/5... Training loss: 0.2484
Epoch: 4/5... Training loss: 0.2477
Epoch: 4/5... Training loss: 0.2452
Epoch: 4/5... Training loss: 0.2449
Epoch: 4/5... Training loss: 0.2489
Epoch: 4/5... Training loss: 0.2489
Epoch: 4/5... Training loss: 0.2440
Epoch: 4/5... Training loss: 0.2474
Epoch: 4/5... Training loss: 0.2440
Epoch: 4/5... Training loss: 0.2475
Epoch: 4/5... Training loss: 0.2463
Epoch: 4/5... Training loss: 0.2446
Epoch: 4/5... Training loss: 0.2423
Epoch: 4/5... Training loss: 0.2409
Epoch: 4/5... Training loss: 0.2498
Epoch: 4/5... Training loss: 0.2431
Epoch: 4/5... Training loss: 0.2447
Epoch: 4/5... Training loss: 0.2397
Epoch: 4/5... Training loss: 0.2419
Epoch: 4/5... Training loss: 0.2449
Epoch: 4/5... Training loss: 0.2458
Epoch: 4/5... Training loss: 0.2448
Epoch: 4/5... Training loss: 0.2442
Epoch: 4/5... Training loss: 0.2427
Epoch: 4/5... Training loss: 0.2411
Epoch: 4/5... Training loss: 0.2387
Epoch: 4/5... Training loss: 0.2412
Epoch: 4/5... Training loss: 0.2475
Epoch: 4/5... Training loss: 0.2479
Epoch: 4/5... Training loss: 0.2436
Epoch: 4/5... Training loss: 0.2480
Epoch: 4/5... Training loss: 0.2458
Epoch: 4/5... Training loss: 0.2462
Epoch: 4/5... Training loss: 0.2443
Epoch: 4/5... Training loss: 0.2468
Epoch: 4/5... Training loss: 0.2439
Epoch: 4/5... Training loss: 0.2429
Epoch: 4/5... Training loss: 0.2423
Epoch: 4/5... Training loss: 0.2421
Epoch: 4/5... Training loss: 0.2433
Epoch: 4/5... Training loss: 0.2439
Epoch: 4/5... Training loss: 0.2410
Epoch: 4/5... Training loss: 0.2429
Epoch: 4/5... Training loss: 0.2449
Epoch: 4/5... Training loss: 0.2487
Epoch: 4/5... Training loss: 0.2469
Epoch: 4/5... Training loss: 0.2460
Epoch: 4/5... Training loss: 0.2459
Epoch: 4/5... Training loss: 0.2426
Epoch: 4/5... Training loss: 0.2422
Epoch: 4/5... Training loss: 0.2430
Epoch: 4/5... Training loss: 0.2409
Epoch: 4/5... Training loss: 0.2462
Epoch: 4/5... Training loss: 0.2433
Epoch: 4/5... Training loss: 0.2431
Epoch: 4/5... Training loss: 0.2480
Epoch: 4/5... Training loss: 0.2451
Epoch: 4/5... Training loss: 0.2453
Epoch: 4/5... Training loss: 0.2476
Epoch: 4/5... Training loss: 0.2391
Epoch: 4/5... Training loss: 0.2417
Epoch: 4/5... Training loss: 0.2386
Epoch: 4/5... Training loss: 0.2490
Epoch: 4/5... Training loss: 0.2433
Epoch: 4/5... Training loss: 0.2384
Epoch: 4/5... Training loss: 0.2485
Epoch: 4/5... Training loss: 0.2437
Epoch: 4/5... Training loss: 0.2443
Epoch: 4/5... Training loss: 0.2450
Epoch: 4/5... Training loss: 0.2398
Epoch: 4/5... Training loss: 0.2422
Epoch: 4/5... Training loss: 0.2421
Epoch: 4/5... Training loss: 0.2420
Epoch: 4/5... Training loss: 0.2480
Epoch: 4/5... Training loss: 0.2410
Epoch: 4/5... Training loss: 0.2456
Epoch: 4/5... Training loss: 0.2444
Epoch: 4/5... Training loss: 0.2444
Epoch: 4/5... Training loss: 0.2417
Epoch: 4/5... Training loss: 0.2440
Epoch: 4/5... Training loss: 0.2423
Epoch: 4/5... Training loss: 0.2433
Epoch: 4/5... Training loss: 0.2424
Epoch: 4/5... Training loss: 0.2413
Epoch: 4/5... Training loss: 0.2439
Epoch: 4/5... Training loss: 0.2440
Epoch: 4/5... Training loss: 0.2475
Epoch: 4/5... Training loss: 0.2444
Epoch: 4/5... Training loss: 0.2469
Epoch: 4/5... Training loss: 0.2439
Epoch: 4/5... Training loss: 0.2455
Epoch: 4/5... Training loss: 0.2416
Epoch: 4/5... Training loss: 0.2420
Epoch: 4/5... Training loss: 0.2433
Epoch: 4/5... Training loss: 0.2432
Epoch: 4/5... Training loss: 0.2434
Epoch: 4/5... Training loss: 0.2418
Epoch: 4/5... Training loss: 0.2445
Epoch: 4/5... Training loss: 0.2458
Epoch: 4/5... Training loss: 0.2484
Epoch: 4/5... Training loss: 0.2447
Epoch: 4/5... Training loss: 0.2408
Epoch: 4/5... Training loss: 0.2465
Epoch: 4/5... Training loss: 0.2441
Epoch: 4/5... Training loss: 0.2483
Epoch: 4/5... Training loss: 0.2434
Epoch: 4/5... Training loss: 0.2455
Epoch: 4/5... Training loss: 0.2417
Epoch: 4/5... Training loss: 0.2471
Epoch: 4/5... Training loss: 0.2418
Epoch: 4/5... Training loss: 0.2386
Epoch: 4/5... Training loss: 0.2428
Epoch: 4/5... Training loss: 0.2459
Epoch: 4/5... Training loss: 0.2420
Epoch: 4/5... Training loss: 0.2436
Epoch: 4/5... Training loss: 0.2456
Epoch: 4/5... Training loss: 0.2442
Epoch: 4/5... Training loss: 0.2443
Epoch: 4/5... Training loss: 0.2428
Epoch: 4/5... Training loss: 0.2439
Epoch: 4/5... Training loss: 0.2452
Epoch: 4/5... Training loss: 0.2421
Epoch: 4/5... Training loss: 0.2433
Epoch: 4/5... Training loss: 0.2396
Epoch: 4/5... Training loss: 0.2399
Epoch: 4/5... Training loss: 0.2440
Epoch: 4/5... Training loss: 0.2450
Epoch: 4/5... Training loss: 0.2422
Epoch: 4/5... Training loss: 0.2408
Epoch: 4/5... Training loss: 0.2440
Epoch: 4/5... Training loss: 0.2466
Epoch: 4/5... Training loss: 0.2437
Epoch: 4/5... Training loss: 0.2441
Epoch: 4/5... Training loss: 0.2402
Epoch: 4/5... Training loss: 0.2475
Epoch: 4/5... Training loss: 0.2434
Epoch: 4/5... Training loss: 0.2432
Epoch: 4/5... Training loss: 0.2370
Epoch: 4/5... Training loss: 0.2459
Epoch: 4/5... Training loss: 0.2388
Epoch: 4/5... Training loss: 0.2421
Epoch: 4/5... Training loss: 0.2388
Epoch: 4/5... Training loss: 0.2461
Epoch: 4/5... Training loss: 0.2420
Epoch: 4/5... Training loss: 0.2440
Epoch: 4/5... Training loss: 0.2434
Epoch: 4/5... Training loss: 0.2408
Epoch: 4/5... Training loss: 0.2460
Epoch: 4/5... Training loss: 0.2428
Epoch: 4/5... Training loss: 0.2473
Epoch: 4/5... Training loss: 0.2463
Epoch: 4/5... Training loss: 0.2421
Epoch: 4/5... Training loss: 0.2475
Epoch: 4/5... Training loss: 0.2461
Epoch: 4/5... Training loss: 0.2432
Epoch: 4/5... Training loss: 0.2421
Epoch: 4/5... Training loss: 0.2448
Epoch: 4/5... Training loss: 0.2439
Epoch: 4/5... Training loss: 0.2445
Epoch: 4/5... Training loss: 0.2394
Epoch: 4/5... Training loss: 0.2424
Epoch: 4/5... Training loss: 0.2370
Epoch: 4/5... Training loss: 0.2436
Epoch: 4/5... Training loss: 0.2427
Epoch: 4/5... Training loss: 0.2406
Epoch: 4/5... Training loss: 0.2416
Epoch: 4/5... Training loss: 0.2436
Epoch: 4/5... Training loss: 0.2394
Epoch: 4/5... Training loss: 0.2429
Epoch: 4/5... Training loss: 0.2435
Epoch: 4/5... Training loss: 0.2454
Epoch: 4/5... Training loss: 0.2402
Epoch: 4/5... Training loss: 0.2425
Epoch: 4/5... Training loss: 0.2381
Epoch: 4/5... Training loss: 0.2428
Epoch: 4/5... Training loss: 0.2422
Epoch: 4/5... Training loss: 0.2338
Epoch: 4/5... Training loss: 0.2397
Epoch: 4/5... Training loss: 0.2417
Epoch: 4/5... Training loss: 0.2397
Epoch: 4/5... Training loss: 0.2438
Epoch: 4/5... Training loss: 0.2444
Epoch: 4/5... Training loss: 0.2481
Epoch: 4/5... Training loss: 0.2407
Epoch: 4/5... Training loss: 0.2437
Epoch: 4/5... Training loss: 0.2436
Epoch: 4/5... Training loss: 0.2394
Epoch: 4/5... Training loss: 0.2459
Epoch: 4/5... Training loss: 0.2398
Epoch: 4/5... Training loss: 0.2396
Epoch: 4/5... Training loss: 0.2439
Epoch: 4/5... Training loss: 0.2409
Epoch: 4/5... Training loss: 0.2462
Epoch: 4/5... Training loss: 0.2441
Epoch: 4/5... Training loss: 0.2458
Epoch: 4/5... Training loss: 0.2402
Epoch: 4/5... Training loss: 0.2430
Epoch: 4/5... Training loss: 0.2398
Epoch: 4/5... Training loss: 0.2449
Epoch: 4/5... Training loss: 0.2422
Epoch: 4/5... Training loss: 0.2445
Epoch: 4/5... Training loss: 0.2451
Epoch: 4/5... Training loss: 0.2414
Epoch: 4/5... Training loss: 0.2412
Epoch: 4/5... Training loss: 0.2418
Epoch: 4/5... Training loss: 0.2400
Epoch: 4/5... Training loss: 0.2436
Epoch: 4/5... Training loss: 0.2432
Epoch: 4/5... Training loss: 0.2358
Epoch: 4/5... Training loss: 0.2446
Epoch: 4/5... Training loss: 0.2439
Epoch: 4/5... Training loss: 0.2434
Epoch: 4/5... Training loss: 0.2387
Epoch: 4/5... Training loss: 0.2378
Epoch: 4/5... Training loss: 0.2435
Epoch: 4/5... Training loss: 0.2439
Epoch: 4/5... Training loss: 0.2379
Epoch: 4/5... Training loss: 0.2454
Epoch: 4/5... Training loss: 0.2412
Epoch: 4/5... Training loss: 0.2412
Epoch: 4/5... Training loss: 0.2437
Epoch: 4/5... Training loss: 0.2422
Epoch: 4/5... Training loss: 0.2461
Epoch: 4/5... Training loss: 0.2403
Epoch: 4/5... Training loss: 0.2374
Epoch: 4/5... Training loss: 0.2381
Epoch: 5/5... Training loss: 0.2455
Epoch: 5/5... Training loss: 0.2409
Epoch: 5/5... Training loss: 0.2423
Epoch: 5/5... Training loss: 0.2378
Epoch: 5/5... Training loss: 0.2435
Epoch: 5/5... Training loss: 0.2454
Epoch: 5/5... Training loss: 0.2399
Epoch: 5/5... Training loss: 0.2440
Epoch: 5/5... Training loss: 0.2398
Epoch: 5/5... Training loss: 0.2435
Epoch: 5/5... Training loss: 0.2401
Epoch: 5/5... Training loss: 0.2418
Epoch: 5/5... Training loss: 0.2405
Epoch: 5/5... Training loss: 0.2412
Epoch: 5/5... Training loss: 0.2430
Epoch: 5/5... Training loss: 0.2417
Epoch: 5/5... Training loss: 0.2452
Epoch: 5/5... Training loss: 0.2433
Epoch: 5/5... Training loss: 0.2411
Epoch: 5/5... Training loss: 0.2388
Epoch: 5/5... Training loss: 0.2418
Epoch: 5/5... Training loss: 0.2436
Epoch: 5/5... Training loss: 0.2418
Epoch: 5/5... Training loss: 0.2342
Epoch: 5/5... Training loss: 0.2424
Epoch: 5/5... Training loss: 0.2411
Epoch: 5/5... Training loss: 0.2421
Epoch: 5/5... Training loss: 0.2412
Epoch: 5/5... Training loss: 0.2445
Epoch: 5/5... Training loss: 0.2397
Epoch: 5/5... Training loss: 0.2397
Epoch: 5/5... Training loss: 0.2430
Epoch: 5/5... Training loss: 0.2438
Epoch: 5/5... Training loss: 0.2431
Epoch: 5/5... Training loss: 0.2389
Epoch: 5/5... Training loss: 0.2418
Epoch: 5/5... Training loss: 0.2462
Epoch: 5/5... Training loss: 0.2407
Epoch: 5/5... Training loss: 0.2424
Epoch: 5/5... Training loss: 0.2383
Epoch: 5/5... Training loss: 0.2400
Epoch: 5/5... Training loss: 0.2425
Epoch: 5/5... Training loss: 0.2426
Epoch: 5/5... Training loss: 0.2393
Epoch: 5/5... Training loss: 0.2415
Epoch: 5/5... Training loss: 0.2413
Epoch: 5/5... Training loss: 0.2432
Epoch: 5/5... Training loss: 0.2358
Epoch: 5/5... Training loss: 0.2405
Epoch: 5/5... Training loss: 0.2428
Epoch: 5/5... Training loss: 0.2408
Epoch: 5/5... Training loss: 0.2457
Epoch: 5/5... Training loss: 0.2412
Epoch: 5/5... Training loss: 0.2384
Epoch: 5/5... Training loss: 0.2365
Epoch: 5/5... Training loss: 0.2373
Epoch: 5/5... Training loss: 0.2380
Epoch: 5/5... Training loss: 0.2406
Epoch: 5/5... Training loss: 0.2408
Epoch: 5/5... Training loss: 0.2357
Epoch: 5/5... Training loss: 0.2429
Epoch: 5/5... Training loss: 0.2379
Epoch: 5/5... Training loss: 0.2406
Epoch: 5/5... Training loss: 0.2398
Epoch: 5/5... Training loss: 0.2351
Epoch: 5/5... Training loss: 0.2382
Epoch: 5/5... Training loss: 0.2392
Epoch: 5/5... Training loss: 0.2378
Epoch: 5/5... Training loss: 0.2348
Epoch: 5/5... Training loss: 0.2405
Epoch: 5/5... Training loss: 0.2418
Epoch: 5/5... Training loss: 0.2408
Epoch: 5/5... Training loss: 0.2399
Epoch: 5/5... Training loss: 0.2421
Epoch: 5/5... Training loss: 0.2430
Epoch: 5/5... Training loss: 0.2439
Epoch: 5/5... Training loss: 0.2380
Epoch: 5/5... Training loss: 0.2343
Epoch: 5/5... Training loss: 0.2403
Epoch: 5/5... Training loss: 0.2412
Epoch: 5/5... Training loss: 0.2413
Epoch: 5/5... Training loss: 0.2386
Epoch: 5/5... Training loss: 0.2388
Epoch: 5/5... Training loss: 0.2385
Epoch: 5/5... Training loss: 0.2404
Epoch: 5/5... Training loss: 0.2405
Epoch: 5/5... Training loss: 0.2418
Epoch: 5/5... Training loss: 0.2407
Epoch: 5/5... Training loss: 0.2404
Epoch: 5/5... Training loss: 0.2420
Epoch: 5/5... Training loss: 0.2387
Epoch: 5/5... Training loss: 0.2378
... (several hundred similar per-batch lines omitted; fifth-epoch losses stay around 0.23-0.25) ...
Epoch: 5/5... Training loss: 0.2393
Epoch: 5/5... Training loss: 0.2384

In [15]:
# Compare the first ten test images (top row) with their reconstructions (bottom row)
fig, axes = plt.subplots(nrows=2, ncols=10, sharex=True, sharey=True, figsize=(20,4))
in_imgs = mnist.test.images[:10]
reconstructed = sess.run(decoded, feed_dict={inputs_: in_imgs.reshape((10, 28, 28, 1))})

for images, row in zip([in_imgs, reconstructed], axes):
    for img, ax in zip(images, row):
        ax.imshow(img.reshape((28, 28)), cmap='Greys_r')
        ax.get_xaxis().set_visible(False)
        ax.get_yaxis().set_visible(False)


fig.tight_layout(pad=0.1)



In [16]:
sess.close()

Denoising

As I've mentioned before, autoencoders like the ones you've built so far aren't too useful in practice. However, they can be used to denoise images quite successfully just by training the network on noisy images. We can create the noisy images ourselves by adding Gaussian noise to the training images, then clipping the values to be between 0 and 1. We'll use the noisy images as inputs and the original, clean images as targets. Here's an example of the noisy images I generated and the denoised images.
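
To make that concrete, here's a minimal sketch of the noising step on a single training image (my own illustration, not a cell from the original notebook; the image index is arbitrary, and the 0.5 scale matches the noise_factor we'll use in the training loop below):

In [ ]:
# Pick an arbitrary training image and build a noisy/clean pair
img = mnist.train.images[42].reshape((28, 28))
noisy = np.clip(img + 0.5 * np.random.randn(*img.shape), 0., 1.)

# Noisy version on the left, clean target on the right
fig, (ax1, ax2) = plt.subplots(ncols=2, figsize=(8, 4))
ax1.imshow(noisy, cmap='Greys_r')
ax1.set_title('noisy input')
ax2.imshow(img, cmap='Greys_r')
ax2.set_title('clean target')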

Since this is a harder problem for the network, we'll want to use deeper convolutional layers here, meaning more feature maps per layer. I suggest something like 32-32-16 for the depths of the convolutional layers in the encoder, and the same depths going backward through the decoder. Otherwise the architecture is the same as before.

Exercise: Build the network for the denoising autoencoder. It's the same as before, but with deeper layers. I suggest 32-32-16 for the depths, but you can play with these numbers, or add more layers.


In [21]:
learning_rate = 0.001
inputs_ = tf.placeholder(tf.float32, (None, 28, 28, 1), name='inputs')
targets_ = tf.placeholder(tf.float32, (None, 28, 28, 1), name='targets')

### Encoder
# One possible fill-in for the exercise, using the suggested 32-32-16 depths;
# feel free to substitute your own layer choices.
conv1 = tf.layers.conv2d(inputs_, 32, (3,3), padding='same', activation=tf.nn.relu)
# Now 28x28x32
maxpool1 = tf.layers.max_pooling2d(conv1, (2,2), (2,2), padding='same')
# Now 14x14x32
conv2 = tf.layers.conv2d(maxpool1, 32, (3,3), padding='same', activation=tf.nn.relu)
# Now 14x14x32
maxpool2 = tf.layers.max_pooling2d(conv2, (2,2), (2,2), padding='same')
# Now 7x7x32
conv3 = tf.layers.conv2d(maxpool2, 16, (3,3), padding='same', activation=tf.nn.relu)
# Now 7x7x16
encoded = tf.layers.max_pooling2d(conv3, (2,2), (2,2), padding='same')
# Now 4x4x16

### Decoder
upsample1 = tf.image.resize_nearest_neighbor(encoded, (7,7))
# Now 7x7x16
conv4 = tf.layers.conv2d(upsample1, 16, (3,3), padding='same', activation=tf.nn.relu)
# Now 7x7x16
upsample2 = tf.image.resize_nearest_neighbor(conv4, (14,14))
# Now 14x14x16
conv5 = tf.layers.conv2d(upsample2, 32, (3,3), padding='same', activation=tf.nn.relu)
# Now 14x14x32
upsample3 = tf.image.resize_nearest_neighbor(conv5, (28,28))
# Now 28x28x32
conv6 = tf.layers.conv2d(upsample3, 32, (3,3), padding='same', activation=tf.nn.relu)
# Now 28x28x32

logits = tf.layers.conv2d(conv6, 1, (3,3), padding='same', activation=None)
# Now 28x28x1

# Pass logits through sigmoid to get reconstructed image
decoded = tf.nn.sigmoid(logits, name='decoded')

# Pass logits through sigmoid and calculate the cross-entropy loss
loss = tf.nn.sigmoid_cross_entropy_with_logits(labels=targets_, logits=logits)

# Get cost and define the optimizer
cost = tf.reduce_mean(loss)
opt = tf.train.AdamOptimizer(learning_rate).minimize(cost)
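
Before training, it's worth sanity-checking that each layer really has the shape the comments above claim. This little check is my own addition: TensorFlow knows the static shapes at graph-construction time, so no session is needed.

In [ ]:
# The first dimension (batch size) is unknown, shown as '?'
for name, tensor in [('encoded', encoded), ('upsample1', upsample1), ('decoded', decoded)]:
    print(name, tensor.get_shape())
# Should print something like:
# encoded (?, 4, 4, 16)
# upsample1 (?, 7, 7, 16)
# decoded (?, 28, 28, 1)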

In [22]:
sess = tf.Session()

In [ ]:
epochs = 100
batch_size = 200
# Sets how much noise we're adding to the MNIST images
noise_factor = 0.5
sess.run(tf.global_variables_initializer())
for e in range(epochs):
    for ii in range(mnist.train.num_examples//batch_size):
        batch = mnist.train.next_batch(batch_size)
        # Get images from the batch
        imgs = batch[0].reshape((-1, 28, 28, 1))
        
        # Add random noise to the input images
        noisy_imgs = imgs + noise_factor * np.random.randn(*imgs.shape)
        # Clip the images to be between 0 and 1
        noisy_imgs = np.clip(noisy_imgs, 0., 1.)
        
        # Noisy images as inputs, original images as targets
        batch_cost, _ = sess.run([cost, opt], feed_dict={inputs_: noisy_imgs,
                                                         targets_: imgs})

        print("Epoch: {}/{}...".format(e+1, epochs),
              "Training loss: {:.4f}".format(batch_cost))

Checking out the performance

Here I'm adding noise to the test images and passing them through the autoencoder. It does a surprisingly good job of removing the noise, even in cases where it's difficult to tell what the original number is.


In [29]:
fig, axes = plt.subplots(nrows=2, ncols=10, sharex=True, sharey=True, figsize=(20,4))
in_imgs = mnist.test.images[:10]
noisy_imgs = in_imgs + noise_factor * np.random.randn(*in_imgs.shape)
noisy_imgs = np.clip(noisy_imgs, 0., 1.)

reconstructed = sess.run(decoded, feed_dict={inputs_: noisy_imgs.reshape((10, 28, 28, 1))})

# Top row: noisy inputs, bottom row: denoised reconstructions
for images, row in zip([noisy_imgs, reconstructed], axes):
    for img, ax in zip(images, row):
        ax.imshow(img.reshape((28, 28)), cmap='Greys_r')
        ax.get_xaxis().set_visible(False)
        ax.get_yaxis().set_visible(False)

fig.tight_layout(pad=0.1)
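
Since the point of the autoencoder is the compressed representation, you can also pull the codes out directly. This final check is my own addition: each 784-pixel test image is reduced to a 4x4x16 code by the encoder half of the network.

In [ ]:
codes = sess.run(encoded, feed_dict={inputs_: mnist.test.images[:10].reshape((-1, 28, 28, 1))})
print(codes.shape)
# (10, 4, 4, 16)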