CIFAR-10

Improved CNN

Activate virtual environment


In [1]:
%%bash
# Note: %%bash runs in a subshell, so this activation does not carry
# over to the notebook kernel. Start Jupyter from within the activated
# environment to make sure the kernel uses it.
source ~/kerai/bin/activate

Imports


In [1]:
%matplotlib inline
import numpy as np
import matplotlib
from matplotlib import pyplot as plt
from keras.models import Sequential
from keras.optimizers import Adam, SGD
from keras.callbacks import ModelCheckpoint
from keras.constraints import maxnorm
from keras.models import load_model
from keras.layers import GlobalAveragePooling2D, Lambda, Conv2D, MaxPooling2D, Dropout, Dense, Flatten, Activation
from keras.preprocessing.image import ImageDataGenerator


Using TensorFlow backend.

Import helper functions


In [41]:
from helper import get_class_names, get_train_data, get_test_data, plot_images
from helper import plot_model, predict_classes, visualize_errors

Change matplotlib graph style


In [3]:
matplotlib.style.use('ggplot')

Constants

Import class names


In [4]:
class_names = get_class_names()
print(class_names)


Decoding file: data/batches.meta
['airplane', 'automobile', 'bird', 'cat', 'deer', 'dog', 'frog', 'horse', 'ship', 'truck']

Get number of classes


In [5]:
num_classes = len(class_names)
print(num_classes)


10

In [1]:
# Height and width of the images
IMAGE_SIZE = 32
# 3 colour channels: red, green and blue
CHANNELS = 3
# Number of epochs
NUM_EPOCH = 350
# Learning rate
LEARN_RATE = 1.0e-4

Fetch and decode data

Load the training dataset. Labels are integers, whereas classes are one-hot encoded vectors.
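To make that distinction concrete, here is a toy sketch of the two representations (the actual encoding happens inside the helper functions, not in this notebook):

```python
NUM_CLASSES = 10

def to_one_hot(label, num_classes=NUM_CLASSES):
    """Convert an integer label into a one-hot encoded vector."""
    vec = [0.0] * num_classes
    vec[label] = 1.0
    return vec

labels = [6, 9, 4]  # integer labels (e.g. frog, truck, deer)
classes = [to_one_hot(l) for l in labels]
print(classes[0])  # [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0]
```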


In [6]:
images_train, labels_train, class_train = get_train_data()


Decoding file: data/data_batch_1
Decoding file: data/data_batch_2
Decoding file: data/data_batch_3
Decoding file: data/data_batch_4
Decoding file: data/data_batch_5

Load the testing dataset.


In [7]:
images_test, labels_test, class_test = get_test_data()


Decoding file: data/test_batch

In [8]:
print("Training set size:\t",len(images_train))
print("Testing set size:\t",len(images_test))


Training set size:	 50000
Testing set size:	 10000

The CIFAR-10 dataset has been loaded: 60,000 images in total (50,000 for training, 10,000 for testing) with their corresponding labels.

Improving accuracy

1. Define a better CNN model

An all-convolutional ("pure CNN") model, following Springenberg et al., *Striving for Simplicity: The All Convolutional Net* (https://arxiv.org/pdf/1412.6806.pdf)


In [15]:
def pure_cnn_model():

    model = Sequential()

    model.add(Conv2D(96, (3, 3), activation='relu', padding='same',
                     input_shape=(IMAGE_SIZE, IMAGE_SIZE, CHANNELS)))
    model.add(Dropout(0.2))

    model.add(Conv2D(96, (3, 3), activation='relu', padding='same'))
    # Strided convolution replaces max-pooling for downsampling
    model.add(Conv2D(96, (3, 3), activation='relu', padding='same', strides=2))
    model.add(Dropout(0.5))

    model.add(Conv2D(192, (3, 3), activation='relu', padding='same'))
    model.add(Conv2D(192, (3, 3), activation='relu', padding='same'))
    model.add(Conv2D(192, (3, 3), activation='relu', padding='same', strides=2))
    model.add(Dropout(0.5))

    model.add(Conv2D(192, (3, 3), padding='same'))
    model.add(Activation('relu'))
    # 1x1 convolutions act as per-pixel fully connected layers
    model.add(Conv2D(192, (1, 1), padding='valid'))
    model.add(Activation('relu'))
    model.add(Conv2D(10, (1, 1), padding='valid'))

    # Global average pooling replaces the usual Dense classifier head
    model.add(GlobalAveragePooling2D())

    model.add(Activation('softmax'))

    model.summary()

    return model
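Instead of flattening into Dense layers, the model ends with global average pooling: each of the 10 feature maps in the final 8×8×10 activation is averaged down to a single class score. A minimal NumPy sketch of that operation:

```python
import numpy as np

# Toy stand-in for the (8, 8, 10) activation produced by the last Conv2D layer
feature_maps = np.random.rand(8, 8, 10)

# Global average pooling: mean over the two spatial axes, one value per channel
pooled = feature_maps.mean(axis=(0, 1))

print(pooled.shape)  # (10,)
```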

Build the model


In [16]:
model = pure_cnn_model()


_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_10 (Conv2D)           (None, 32, 32, 96)        2688      
_________________________________________________________________
dropout_4 (Dropout)          (None, 32, 32, 96)        0         
_________________________________________________________________
conv2d_11 (Conv2D)           (None, 32, 32, 96)        83040     
_________________________________________________________________
conv2d_12 (Conv2D)           (None, 16, 16, 96)        83040     
_________________________________________________________________
dropout_5 (Dropout)          (None, 16, 16, 96)        0         
_________________________________________________________________
conv2d_13 (Conv2D)           (None, 16, 16, 192)       166080    
_________________________________________________________________
conv2d_14 (Conv2D)           (None, 16, 16, 192)       331968    
_________________________________________________________________
conv2d_15 (Conv2D)           (None, 8, 8, 192)         331968    
_________________________________________________________________
dropout_6 (Dropout)          (None, 8, 8, 192)         0         
_________________________________________________________________
conv2d_16 (Conv2D)           (None, 8, 8, 192)         331968    
_________________________________________________________________
activation_4 (Activation)    (None, 8, 8, 192)         0         
_________________________________________________________________
conv2d_17 (Conv2D)           (None, 8, 8, 192)         37056     
_________________________________________________________________
activation_5 (Activation)    (None, 8, 8, 192)         0         
_________________________________________________________________
conv2d_18 (Conv2D)           (None, 8, 8, 10)          1930      
_________________________________________________________________
global_average_pooling2d_2 ( (None, 10)                0         
_________________________________________________________________
activation_6 (Activation)    (None, 10)                0         
=================================================================
Total params: 1,369,738
Trainable params: 1,369,738
Non-trainable params: 0
_________________________________________________________________
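The parameter counts in the summary can be verified by hand: a Conv2D layer with a k×k kernel, c_in input channels, and c_out filters has k·k·c_in·c_out weights plus c_out biases. A quick sanity check against the values shown above:

```python
def conv_params(k, c_in, c_out):
    """Weights plus biases of a Conv2D layer with a k x k kernel."""
    return k * k * c_in * c_out + c_out

print(conv_params(3, 3, 96))     # 2688  -> first conv layer (RGB input)
print(conv_params(3, 96, 96))    # 83040
print(conv_params(1, 192, 192))  # 37056 -> 1x1 conv
print(conv_params(1, 192, 10))   # 1930  -> final 1x1 conv
```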

Train model on the training data

Checkpoint the best model seen so far after every epoch


In [17]:
checkpoint = ModelCheckpoint('best_model_improved.h5',  # model filename
                             monitor='val_loss',   # quantity to monitor
                             verbose=0,            # verbosity - 0 or 1
                             save_best_only=True,  # only save when the monitored
                                                   # quantity improves
                             mode='auto')          # direction of improvement is
                                                   # inferred from the monitored quantity
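With `save_best_only=True`, the callback keeps a running best of the monitored quantity and only overwrites the file when it improves. Roughly (a simplified sketch with made-up loss values, not Keras's actual implementation):

```python
# Simplified sketch of the save_best_only logic (monitoring val_loss, mode='min')
best = float('inf')
saved_epochs = []

val_losses = [1.81, 1.65, 1.70, 1.52]  # hypothetical per-epoch values
for epoch, val_loss in enumerate(val_losses, start=1):
    if val_loss < best:  # improvement -> overwrite the checkpoint file
        best = val_loss
        saved_epochs.append(epoch)

print(saved_epochs)  # [1, 2, 4] -- epoch 3 regressed, so it was not saved
```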

Configure the model for training


In [18]:
model.compile(loss='categorical_crossentropy', # Cross-entropy loss for multi-class classification
              optimizer=Adam(lr=LEARN_RATE),   # Adam optimizer with 1.0e-4 learning rate
              metrics=['accuracy'])            # Metrics to be evaluated during training
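Categorical cross-entropy compares the predicted softmax distribution with the one-hot target: loss = -Σᵢ yᵢ·log(pᵢ), which for a one-hot target reduces to -log(probability assigned to the true class). A small worked example with made-up predictions:

```python
import math

def categorical_crossentropy(y_true, y_pred):
    """Cross-entropy between a one-hot target and a predicted distribution."""
    return -sum(t * math.log(p) for t, p in zip(y_true, y_pred) if t > 0)

y_true = [0.0, 1.0, 0.0]        # one-hot target: class 1
confident = [0.05, 0.90, 0.05]  # good prediction -> small loss
uncertain = [0.30, 0.40, 0.30]  # poor prediction -> larger loss

print(round(categorical_crossentropy(y_true, confident), 4))  # 0.1054
print(round(categorical_crossentropy(y_true, uncertain), 4))  # 0.9163
```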

Fit the model on the data provided


In [13]:
model_details = model.fit(images_train, class_train,
                          batch_size=128,
                          epochs=NUM_EPOCH, # number of full passes over the training data
                          validation_data=(images_test, class_test),
                          callbacks=[checkpoint],
                          verbose=1)


Train on 50000 samples, validate on 10000 samples
Epoch 1/350
50000/50000 [==============================] - 95s - loss: 2.0042 - acc: 0.2385 - val_loss: 1.8122 - val_acc: 0.3176
Epoch 2/350
50000/50000 [==============================] - 92s - loss: 1.6972 - acc: 0.3663 - val_loss: 1.6534 - val_acc: 0.3903
Epoch 3/350
50000/50000 [==============================] - 93s - loss: 1.5731 - acc: 0.4184 - val_loss: 1.5575 - val_acc: 0.4300
Epoch 4/350
50000/50000 [==============================] - 93s - loss: 1.5032 - acc: 0.4480 - val_loss: 1.5217 - val_acc: 0.4372
Epoch 5/350
50000/50000 [==============================] - 93s - loss: 1.4448 - acc: 0.4723 - val_loss: 1.3933 - val_acc: 0.4878
Epoch 6/350
50000/50000 [==============================] - 93s - loss: 1.3925 - acc: 0.4948 - val_loss: 1.3786 - val_acc: 0.5099
Epoch 7/350
50000/50000 [==============================] - 93s - loss: 1.3506 - acc: 0.5115 - val_loss: 1.3002 - val_acc: 0.5368
Epoch 8/350
50000/50000 [==============================] - 93s - loss: 1.2986 - acc: 0.5307 - val_loss: 1.2381 - val_acc: 0.5628
Epoch 9/350
50000/50000 [==============================] - 93s - loss: 1.2505 - acc: 0.5499 - val_loss: 1.1893 - val_acc: 0.5809
Epoch 10/350
50000/50000 [==============================] - 93s - loss: 1.2173 - acc: 0.5657 - val_loss: 1.1476 - val_acc: 0.5895
Epoch 11/350
50000/50000 [==============================] - 93s - loss: 1.1813 - acc: 0.5763 - val_loss: 1.1389 - val_acc: 0.5974
Epoch 12/350
50000/50000 [==============================] - 93s - loss: 1.1493 - acc: 0.5900 - val_loss: 1.0917 - val_acc: 0.6136
Epoch 13/350
50000/50000 [==============================] - 93s - loss: 1.1211 - acc: 0.6009 - val_loss: 1.0892 - val_acc: 0.6162
Epoch 14/350
50000/50000 [==============================] - 93s - loss: 1.1008 - acc: 0.6070 - val_loss: 1.0934 - val_acc: 0.6118
Epoch 15/350
50000/50000 [==============================] - 93s - loss: 1.0738 - acc: 0.6201 - val_loss: 1.0263 - val_acc: 0.6350
Epoch 16/350
50000/50000 [==============================] - 93s - loss: 1.0509 - acc: 0.6296 - val_loss: 1.0201 - val_acc: 0.6386
Epoch 17/350
50000/50000 [==============================] - 93s - loss: 1.0333 - acc: 0.6331 - val_loss: 0.9933 - val_acc: 0.6480
Epoch 18/350
50000/50000 [==============================] - 93s - loss: 1.0125 - acc: 0.6405 - val_loss: 0.9838 - val_acc: 0.6552
Epoch 19/350
50000/50000 [==============================] - 93s - loss: 0.9887 - acc: 0.6489 - val_loss: 0.9735 - val_acc: 0.6536
Epoch 20/350
50000/50000 [==============================] - 93s - loss: 0.9726 - acc: 0.6572 - val_loss: 0.9817 - val_acc: 0.6501
Epoch 21/350
50000/50000 [==============================] - 93s - loss: 0.9558 - acc: 0.6637 - val_loss: 0.9261 - val_acc: 0.6737
Epoch 22/350
50000/50000 [==============================] - 93s - loss: 0.9376 - acc: 0.6685 - val_loss: 0.9240 - val_acc: 0.6678
Epoch 23/350
50000/50000 [==============================] - 93s - loss: 0.9331 - acc: 0.6720 - val_loss: 0.9142 - val_acc: 0.6751
Epoch 24/350
50000/50000 [==============================] - 93s - loss: 0.9064 - acc: 0.6813 - val_loss: 0.9189 - val_acc: 0.6717
Epoch 25/350
50000/50000 [==============================] - 93s - loss: 0.8913 - acc: 0.6845 - val_loss: 0.8738 - val_acc: 0.6908
Epoch 26/350
50000/50000 [==============================] - 93s - loss: 0.8805 - acc: 0.6899 - val_loss: 0.8706 - val_acc: 0.6887
Epoch 27/350
50000/50000 [==============================] - 93s - loss: 0.8642 - acc: 0.6978 - val_loss: 0.8663 - val_acc: 0.6911
Epoch 28/350
50000/50000 [==============================] - 93s - loss: 0.8518 - acc: 0.6995 - val_loss: 0.8401 - val_acc: 0.7024
Epoch 29/350
50000/50000 [==============================] - 93s - loss: 0.8415 - acc: 0.7037 - val_loss: 0.8268 - val_acc: 0.7104
Epoch 30/350
50000/50000 [==============================] - 93s - loss: 0.8272 - acc: 0.7090 - val_loss: 0.8339 - val_acc: 0.7075
Epoch 31/350
50000/50000 [==============================] - 93s - loss: 0.8131 - acc: 0.7142 - val_loss: 0.8077 - val_acc: 0.7183
Epoch 32/350
50000/50000 [==============================] - 93s - loss: 0.7993 - acc: 0.7188 - val_loss: 0.8632 - val_acc: 0.6934
Epoch 33/350
50000/50000 [==============================] - 93s - loss: 0.7936 - acc: 0.7195 - val_loss: 0.7951 - val_acc: 0.7247
Epoch 34/350
50000/50000 [==============================] - 93s - loss: 0.7852 - acc: 0.7229 - val_loss: 0.7753 - val_acc: 0.7267
Epoch 35/350
50000/50000 [==============================] - 93s - loss: 0.7719 - acc: 0.7279 - val_loss: 0.8116 - val_acc: 0.7141
Epoch 36/350
50000/50000 [==============================] - 93s - loss: 0.7569 - acc: 0.7343 - val_loss: 0.7882 - val_acc: 0.7218
Epoch 37/350
50000/50000 [==============================] - 93s - loss: 0.7486 - acc: 0.7364 - val_loss: 0.7836 - val_acc: 0.7247
Epoch 38/350
50000/50000 [==============================] - 93s - loss: 0.7446 - acc: 0.7409 - val_loss: 0.7781 - val_acc: 0.7215
Epoch 39/350
50000/50000 [==============================] - 93s - loss: 0.7346 - acc: 0.7410 - val_loss: 0.7509 - val_acc: 0.7292
Epoch 40/350
50000/50000 [==============================] - 93s - loss: 0.7213 - acc: 0.7458 - val_loss: 0.7410 - val_acc: 0.7415
Epoch 41/350
50000/50000 [==============================] - 93s - loss: 0.7103 - acc: 0.7490 - val_loss: 0.7229 - val_acc: 0.7466
Epoch 42/350
50000/50000 [==============================] - 93s - loss: 0.7015 - acc: 0.7541 - val_loss: 0.7380 - val_acc: 0.7401
Epoch 43/350
50000/50000 [==============================] - 93s - loss: 0.6990 - acc: 0.7529 - val_loss: 0.7170 - val_acc: 0.7471
Epoch 44/350
50000/50000 [==============================] - 93s - loss: 0.6875 - acc: 0.7590 - val_loss: 0.7139 - val_acc: 0.7516
Epoch 45/350
50000/50000 [==============================] - 93s - loss: 0.6794 - acc: 0.7628 - val_loss: 0.7038 - val_acc: 0.7514
Epoch 46/350
50000/50000 [==============================] - 93s - loss: 0.6687 - acc: 0.7642 - val_loss: 0.6959 - val_acc: 0.7565
Epoch 47/350
50000/50000 [==============================] - 93s - loss: 0.6634 - acc: 0.7664 - val_loss: 0.7046 - val_acc: 0.7542
Epoch 48/350
50000/50000 [==============================] - 93s - loss: 0.6527 - acc: 0.7699 - val_loss: 0.6942 - val_acc: 0.7569
Epoch 49/350
50000/50000 [==============================] - 93s - loss: 0.6489 - acc: 0.7711 - val_loss: 0.6980 - val_acc: 0.7562
Epoch 50/350
50000/50000 [==============================] - 93s - loss: 0.6443 - acc: 0.7743 - val_loss: 0.7010 - val_acc: 0.7571
Epoch 51/350
50000/50000 [==============================] - 93s - loss: 0.6344 - acc: 0.7784 - val_loss: 0.6661 - val_acc: 0.7670
Epoch 52/350
50000/50000 [==============================] - 93s - loss: 0.6284 - acc: 0.7799 - val_loss: 0.6616 - val_acc: 0.7710
Epoch 53/350
50000/50000 [==============================] - 92s - loss: 0.6176 - acc: 0.7843 - val_loss: 0.6822 - val_acc: 0.7631
Epoch 54/350
50000/50000 [==============================] - 93s - loss: 0.6108 - acc: 0.7839 - val_loss: 0.6507 - val_acc: 0.7739
Epoch 55/350
50000/50000 [==============================] - 92s - loss: 0.6072 - acc: 0.7861 - val_loss: 0.6573 - val_acc: 0.7722
Epoch 56/350
50000/50000 [==============================] - 93s - loss: 0.5988 - acc: 0.7891 - val_loss: 0.6493 - val_acc: 0.7726
Epoch 57/350
50000/50000 [==============================] - 93s - loss: 0.5924 - acc: 0.7916 - val_loss: 0.6314 - val_acc: 0.7810
Epoch 58/350
50000/50000 [==============================] - 92s - loss: 0.5829 - acc: 0.7953 - val_loss: 0.6408 - val_acc: 0.7810
Epoch 59/350
50000/50000 [==============================] - 93s - loss: 0.5773 - acc: 0.7961 - val_loss: 0.6301 - val_acc: 0.7829
Epoch 60/350
50000/50000 [==============================] - 93s - loss: 0.5714 - acc: 0.7986 - val_loss: 0.6344 - val_acc: 0.7801
Epoch 61/350
50000/50000 [==============================] - 92s - loss: 0.5677 - acc: 0.8006 - val_loss: 0.6471 - val_acc: 0.7729
Epoch 62/350
50000/50000 [==============================] - 92s - loss: 0.5593 - acc: 0.8032 - val_loss: 0.6270 - val_acc: 0.7828
Epoch 63/350
50000/50000 [==============================] - 92s - loss: 0.5523 - acc: 0.8049 - val_loss: 0.6107 - val_acc: 0.7918
Epoch 64/350
50000/50000 [==============================] - 92s - loss: 0.5479 - acc: 0.8081 - val_loss: 0.6430 - val_acc: 0.7778
Epoch 65/350
50000/50000 [==============================] - 92s - loss: 0.5440 - acc: 0.8084 - val_loss: 0.6222 - val_acc: 0.7872
Epoch 66/350
50000/50000 [==============================] - 92s - loss: 0.5350 - acc: 0.8139 - val_loss: 0.6283 - val_acc: 0.7849
Epoch 67/350
50000/50000 [==============================] - 92s - loss: 0.5343 - acc: 0.8111 - val_loss: 0.6106 - val_acc: 0.7896
Epoch 68/350
50000/50000 [==============================] - 92s - loss: 0.5230 - acc: 0.8165 - val_loss: 0.6019 - val_acc: 0.7927
Epoch 69/350
50000/50000 [==============================] - 92s - loss: 0.5184 - acc: 0.8173 - val_loss: 0.6141 - val_acc: 0.7896
Epoch 70/350
50000/50000 [==============================] - 93s - loss: 0.5131 - acc: 0.8198 - val_loss: 0.5987 - val_acc: 0.7937
Epoch 71/350
50000/50000 [==============================] - 92s - loss: 0.5126 - acc: 0.8189 - val_loss: 0.6088 - val_acc: 0.7903
Epoch 72/350
50000/50000 [==============================] - 92s - loss: 0.5034 - acc: 0.8221 - val_loss: 0.5936 - val_acc: 0.7987
Epoch 73/350
50000/50000 [==============================] - 92s - loss: 0.4999 - acc: 0.8243 - val_loss: 0.5851 - val_acc: 0.8012
Epoch 74/350
50000/50000 [==============================] - 92s - loss: 0.4949 - acc: 0.8259 - val_loss: 0.5772 - val_acc: 0.8009
Epoch 75/350
50000/50000 [==============================] - 92s - loss: 0.4849 - acc: 0.8296 - val_loss: 0.5770 - val_acc: 0.7996
Epoch 76/350
50000/50000 [==============================] - 92s - loss: 0.4863 - acc: 0.8285 - val_loss: 0.5791 - val_acc: 0.8031
Epoch 77/350
50000/50000 [==============================] - 92s - loss: 0.4781 - acc: 0.8319 - val_loss: 0.6049 - val_acc: 0.7967
Epoch 78/350
50000/50000 [==============================] - 92s - loss: 0.4684 - acc: 0.8347 - val_loss: 0.5797 - val_acc: 0.8055
Epoch 79/350
50000/50000 [==============================] - 92s - loss: 0.4685 - acc: 0.8332 - val_loss: 0.5667 - val_acc: 0.8107
Epoch 80/350
50000/50000 [==============================] - 92s - loss: 0.4625 - acc: 0.8379 - val_loss: 0.5870 - val_acc: 0.8010
Epoch 81/350
50000/50000 [==============================] - 92s - loss: 0.4624 - acc: 0.8382 - val_loss: 0.5859 - val_acc: 0.8033
Epoch 82/350
50000/50000 [==============================] - 92s - loss: 0.4498 - acc: 0.8396 - val_loss: 0.5777 - val_acc: 0.8060
Epoch 83/350
50000/50000 [==============================] - 92s - loss: 0.4511 - acc: 0.8394 - val_loss: 0.5773 - val_acc: 0.8045
Epoch 84/350
50000/50000 [==============================] - 92s - loss: 0.4417 - acc: 0.8435 - val_loss: 0.5757 - val_acc: 0.8070
Epoch 85/350
50000/50000 [==============================] - 92s - loss: 0.4375 - acc: 0.8458 - val_loss: 0.5603 - val_acc: 0.8091
Epoch 86/350
50000/50000 [==============================] - 92s - loss: 0.4349 - acc: 0.8453 - val_loss: 0.5862 - val_acc: 0.8017
Epoch 87/350
50000/50000 [==============================] - 92s - loss: 0.4335 - acc: 0.8459 - val_loss: 0.5648 - val_acc: 0.8110
Epoch 88/350
50000/50000 [==============================] - 92s - loss: 0.4226 - acc: 0.8495 - val_loss: 0.5849 - val_acc: 0.8053
Epoch 89/350
50000/50000 [==============================] - 93s - loss: 0.4234 - acc: 0.8502 - val_loss: 0.5581 - val_acc: 0.8156
Epoch 90/350
50000/50000 [==============================] - 92s - loss: 0.4177 - acc: 0.8515 - val_loss: 0.5608 - val_acc: 0.8135
Epoch 91/350
50000/50000 [==============================] - 92s - loss: 0.4152 - acc: 0.8527 - val_loss: 0.5627 - val_acc: 0.8154
Epoch 92/350
50000/50000 [==============================] - 92s - loss: 0.4080 - acc: 0.8563 - val_loss: 0.5709 - val_acc: 0.8147
Epoch 93/350
50000/50000 [==============================] - 92s - loss: 0.4049 - acc: 0.8569 - val_loss: 0.5975 - val_acc: 0.8005
Epoch 94/350
50000/50000 [==============================] - 92s - loss: 0.4014 - acc: 0.8584 - val_loss: 0.5541 - val_acc: 0.8141
Epoch 95/350
50000/50000 [==============================] - 92s - loss: 0.3930 - acc: 0.8610 - val_loss: 0.5626 - val_acc: 0.8097
Epoch 96/350
50000/50000 [==============================] - 93s - loss: 0.3918 - acc: 0.8613 - val_loss: 0.5510 - val_acc: 0.8154
Epoch 97/350
50000/50000 [==============================] - 92s - loss: 0.3912 - acc: 0.8615 - val_loss: 0.5429 - val_acc: 0.8177
Epoch 98/350
50000/50000 [==============================] - 93s - loss: 0.3813 - acc: 0.8663 - val_loss: 0.5717 - val_acc: 0.8140
Epoch 99/350
50000/50000 [==============================] - 92s - loss: 0.3817 - acc: 0.8642 - val_loss: 0.5754 - val_acc: 0.8109
Epoch 100/350
50000/50000 [==============================] - 93s - loss: 0.3748 - acc: 0.8677 - val_loss: 0.5641 - val_acc: 0.8169
Epoch 101/350
50000/50000 [==============================] - 92s - loss: 0.3722 - acc: 0.8684 - val_loss: 0.5516 - val_acc: 0.8216
Epoch 102/350
50000/50000 [==============================] - 92s - loss: 0.3689 - acc: 0.8690 - val_loss: 0.5633 - val_acc: 0.8149
Epoch 103/350
50000/50000 [==============================] - 93s - loss: 0.3676 - acc: 0.8699 - val_loss: 0.5605 - val_acc: 0.8162
Epoch 104/350
50000/50000 [==============================] - 92s - loss: 0.3614 - acc: 0.8724 - val_loss: 0.5700 - val_acc: 0.8149
Epoch 105/350
50000/50000 [==============================] - 93s - loss: 0.3589 - acc: 0.8727 - val_loss: 0.5465 - val_acc: 0.8244
Epoch 106/350
50000/50000 [==============================] - 92s - loss: 0.3581 - acc: 0.8718 - val_loss: 0.5659 - val_acc: 0.8203
Epoch 107/350
50000/50000 [==============================] - 92s - loss: 0.3538 - acc: 0.8741 - val_loss: 0.5489 - val_acc: 0.8196
Epoch 108/350
50000/50000 [==============================] - 92s - loss: 0.3445 - acc: 0.8768 - val_loss: 0.5492 - val_acc: 0.8170
Epoch 109/350
50000/50000 [==============================] - 92s - loss: 0.3440 - acc: 0.8784 - val_loss: 0.5593 - val_acc: 0.8224
Epoch 110/350
50000/50000 [==============================] - 92s - loss: 0.3412 - acc: 0.8778 - val_loss: 0.5445 - val_acc: 0.8265
Epoch 111/350
50000/50000 [==============================] - 92s - loss: 0.3320 - acc: 0.8820 - val_loss: 0.5371 - val_acc: 0.8252
Epoch 112/350
50000/50000 [==============================] - 92s - loss: 0.3294 - acc: 0.8836 - val_loss: 0.5409 - val_acc: 0.8249
Epoch 113/350
50000/50000 [==============================] - 92s - loss: 0.3326 - acc: 0.8814 - val_loss: 0.5327 - val_acc: 0.8266
Epoch 114/350
50000/50000 [==============================] - 92s - loss: 0.3290 - acc: 0.8821 - val_loss: 0.5484 - val_acc: 0.8231
Epoch 115/350
50000/50000 [==============================] - 92s - loss: 0.3248 - acc: 0.8841 - val_loss: 0.5596 - val_acc: 0.8262
Epoch 116/350
50000/50000 [==============================] - 92s - loss: 0.3221 - acc: 0.8856 - val_loss: 0.5542 - val_acc: 0.8237
Epoch 117/350
50000/50000 [==============================] - 92s - loss: 0.3191 - acc: 0.8869 - val_loss: 0.5624 - val_acc: 0.8199
Epoch 118/350
50000/50000 [==============================] - 92s - loss: 0.3188 - acc: 0.8863 - val_loss: 0.5632 - val_acc: 0.8219
Epoch 119/350
50000/50000 [==============================] - 92s - loss: 0.3100 - acc: 0.8903 - val_loss: 0.5550 - val_acc: 0.8246
Epoch 120/350
50000/50000 [==============================] - 92s - loss: 0.3089 - acc: 0.8891 - val_loss: 0.5314 - val_acc: 0.8290
Epoch 121/350
50000/50000 [==============================] - 92s - loss: 0.3045 - acc: 0.8914 - val_loss: 0.5624 - val_acc: 0.8232
Epoch 122/350
50000/50000 [==============================] - 92s - loss: 0.3032 - acc: 0.8913 - val_loss: 0.5478 - val_acc: 0.8294
Epoch 123/350
50000/50000 [==============================] - 92s - loss: 0.3008 - acc: 0.8916 - val_loss: 0.5579 - val_acc: 0.8269
Epoch 124/350
50000/50000 [==============================] - 92s - loss: 0.2926 - acc: 0.8943 - val_loss: 0.5533 - val_acc: 0.8263
Epoch 125/350
50000/50000 [==============================] - 92s - loss: 0.2952 - acc: 0.8951 - val_loss: 0.5524 - val_acc: 0.8277
Epoch 126/350
50000/50000 [==============================] - 92s - loss: 0.2953 - acc: 0.8947 - val_loss: 0.5345 - val_acc: 0.8289
Epoch 127/350
50000/50000 [==============================] - 92s - loss: 0.2840 - acc: 0.8988 - val_loss: 0.5551 - val_acc: 0.8270
Epoch 128/350
50000/50000 [==============================] - 92s - loss: 0.2848 - acc: 0.8968 - val_loss: 0.5613 - val_acc: 0.8241
Epoch 129/350
50000/50000 [==============================] - 92s - loss: 0.2856 - acc: 0.8964 - val_loss: 0.5461 - val_acc: 0.8273
Epoch 130/350
50000/50000 [==============================] - 92s - loss: 0.2766 - acc: 0.9003 - val_loss: 0.5487 - val_acc: 0.8315
Epoch 131/350
50000/50000 [==============================] - 92s - loss: 0.2816 - acc: 0.8992 - val_loss: 0.5496 - val_acc: 0.8273
Epoch 132/350
50000/50000 [==============================] - 92s - loss: 0.2734 - acc: 0.9023 - val_loss: 0.5642 - val_acc: 0.8269
Epoch 133/350
50000/50000 [==============================] - 92s - loss: 0.2735 - acc: 0.9020 - val_loss: 0.5447 - val_acc: 0.8322
Epoch 134/350
50000/50000 [==============================] - 92s - loss: 0.2661 - acc: 0.9035 - val_loss: 0.5651 - val_acc: 0.8266
Epoch 135/350
50000/50000 [==============================] - 92s - loss: 0.2679 - acc: 0.9050 - val_loss: 0.5477 - val_acc: 0.8356
Epoch 136/350
50000/50000 [==============================] - 92s - loss: 0.2628 - acc: 0.9044 - val_loss: 0.5560 - val_acc: 0.8339
Epoch 137/350
50000/50000 [==============================] - 92s - loss: 0.2581 - acc: 0.9080 - val_loss: 0.5797 - val_acc: 0.8285
Epoch 138/350
50000/50000 [==============================] - 92s - loss: 0.2619 - acc: 0.9066 - val_loss: 0.5419 - val_acc: 0.8325
Epoch 139/350
50000/50000 [==============================] - 92s - loss: 0.2575 - acc: 0.9068 - val_loss: 0.5415 - val_acc: 0.8329
Epoch 140/350
50000/50000 [==============================] - 92s - loss: 0.2516 - acc: 0.9089 - val_loss: 0.5311 - val_acc: 0.8345
Epoch 141/350
50000/50000 [==============================] - 92s - loss: 0.2457 - acc: 0.9123 - val_loss: 0.5604 - val_acc: 0.8276
Epoch 142/350
50000/50000 [==============================] - 92s - loss: 0.2535 - acc: 0.9084 - val_loss: 0.5466 - val_acc: 0.8327
Epoch 143/350
50000/50000 [==============================] - 92s - loss: 0.2477 - acc: 0.9114 - val_loss: 0.5667 - val_acc: 0.8311
Epoch 144/350
50000/50000 [==============================] - 92s - loss: 0.2447 - acc: 0.9111 - val_loss: 0.5431 - val_acc: 0.8346
Epoch 145/350
50000/50000 [==============================] - 92s - loss: 0.2419 - acc: 0.9141 - val_loss: 0.5668 - val_acc: 0.8340
Epoch 146/350
50000/50000 [==============================] - 92s - loss: 0.2402 - acc: 0.9127 - val_loss: 0.5616 - val_acc: 0.8321
Epoch 147/350
50000/50000 [==============================] - 92s - loss: 0.2416 - acc: 0.9127 - val_loss: 0.5525 - val_acc: 0.8342
Epoch 148/350
50000/50000 [==============================] - 92s - loss: 0.2349 - acc: 0.9157 - val_loss: 0.5451 - val_acc: 0.8361
Epoch 149/350
50000/50000 [==============================] - 92s - loss: 0.2370 - acc: 0.9134 - val_loss: 0.5544 - val_acc: 0.8337
Epoch 150/350
50000/50000 [==============================] - 92s - loss: 0.2277 - acc: 0.9184 - val_loss: 0.5793 - val_acc: 0.8320
Epoch 151/350
50000/50000 [==============================] - 92s - loss: 0.2312 - acc: 0.9164 - val_loss: 0.5551 - val_acc: 0.8361
Epoch 152/350
50000/50000 [==============================] - 92s - loss: 0.2301 - acc: 0.9172 - val_loss: 0.5539 - val_acc: 0.8358
Epoch 153/350
50000/50000 [==============================] - 92s - loss: 0.2273 - acc: 0.9179 - val_loss: 0.5858 - val_acc: 0.8273
Epoch 154/350
50000/50000 [==============================] - 92s - loss: 0.2177 - acc: 0.9210 - val_loss: 0.5663 - val_acc: 0.8340
Epoch 155/350
50000/50000 [==============================] - 92s - loss: 0.2216 - acc: 0.9200 - val_loss: 0.5668 - val_acc: 0.8355
Epoch 156/350
50000/50000 [==============================] - 92s - loss: 0.2192 - acc: 0.9210 - val_loss: 0.5643 - val_acc: 0.8327
Epoch 157/350
50000/50000 [==============================] - 92s - loss: 0.2162 - acc: 0.9223 - val_loss: 0.5952 - val_acc: 0.8260
Epoch 158/350
50000/50000 [==============================] - 92s - loss: 0.2152 - acc: 0.9218 - val_loss: 0.5705 - val_acc: 0.8363
Epoch 159/350
50000/50000 [==============================] - 92s - loss: 0.2162 - acc: 0.9212 - val_loss: 0.5829 - val_acc: 0.8278
Epoch 160/350
50000/50000 [==============================] - 92s - loss: 0.2140 - acc: 0.9232 - val_loss: 0.5511 - val_acc: 0.8359
Epoch 161/350
50000/50000 [==============================] - 92s - loss: 0.2093 - acc: 0.9236 - val_loss: 0.5714 - val_acc: 0.8355
Epoch 162/350
50000/50000 [==============================] - 92s - loss: 0.2079 - acc: 0.9248 - val_loss: 0.5636 - val_acc: 0.8359
Epoch 163/350
50000/50000 [==============================] - 92s - loss: 0.2062 - acc: 0.9246 - val_loss: 0.5744 - val_acc: 0.8316
Epoch 164/350
50000/50000 [==============================] - 92s - loss: 0.2020 - acc: 0.9262 - val_loss: 0.5714 - val_acc: 0.8345
Epoch 165/350
50000/50000 [==============================] - 92s - loss: 0.2025 - acc: 0.9273 - val_loss: 0.5749 - val_acc: 0.8321
Epoch 166/350
50000/50000 [==============================] - 92s - loss: 0.1986 - acc: 0.9281 - val_loss: 0.5825 - val_acc: 0.8362
Epoch 167/350
50000/50000 [==============================] - 92s - loss: 0.1984 - acc: 0.9282 - val_loss: 0.6042 - val_acc: 0.8321
Epoch 168/350
50000/50000 [==============================] - 92s - loss: 0.1958 - acc: 0.9281 - val_loss: 0.5854 - val_acc: 0.8352
Epoch 169/350
50000/50000 [==============================] - 92s - loss: 0.1980 - acc: 0.9286 - val_loss: 0.5865 - val_acc: 0.8400
Epoch 170/350
50000/50000 [==============================] - 92s - loss: 0.1962 - acc: 0.9292 - val_loss: 0.5832 - val_acc: 0.8369
Epoch 171/350
50000/50000 [==============================] - 92s - loss: 0.1929 - acc: 0.9307 - val_loss: 0.5767 - val_acc: 0.8416
Epoch 172/350
50000/50000 [==============================] - 92s - loss: 0.1887 - acc: 0.9294 - val_loss: 0.5681 - val_acc: 0.8391
Epoch 173/350
50000/50000 [==============================] - 92s - loss: 0.1964 - acc: 0.9289 - val_loss: 0.5921 - val_acc: 0.8398
Epoch 174/350
50000/50000 [==============================] - 92s - loss: 0.1881 - acc: 0.9311 - val_loss: 0.6142 - val_acc: 0.8283
Epoch 175/350
50000/50000 [==============================] - 92s - loss: 0.1875 - acc: 0.9311 - val_loss: 0.5754 - val_acc: 0.8372
Epoch 176/350
50000/50000 [==============================] - 92s - loss: 0.1846 - acc: 0.9321 - val_loss: 0.6213 - val_acc: 0.8342
Epoch 177/350
50000/50000 [==============================] - 92s - loss: 0.1818 - acc: 0.9336 - val_loss: 0.5922 - val_acc: 0.8324
Epoch 178/350
50000/50000 [==============================] - 92s - loss: 0.1839 - acc: 0.9339 - val_loss: 0.5896 - val_acc: 0.8371
Epoch 179/350
50000/50000 [==============================] - 92s - loss: 0.1822 - acc: 0.9334 - val_loss: 0.6121 - val_acc: 0.8365
Epoch 180/350
50000/50000 [==============================] - 92s - loss: 0.1789 - acc: 0.9353 - val_loss: 0.5709 - val_acc: 0.8385
Epoch 181/350
50000/50000 [==============================] - 92s - loss: 0.1816 - acc: 0.9346 - val_loss: 0.5820 - val_acc: 0.8379
Epoch 182/350
50000/50000 [==============================] - 92s - loss: 0.1785 - acc: 0.9351 - val_loss: 0.5989 - val_acc: 0.8381
Epoch 183/350
50000/50000 [==============================] - 92s - loss: 0.1764 - acc: 0.9369 - val_loss: 0.5926 - val_acc: 0.8373
Epoch 184/350
50000/50000 [==============================] - 92s - loss: 0.1712 - acc: 0.9364 - val_loss: 0.5842 - val_acc: 0.8415
Epoch 185/350
50000/50000 [==============================] - 92s - loss: 0.1706 - acc: 0.9395 - val_loss: 0.5914 - val_acc: 0.8404
Epoch 186/350
50000/50000 [==============================] - 92s - loss: 0.1731 - acc: 0.9371 - val_loss: 0.5997 - val_acc: 0.8363
Epoch 187/350
50000/50000 [==============================] - 92s - loss: 0.1717 - acc: 0.9382 - val_loss: 0.5938 - val_acc: 0.8376
Epoch 188/350
50000/50000 [==============================] - 92s - loss: 0.1675 - acc: 0.9383 - val_loss: 0.6077 - val_acc: 0.8358
Epoch 189/350
50000/50000 [==============================] - 92s - loss: 0.1694 - acc: 0.9385 - val_loss: 0.5811 - val_acc: 0.8388
Epoch 190/350
50000/50000 [==============================] - 92s - loss: 0.1629 - acc: 0.9400 - val_loss: 0.6183 - val_acc: 0.8336
Epoch 191/350
50000/50000 [==============================] - 92s - loss: 0.1666 - acc: 0.9404 - val_loss: 0.6177 - val_acc: 0.8342
Epoch 192/350
50000/50000 [==============================] - 92s - loss: 0.1676 - acc: 0.9393 - val_loss: 0.5861 - val_acc: 0.8410
Epoch 193/350
50000/50000 [==============================] - 92s - loss: 0.1584 - acc: 0.9407 - val_loss: 0.6432 - val_acc: 0.8341
Epoch 194/350
50000/50000 [==============================] - 92s - loss: 0.1630 - acc: 0.9407 - val_loss: 0.6130 - val_acc: 0.8393
Epoch 195/350
50000/50000 [==============================] - 92s - loss: 0.1557 - acc: 0.9440 - val_loss: 0.6087 - val_acc: 0.8359
Epoch 196/350
50000/50000 [==============================] - 92s - loss: 0.1583 - acc: 0.9424 - val_loss: 0.6358 - val_acc: 0.8351
Epoch 197/350
50000/50000 [==============================] - 92s - loss: 0.1574 - acc: 0.9434 - val_loss: 0.6256 - val_acc: 0.8384
Epoch 198/350
50000/50000 [==============================] - 92s - loss: 0.1535 - acc: 0.9447 - val_loss: 0.6227 - val_acc: 0.8442
Epoch 199/350
50000/50000 [==============================] - 92s - loss: 0.1581 - acc: 0.9433 - val_loss: 0.6158 - val_acc: 0.8424
Epoch 200/350
50000/50000 [==============================] - 92s - loss: 0.1494 - acc: 0.9473 - val_loss: 0.6381 - val_acc: 0.8303
Epoch 201/350
50000/50000 [==============================] - 92s - loss: 0.1514 - acc: 0.9458 - val_loss: 0.6174 - val_acc: 0.8386
Epoch 202/350
50000/50000 [==============================] - 92s - loss: 0.1475 - acc: 0.9471 - val_loss: 0.6705 - val_acc: 0.8273
Epoch 203/350
50000/50000 [==============================] - 92s - loss: 0.1547 - acc: 0.9437 - val_loss: 0.6054 - val_acc: 0.8438
Epoch 204/350
50000/50000 [==============================] - 92s - loss: 0.1484 - acc: 0.9470 - val_loss: 0.6308 - val_acc: 0.8362
Epoch 205/350
50000/50000 [==============================] - 92s - loss: 0.1480 - acc: 0.9460 - val_loss: 0.5994 - val_acc: 0.8419
Epoch 206/350
50000/50000 [==============================] - 92s - loss: 0.1479 - acc: 0.9458 - val_loss: 0.6151 - val_acc: 0.8432
Epoch 207/350
50000/50000 [==============================] - 92s - loss: 0.1473 - acc: 0.9471 - val_loss: 0.6082 - val_acc: 0.8412
Epoch 208/350
50000/50000 [==============================] - 92s - loss: 0.1427 - acc: 0.9475 - val_loss: 0.6300 - val_acc: 0.8411
Epoch 209/350
50000/50000 [==============================] - 92s - loss: 0.1395 - acc: 0.9495 - val_loss: 0.6133 - val_acc: 0.8428
Epoch 210/350
50000/50000 [==============================] - 92s - loss: 0.1432 - acc: 0.9473 - val_loss: 0.6292 - val_acc: 0.8416
Epoch 211/350
50000/50000 [==============================] - 92s - loss: 0.1398 - acc: 0.9487 - val_loss: 0.6096 - val_acc: 0.8401
Epoch 212/350
50000/50000 [==============================] - 92s - loss: 0.1395 - acc: 0.9489 - val_loss: 0.6714 - val_acc: 0.8384
Epoch 213/350
50000/50000 [==============================] - 92s - loss: 0.1384 - acc: 0.9498 - val_loss: 0.6223 - val_acc: 0.8472
Epoch 214/350
50000/50000 [==============================] - 92s - loss: 0.1377 - acc: 0.9498 - val_loss: 0.6438 - val_acc: 0.8385
Epoch 215/350
50000/50000 [==============================] - 92s - loss: 0.1364 - acc: 0.9504 - val_loss: 0.6605 - val_acc: 0.8371
Epoch 216/350
50000/50000 [==============================] - 92s - loss: 0.1380 - acc: 0.9496 - val_loss: 0.6185 - val_acc: 0.8435
Epoch 217/350
50000/50000 [==============================] - 92s - loss: 0.1416 - acc: 0.9484 - val_loss: 0.6113 - val_acc: 0.8434
Epoch 218/350
50000/50000 [==============================] - 92s - loss: 0.1332 - acc: 0.9517 - val_loss: 0.6213 - val_acc: 0.8412
Epoch 219/350
50000/50000 [==============================] - 92s - loss: 0.1364 - acc: 0.9493 - val_loss: 0.6487 - val_acc: 0.8378
Epoch 220/350
50000/50000 [==============================] - 92s - loss: 0.1356 - acc: 0.9514 - val_loss: 0.6333 - val_acc: 0.8406
Epoch 221/350
50000/50000 [==============================] - 92s - loss: 0.1323 - acc: 0.9527 - val_loss: 0.6325 - val_acc: 0.8410
Epoch 222/350
50000/50000 [==============================] - 92s - loss: 0.1327 - acc: 0.9520 - val_loss: 0.6374 - val_acc: 0.8449
Epoch 223/350
50000/50000 [==============================] - 92s - loss: 0.1319 - acc: 0.9524 - val_loss: 0.6459 - val_acc: 0.8432
Epoch 224/350
50000/50000 [==============================] - 92s - loss: 0.1311 - acc: 0.9514 - val_loss: 0.6455 - val_acc: 0.8391
Epoch 225/350
50000/50000 [==============================] - 92s - loss: 0.1305 - acc: 0.9523 - val_loss: 0.6514 - val_acc: 0.8385
Epoch 226/350
50000/50000 [==============================] - 92s - loss: 0.1289 - acc: 0.9535 - val_loss: 0.6287 - val_acc: 0.8457
Epoch 227/350
50000/50000 [==============================] - 92s - loss: 0.1257 - acc: 0.9543 - val_loss: 0.6612 - val_acc: 0.8379
Epoch 228/350
50000/50000 [==============================] - 92s - loss: 0.1285 - acc: 0.9529 - val_loss: 0.6384 - val_acc: 0.8438
Epoch 229/350
50000/50000 [==============================] - 92s - loss: 0.1235 - acc: 0.9537 - val_loss: 0.6327 - val_acc: 0.8416
Epoch 230/350
50000/50000 [==============================] - 92s - loss: 0.1273 - acc: 0.9541 - val_loss: 0.6348 - val_acc: 0.8437
Epoch 231/350
50000/50000 [==============================] - 92s - loss: 0.1217 - acc: 0.9561 - val_loss: 0.6386 - val_acc: 0.8437
Epoch 232/350
50000/50000 [==============================] - 92s - loss: 0.1220 - acc: 0.9553 - val_loss: 0.6514 - val_acc: 0.8386
Epoch 233/350
50000/50000 [==============================] - 92s - loss: 0.1232 - acc: 0.9558 - val_loss: 0.6436 - val_acc: 0.8439
Epoch 234/350
50000/50000 [==============================] - 90s - loss: 0.1242 - acc: 0.9550 - val_loss: 0.6398 - val_acc: 0.8414
Epoch 235/350
50000/50000 [==============================] - 90s - loss: 0.1220 - acc: 0.9545 - val_loss: 0.6335 - val_acc: 0.8422
Epoch 236/350
50000/50000 [==============================] - 89s - loss: 0.1197 - acc: 0.9575 - val_loss: 0.6330 - val_acc: 0.8403
Epoch 237/350
50000/50000 [==============================] - 89s - loss: 0.1164 - acc: 0.9576 - val_loss: 0.6734 - val_acc: 0.8368
Epoch 238/350
50000/50000 [==============================] - 89s - loss: 0.1196 - acc: 0.9570 - val_loss: 0.6270 - val_acc: 0.8407
Epoch 239/350
50000/50000 [==============================] - 89s - loss: 0.1212 - acc: 0.9563 - val_loss: 0.6449 - val_acc: 0.8446
Epoch 240/350
50000/50000 [==============================] - 89s - loss: 0.1151 - acc: 0.9578 - val_loss: 0.6420 - val_acc: 0.8458
Epoch 241/350
50000/50000 [==============================] - 89s - loss: 0.1130 - acc: 0.9587 - val_loss: 0.6580 - val_acc: 0.8429
Epoch 242/350
50000/50000 [==============================] - 89s - loss: 0.1172 - acc: 0.9574 - val_loss: 0.6744 - val_acc: 0.8401
Epoch 243/350
50000/50000 [==============================] - 89s - loss: 0.1162 - acc: 0.9580 - val_loss: 0.6335 - val_acc: 0.8485
Epoch 244/350
50000/50000 [==============================] - 89s - loss: 0.1138 - acc: 0.9581 - val_loss: 0.6201 - val_acc: 0.8468
Epoch 245/350
50000/50000 [==============================] - 89s - loss: 0.1185 - acc: 0.9569 - val_loss: 0.6458 - val_acc: 0.8421
Epoch 246/350
50000/50000 [==============================] - 89s - loss: 0.1149 - acc: 0.9594 - val_loss: 0.6516 - val_acc: 0.8447
Epoch 247/350
50000/50000 [==============================] - 89s - loss: 0.1126 - acc: 0.9584 - val_loss: 0.6875 - val_acc: 0.8345
Epoch 248/350
50000/50000 [==============================] - 89s - loss: 0.1104 - acc: 0.9598 - val_loss: 0.6549 - val_acc: 0.8446
Epoch 249/350
50000/50000 [==============================] - 89s - loss: 0.1126 - acc: 0.9598 - val_loss: 0.6285 - val_acc: 0.8425
Epoch 250/350
50000/50000 [==============================] - 89s - loss: 0.1092 - acc: 0.9603 - val_loss: 0.6685 - val_acc: 0.8401
Epoch 251/350
50000/50000 [==============================] - 89s - loss: 0.1073 - acc: 0.9610 - val_loss: 0.6560 - val_acc: 0.8472
Epoch 252/350
50000/50000 [==============================] - 89s - loss: 0.1126 - acc: 0.9594 - val_loss: 0.6392 - val_acc: 0.8446
Epoch 253/350
50000/50000 [==============================] - 89s - loss: 0.1077 - acc: 0.9608 - val_loss: 0.6499 - val_acc: 0.8453
Epoch 254/350
50000/50000 [==============================] - 89s - loss: 0.1080 - acc: 0.9612 - val_loss: 0.6430 - val_acc: 0.8471
Epoch 255/350
50000/50000 [==============================] - 89s - loss: 0.1056 - acc: 0.9627 - val_loss: 0.6512 - val_acc: 0.8454
Epoch 256/350
50000/50000 [==============================] - 89s - loss: 0.1055 - acc: 0.9619 - val_loss: 0.6502 - val_acc: 0.8449
Epoch 257/350
50000/50000 [==============================] - 89s - loss: 0.1077 - acc: 0.9602 - val_loss: 0.6729 - val_acc: 0.8414
Epoch 258/350
50000/50000 [==============================] - 89s - loss: 0.1034 - acc: 0.9624 - val_loss: 0.6405 - val_acc: 0.8481
Epoch 259/350
50000/50000 [==============================] - 89s - loss: 0.1038 - acc: 0.9627 - val_loss: 0.6261 - val_acc: 0.8512
Epoch 260/350
50000/50000 [==============================] - 89s - loss: 0.1021 - acc: 0.9630 - val_loss: 0.6408 - val_acc: 0.8448
Epoch 261/350
50000/50000 [==============================] - 89s - loss: 0.1024 - acc: 0.9626 - val_loss: 0.6540 - val_acc: 0.8446
Epoch 262/350
50000/50000 [==============================] - 89s - loss: 0.1019 - acc: 0.9623 - val_loss: 0.6779 - val_acc: 0.8429
Epoch 263/350
50000/50000 [==============================] - 89s - loss: 0.1037 - acc: 0.9620 - val_loss: 0.6564 - val_acc: 0.8476
Epoch 264/350
50000/50000 [==============================] - 89s - loss: 0.1039 - acc: 0.9626 - val_loss: 0.6613 - val_acc: 0.8460
Epoch 265/350
50000/50000 [==============================] - 89s - loss: 0.0984 - acc: 0.9642 - val_loss: 0.6737 - val_acc: 0.8465
Epoch 266/350
50000/50000 [==============================] - 89s - loss: 0.1022 - acc: 0.9633 - val_loss: 0.6873 - val_acc: 0.8403
Epoch 267/350
50000/50000 [==============================] - 89s - loss: 0.0999 - acc: 0.9638 - val_loss: 0.6452 - val_acc: 0.8483
Epoch 268/350
50000/50000 [==============================] - 89s - loss: 0.1053 - acc: 0.9613 - val_loss: 0.6521 - val_acc: 0.8448
Epoch 269/350
50000/50000 [==============================] - 89s - loss: 0.0994 - acc: 0.9637 - val_loss: 0.6268 - val_acc: 0.8469
Epoch 270/350
50000/50000 [==============================] - 89s - loss: 0.0999 - acc: 0.9644 - val_loss: 0.6922 - val_acc: 0.8385
Epoch 271/350
50000/50000 [==============================] - 89s - loss: 0.1007 - acc: 0.9628 - val_loss: 0.6256 - val_acc: 0.8477
Epoch 272/350
50000/50000 [==============================] - 89s - loss: 0.1001 - acc: 0.9637 - val_loss: 0.6827 - val_acc: 0.8450
Epoch 273/350
50000/50000 [==============================] - 89s - loss: 0.0979 - acc: 0.9653 - val_loss: 0.6530 - val_acc: 0.8437
Epoch 274/350
50000/50000 [==============================] - 89s - loss: 0.0994 - acc: 0.9640 - val_loss: 0.6634 - val_acc: 0.8446
Epoch 275/350
50000/50000 [==============================] - 89s - loss: 0.0961 - acc: 0.9663 - val_loss: 0.6528 - val_acc: 0.8473
Epoch 276/350
50000/50000 [==============================] - 89s - loss: 0.0958 - acc: 0.9657 - val_loss: 0.6477 - val_acc: 0.8496
Epoch 277/350
50000/50000 [==============================] - 89s - loss: 0.0922 - acc: 0.9665 - val_loss: 0.6732 - val_acc: 0.8483
Epoch 278/350
50000/50000 [==============================] - 89s - loss: 0.0943 - acc: 0.9653 - val_loss: 0.6514 - val_acc: 0.8486
Epoch 279/350
50000/50000 [==============================] - 89s - loss: 0.0903 - acc: 0.9675 - val_loss: 0.6771 - val_acc: 0.8428
Epoch 280/350
50000/50000 [==============================] - 89s - loss: 0.0973 - acc: 0.9657 - val_loss: 0.6811 - val_acc: 0.8471
Epoch 281/350
50000/50000 [==============================] - 89s - loss: 0.0946 - acc: 0.9650 - val_loss: 0.6726 - val_acc: 0.8479
Epoch 282/350
50000/50000 [==============================] - 89s - loss: 0.0957 - acc: 0.9660 - val_loss: 0.6505 - val_acc: 0.8479
Epoch 283/350
50000/50000 [==============================] - 89s - loss: 0.0944 - acc: 0.9660 - val_loss: 0.6901 - val_acc: 0.8442
Epoch 284/350
50000/50000 [==============================] - 89s - loss: 0.0913 - acc: 0.9668 - val_loss: 0.6532 - val_acc: 0.8479
Epoch 285/350
50000/50000 [==============================] - 89s - loss: 0.0910 - acc: 0.9672 - val_loss: 0.6872 - val_acc: 0.8457
Epoch 286/350
50000/50000 [==============================] - 89s - loss: 0.0916 - acc: 0.9669 - val_loss: 0.6833 - val_acc: 0.8463
Epoch 287/350
50000/50000 [==============================] - 89s - loss: 0.0913 - acc: 0.9677 - val_loss: 0.6862 - val_acc: 0.8414
Epoch 288/350
50000/50000 [==============================] - 89s - loss: 0.0893 - acc: 0.9680 - val_loss: 0.6681 - val_acc: 0.8462
Epoch 289/350
50000/50000 [==============================] - 89s - loss: 0.0879 - acc: 0.9685 - val_loss: 0.7099 - val_acc: 0.8453
Epoch 290/350
50000/50000 [==============================] - 89s - loss: 0.0914 - acc: 0.9673 - val_loss: 0.6410 - val_acc: 0.8477
Epoch 291/350
50000/50000 [==============================] - 89s - loss: 0.0858 - acc: 0.9687 - val_loss: 0.6655 - val_acc: 0.8458
Epoch 292/350
50000/50000 [==============================] - 89s - loss: 0.0884 - acc: 0.9684 - val_loss: 0.6761 - val_acc: 0.8493
Epoch 293/350
50000/50000 [==============================] - 89s - loss: 0.0877 - acc: 0.9681 - val_loss: 0.6866 - val_acc: 0.8483
Epoch 294/350
50000/50000 [==============================] - 89s - loss: 0.0879 - acc: 0.9688 - val_loss: 0.6640 - val_acc: 0.8512
Epoch 295/350
50000/50000 [==============================] - 89s - loss: 0.0873 - acc: 0.9686 - val_loss: 0.6879 - val_acc: 0.8438
Epoch 296/350
50000/50000 [==============================] - 89s - loss: 0.0884 - acc: 0.9681 - val_loss: 0.6866 - val_acc: 0.8498
Epoch 297/350
50000/50000 [==============================] - 89s - loss: 0.0836 - acc: 0.9693 - val_loss: 0.7283 - val_acc: 0.8439
Epoch 298/350
50000/50000 [==============================] - 89s - loss: 0.0818 - acc: 0.9712 - val_loss: 0.7063 - val_acc: 0.8439
Epoch 299/350
50000/50000 [==============================] - 89s - loss: 0.0876 - acc: 0.9684 - val_loss: 0.6757 - val_acc: 0.8423
Epoch 300/350
50000/50000 [==============================] - 89s - loss: 0.0869 - acc: 0.9690 - val_loss: 0.6929 - val_acc: 0.8479
Epoch 301/350
50000/50000 [==============================] - 89s - loss: 0.0878 - acc: 0.9685 - val_loss: 0.6819 - val_acc: 0.8455
Epoch 302/350
50000/50000 [==============================] - 89s - loss: 0.0806 - acc: 0.9706 - val_loss: 0.6702 - val_acc: 0.8478
Epoch 303/350
50000/50000 [==============================] - 89s - loss: 0.0857 - acc: 0.9694 - val_loss: 0.6633 - val_acc: 0.8499
Epoch 304/350
50000/50000 [==============================] - 89s - loss: 0.0802 - acc: 0.9716 - val_loss: 0.7191 - val_acc: 0.8430
Epoch 305/350
50000/50000 [==============================] - 89s - loss: 0.0877 - acc: 0.9687 - val_loss: 0.6628 - val_acc: 0.8492
Epoch 306/350
50000/50000 [==============================] - 89s - loss: 0.0803 - acc: 0.9716 - val_loss: 0.6711 - val_acc: 0.8491
Epoch 307/350
50000/50000 [==============================] - 89s - loss: 0.0830 - acc: 0.9706 - val_loss: 0.6971 - val_acc: 0.8436
Epoch 308/350
50000/50000 [==============================] - 89s - loss: 0.0830 - acc: 0.9705 - val_loss: 0.6570 - val_acc: 0.8476
Epoch 309/350
50000/50000 [==============================] - 90s - loss: 0.0816 - acc: 0.9714 - val_loss: 0.6913 - val_acc: 0.8427
Epoch 310/350
50000/50000 [==============================] - 91s - loss: 0.0823 - acc: 0.9712 - val_loss: 0.6845 - val_acc: 0.8448
Epoch 311/350
50000/50000 [==============================] - 91s - loss: 0.0795 - acc: 0.9706 - val_loss: 0.6899 - val_acc: 0.8467
Epoch 312/350
50000/50000 [==============================] - 92s - loss: 0.0855 - acc: 0.9694 - val_loss: 0.6879 - val_acc: 0.8488
Epoch 313/350
50000/50000 [==============================] - 92s - loss: 0.0809 - acc: 0.9714 - val_loss: 0.6743 - val_acc: 0.8499
Epoch 314/350
50000/50000 [==============================] - 92s - loss: 0.0759 - acc: 0.9737 - val_loss: 0.6672 - val_acc: 0.8517
Epoch 315/350
50000/50000 [==============================] - 92s - loss: 0.0787 - acc: 0.9718 - val_loss: 0.6892 - val_acc: 0.8470
Epoch 316/350
50000/50000 [==============================] - 91s - loss: 0.0779 - acc: 0.9714 - val_loss: 0.6854 - val_acc: 0.8505
Epoch 317/350
50000/50000 [==============================] - 92s - loss: 0.0811 - acc: 0.9706 - val_loss: 0.6608 - val_acc: 0.8505
Epoch 318/350
50000/50000 [==============================] - 92s - loss: 0.0765 - acc: 0.9726 - val_loss: 0.6994 - val_acc: 0.8458
Epoch 319/350
50000/50000 [==============================] - 92s - loss: 0.0797 - acc: 0.9718 - val_loss: 0.6663 - val_acc: 0.8511
Epoch 320/350
50000/50000 [==============================] - 92s - loss: 0.0791 - acc: 0.9722 - val_loss: 0.6816 - val_acc: 0.8514
Epoch 321/350
50000/50000 [==============================] - 92s - loss: 0.0810 - acc: 0.9715 - val_loss: 0.6982 - val_acc: 0.8463
Epoch 322/350
50000/50000 [==============================] - 92s - loss: 0.0772 - acc: 0.9719 - val_loss: 0.6967 - val_acc: 0.8492
Epoch 323/350
50000/50000 [==============================] - 92s - loss: 0.0760 - acc: 0.9721 - val_loss: 0.6865 - val_acc: 0.8487
Epoch 324/350
50000/50000 [==============================] - 92s - loss: 0.0766 - acc: 0.9729 - val_loss: 0.6872 - val_acc: 0.8469
Epoch 325/350
50000/50000 [==============================] - 92s - loss: 0.0772 - acc: 0.9723 - val_loss: 0.6707 - val_acc: 0.8480
Epoch 326/350
50000/50000 [==============================] - 92s - loss: 0.0705 - acc: 0.9746 - val_loss: 0.6955 - val_acc: 0.8483
Epoch 327/350
50000/50000 [==============================] - 92s - loss: 0.0790 - acc: 0.9713 - val_loss: 0.6830 - val_acc: 0.8501
Epoch 328/350
50000/50000 [==============================] - 92s - loss: 0.0754 - acc: 0.9732 - val_loss: 0.6807 - val_acc: 0.8478
Epoch 329/350
50000/50000 [==============================] - 91s - loss: 0.0749 - acc: 0.9734 - val_loss: 0.7042 - val_acc: 0.8450
Epoch 330/350
50000/50000 [==============================] - 91s - loss: 0.0758 - acc: 0.9727 - val_loss: 0.7086 - val_acc: 0.8480
Epoch 331/350
50000/50000 [==============================] - 91s - loss: 0.0744 - acc: 0.9733 - val_loss: 0.6931 - val_acc: 0.8495
Epoch 332/350
50000/50000 [==============================] - 92s - loss: 0.0773 - acc: 0.9723 - val_loss: 0.7019 - val_acc: 0.8481
Epoch 333/350
50000/50000 [==============================] - 92s - loss: 0.0727 - acc: 0.9743 - val_loss: 0.6886 - val_acc: 0.8499
Epoch 334/350
50000/50000 [==============================] - 92s - loss: 0.0761 - acc: 0.9724 - val_loss: 0.6710 - val_acc: 0.8512
Epoch 335/350
50000/50000 [==============================] - 92s - loss: 0.0714 - acc: 0.9748 - val_loss: 0.7063 - val_acc: 0.8509
Epoch 336/350
50000/50000 [==============================] - 92s - loss: 0.0693 - acc: 0.9749 - val_loss: 0.7140 - val_acc: 0.8522
Epoch 337/350
50000/50000 [==============================] - 92s - loss: 0.0744 - acc: 0.9732 - val_loss: 0.6983 - val_acc: 0.8499
Epoch 338/350
50000/50000 [==============================] - 92s - loss: 0.0728 - acc: 0.9736 - val_loss: 0.7125 - val_acc: 0.8439
Epoch 339/350
50000/50000 [==============================] - 91s - loss: 0.0724 - acc: 0.9733 - val_loss: 0.7075 - val_acc: 0.8481
Epoch 340/350
50000/50000 [==============================] - 92s - loss: 0.0763 - acc: 0.9722 - val_loss: 0.6878 - val_acc: 0.8486
Epoch 341/350
50000/50000 [==============================] - 92s - loss: 0.0709 - acc: 0.9752 - val_loss: 0.7004 - val_acc: 0.8497
Epoch 342/350
50000/50000 [==============================] - 91s - loss: 0.0695 - acc: 0.9754 - val_loss: 0.6897 - val_acc: 0.8520
Epoch 343/350
50000/50000 [==============================] - 92s - loss: 0.0677 - acc: 0.9757 - val_loss: 0.7112 - val_acc: 0.8456
Epoch 344/350
50000/50000 [==============================] - 91s - loss: 0.0724 - acc: 0.9744 - val_loss: 0.7068 - val_acc: 0.8503
Epoch 345/350
50000/50000 [==============================] - 92s - loss: 0.0697 - acc: 0.9749 - val_loss: 0.7034 - val_acc: 0.8504
Epoch 346/350
50000/50000 [==============================] - 91s - loss: 0.0739 - acc: 0.9745 - val_loss: 0.6881 - val_acc: 0.8445
Epoch 347/350
50000/50000 [==============================] - 91s - loss: 0.0694 - acc: 0.9758 - val_loss: 0.7061 - val_acc: 0.8487
Epoch 348/350
50000/50000 [==============================] - 91s - loss: 0.0697 - acc: 0.9748 - val_loss: 0.7144 - val_acc: 0.8509
Epoch 349/350
50000/50000 [==============================] - 92s - loss: 0.0678 - acc: 0.9759 - val_loss: 0.7058 - val_acc: 0.8495
Epoch 350/350
50000/50000 [==============================] - 92s - loss: 0.0710 - acc: 0.9752 - val_loss: 0.7187 - val_acc: 0.8496

Evaluate the model


In [14]:
scores = model.evaluate(images_test, class_test, verbose=0)
print("Accuracy: %.2f%%" % (scores[1]*100))


Accuracy: 84.96%
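The overall test accuracy is 84.96%, but a single number can hide large differences between classes; CIFAR-10 models typically confuse cats and dogs far more often than, say, trucks and frogs. A per-class breakdown is easy to compute from integer label arrays. The sketch below uses tiny made-up labels rather than the notebook's variables; with the real data you would pass `labels_test` and the argmax of the model's predictions.

```python
import numpy as np

def per_class_accuracy(y_true, y_pred, num_classes):
    """Fraction of correct predictions for each class label."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    accs = []
    for c in range(num_classes):
        mask = (y_true == c)  # all test samples whose true class is c
        accs.append(float((y_pred[mask] == c).mean()) if mask.any() else 0.0)
    return accs

# Tiny made-up example: 3 classes, 6 samples
true_labels = [0, 0, 1, 1, 2, 2]
pred_labels = [0, 1, 1, 1, 2, 0]
print(per_class_accuracy(true_labels, pred_labels, 3))  # [0.5, 1.0, 0.5]
```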

Model accuracy and loss plots


In [15]:
plot_model(model_details)


2. Augment the data


In [19]:
datagen = ImageDataGenerator(
    featurewise_center=False,  # set input mean to 0 over the dataset
    samplewise_center=False,  # set each sample mean to 0
    featurewise_std_normalization=False,  # divide inputs by std of the dataset
    samplewise_std_normalization=False,  # divide each input by its std
    zca_whitening=False,  # apply ZCA whitening
    rotation_range=45,  # randomly rotate images by up to 45 degrees
    width_shift_range=0.2,  # randomly shift images horizontally (fraction of total width)
    height_shift_range=0.2,  # randomly shift images vertically (fraction of total height)
    horizontal_flip=True,  # randomly flip images horizontally
    vertical_flip=False)  # do not flip images vertically

datagen.fit(images_train)

The generator above applies random shifts, rotations and horizontal flips to the training images on the fly, so the model sees a slightly different variant of each image every epoch, effectively enlarging the dataset without storing extra copies.
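To make the on-the-fly transformations concrete, the sketch below reimplements two of them (horizontal flip and a fixed-size width shift) with plain NumPy on a toy single-channel "image". `ImageDataGenerator` does the same kind of thing for every batch it yields, with random parameters and interpolation on top.

```python
import numpy as np

def horizontal_flip(img):
    """Mirror an image left-to-right (axis 1 is width for HWC arrays)."""
    return img[:, ::-1]

def width_shift(img, shift):
    """Shift an image right by `shift` pixels, zero-filling the gap."""
    out = np.zeros_like(img)
    out[:, shift:] = img[:, :img.shape[1] - shift]
    return out

img = np.arange(9).reshape(3, 3)  # toy 3x3 "image"; row 0 is [0, 1, 2]
print(horizontal_flip(img)[0].tolist())  # [2, 1, 0]
print(width_shift(img, 1)[0].tolist())   # [0, 0, 1]
```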

Build model again


In [20]:
augmented_model = pure_cnn_model()


_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_19 (Conv2D)           (None, 32, 32, 96)        2688      
_________________________________________________________________
dropout_7 (Dropout)          (None, 32, 32, 96)        0         
_________________________________________________________________
conv2d_20 (Conv2D)           (None, 32, 32, 96)        83040     
_________________________________________________________________
conv2d_21 (Conv2D)           (None, 16, 16, 96)        83040     
_________________________________________________________________
dropout_8 (Dropout)          (None, 16, 16, 96)        0         
_________________________________________________________________
conv2d_22 (Conv2D)           (None, 16, 16, 192)       166080    
_________________________________________________________________
conv2d_23 (Conv2D)           (None, 16, 16, 192)       331968    
_________________________________________________________________
conv2d_24 (Conv2D)           (None, 8, 8, 192)         331968    
_________________________________________________________________
dropout_9 (Dropout)          (None, 8, 8, 192)         0         
_________________________________________________________________
conv2d_25 (Conv2D)           (None, 8, 8, 192)         331968    
_________________________________________________________________
activation_7 (Activation)    (None, 8, 8, 192)         0         
_________________________________________________________________
conv2d_26 (Conv2D)           (None, 8, 8, 192)         37056     
_________________________________________________________________
activation_8 (Activation)    (None, 8, 8, 192)         0         
_________________________________________________________________
conv2d_27 (Conv2D)           (None, 8, 8, 10)          1930      
_________________________________________________________________
global_average_pooling2d_3 ( (None, 10)                0         
_________________________________________________________________
activation_9 (Activation)    (None, 10)                0         
=================================================================
Total params: 1,369,738
Trainable params: 1,369,738
Non-trainable params: 0
_________________________________________________________________

Train model on the training data

Save the model after every epoch


In [21]:
augmented_checkpoint = ModelCheckpoint('augmented_best_model.h5',  # model filename
                             monitor='val_loss', # quantity to monitor
                             verbose=0, # verbosity - 0 or 1
                             save_best_only= True, # only save when val_loss improves on its best value so far
                             mode='auto') # infer from the monitored quantity whether
                                          # "best" means minimum or maximum
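With `save_best_only=True`, the checkpoint file is rewritten only on epochs where the monitored quantity (here `val_loss`) improves on its best value so far; all other epochs leave the file untouched. The decision logic amounts to the following sketch, where the `save` callable is a stand-in for writing the HDF5 file:

```python
def checkpoint_best(val_losses, save):
    """Call save(epoch, loss) only when val_loss improves on the best so far."""
    best = float("inf")
    for epoch, loss in enumerate(val_losses):
        if loss < best:  # 'min' mode, which mode='auto' infers for a loss
            best = loss
            save(epoch, loss)

saved = []
checkpoint_best([0.9, 0.7, 0.8, 0.6], lambda e, l: saved.append(e))
print(saved)  # epochs 0, 1 and 3 improved: [0, 1, 3]
```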

Configure the model for training


In [22]:
augmented_model.compile(loss='categorical_crossentropy', # standard loss for multi-class classification with one-hot labels
              optimizer=Adam(lr=LEARN_RATE), # Adam optimizer with 1.0e-4 learning rate
              metrics = ['accuracy']) # metrics to report during training

Fit the model on the data provided


In [23]:
augmented_model_details = augmented_model.fit_generator(datagen.flow(images_train, class_train, batch_size = 32),
                    steps_per_epoch = len(images_train) / 32, # number of batches per epoch
                    epochs = NUM_EPOCH, # number of passes over the training data
                    validation_data= (images_test, class_test),
                    callbacks=[augmented_checkpoint],
                    verbose=1)
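The odd `1563/1562` in the progress bars below comes from `steps_per_epoch = len(images_train) / 32`, which is the float 1562.5: Keras rounds the fractional step count up, so each epoch actually runs 1563 batches. Rounding up explicitly with `math.ceil` gives an integer step count that still covers every training image:

```python
import math

num_train = 50000  # CIFAR-10 training set size
batch_size = 32

steps = num_train / batch_size                 # 1562.5, a float
steps_int = math.ceil(num_train / batch_size)  # 1563 batches per epoch
print(steps, steps_int)  # 1562.5 1563
```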


Epoch 1/350
1563/1562 [==============================] - 107s - loss: 1.9689 - acc: 0.2589 - val_loss: 1.6575 - val_acc: 0.3786
Epoch 2/350
1563/1562 [==============================] - 106s - loss: 1.7189 - acc: 0.3684 - val_loss: 1.5286 - val_acc: 0.4464
Epoch 3/350
1563/1562 [==============================] - 106s - loss: 1.6102 - acc: 0.4114 - val_loss: 1.4007 - val_acc: 0.4890
Epoch 4/350
1563/1562 [==============================] - 106s - loss: 1.5129 - acc: 0.4503 - val_loss: 1.3066 - val_acc: 0.5261
Epoch 5/350
1563/1562 [==============================] - 106s - loss: 1.4430 - acc: 0.4780 - val_loss: 1.2456 - val_acc: 0.5559
Epoch 6/350
1563/1562 [==============================] - 106s - loss: 1.3788 - acc: 0.5016 - val_loss: 1.1631 - val_acc: 0.5806
Epoch 7/350
1563/1562 [==============================] - 106s - loss: 1.3361 - acc: 0.5168 - val_loss: 1.1373 - val_acc: 0.5938
Epoch 8/350
1563/1562 [==============================] - 106s - loss: 1.3005 - acc: 0.5305 - val_loss: 1.1376 - val_acc: 0.5951
Epoch 9/350
1563/1562 [==============================] - 106s - loss: 1.2734 - acc: 0.5428 - val_loss: 1.0886 - val_acc: 0.6090
Epoch 10/350
1563/1562 [==============================] - 106s - loss: 1.2414 - acc: 0.5546 - val_loss: 1.1023 - val_acc: 0.6093
Epoch 11/350
1563/1562 [==============================] - 106s - loss: 1.2222 - acc: 0.5649 - val_loss: 1.0647 - val_acc: 0.6253
Epoch 12/350
1563/1562 [==============================] - 106s - loss: 1.1988 - acc: 0.5700 - val_loss: 1.0680 - val_acc: 0.6242
Epoch 13/350
1563/1562 [==============================] - 106s - loss: 1.1748 - acc: 0.5838 - val_loss: 1.0642 - val_acc: 0.6246
Epoch 14/350
1563/1562 [==============================] - 106s - loss: 1.1596 - acc: 0.5861 - val_loss: 0.9812 - val_acc: 0.6453
Epoch 15/350
1563/1562 [==============================] - 106s - loss: 1.1411 - acc: 0.5958 - val_loss: 0.9417 - val_acc: 0.6695
Epoch 16/350
1563/1562 [==============================] - 106s - loss: 1.1209 - acc: 0.6035 - val_loss: 0.9407 - val_acc: 0.6682
Epoch 17/350
1563/1562 [==============================] - 106s - loss: 1.1083 - acc: 0.6053 - val_loss: 0.9901 - val_acc: 0.6606
Epoch 18/350
1563/1562 [==============================] - 106s - loss: 1.0900 - acc: 0.6135 - val_loss: 0.9502 - val_acc: 0.6693
Epoch 19/350
1563/1562 [==============================] - 106s - loss: 1.0718 - acc: 0.6192 - val_loss: 0.9558 - val_acc: 0.6719
Epoch 20/350
1563/1562 [==============================] - 106s - loss: 1.0609 - acc: 0.6211 - val_loss: 0.8963 - val_acc: 0.6818
Epoch 21/350
1563/1562 [==============================] - 106s - loss: 1.0495 - acc: 0.6267 - val_loss: 0.8861 - val_acc: 0.6889
Epoch 22/350
1563/1562 [==============================] - 106s - loss: 1.0368 - acc: 0.6332 - val_loss: 0.8915 - val_acc: 0.6813
Epoch 23/350
1563/1562 [==============================] - 106s - loss: 1.0223 - acc: 0.6377 - val_loss: 0.8900 - val_acc: 0.6872
Epoch 24/350
1563/1562 [==============================] - 106s - loss: 1.0204 - acc: 0.6396 - val_loss: 0.8309 - val_acc: 0.7074
Epoch 25/350
1563/1562 [==============================] - 106s - loss: 0.9972 - acc: 0.6490 - val_loss: 0.8353 - val_acc: 0.7068
Epoch 26/350
1563/1562 [==============================] - 106s - loss: 0.9936 - acc: 0.6514 - val_loss: 0.8420 - val_acc: 0.7086
Epoch 27/350
1563/1562 [==============================] - 106s - loss: 0.9826 - acc: 0.6540 - val_loss: 0.8441 - val_acc: 0.7101
Epoch 28/350
1563/1562 [==============================] - 106s - loss: 0.9618 - acc: 0.6597 - val_loss: 0.8333 - val_acc: 0.7092
Epoch 29/350
1563/1562 [==============================] - 106s - loss: 0.9592 - acc: 0.6620 - val_loss: 0.8251 - val_acc: 0.7212
Epoch 30/350
1563/1562 [==============================] - 106s - loss: 0.9447 - acc: 0.6660 - val_loss: 0.8106 - val_acc: 0.7202
Epoch 31/350
1563/1562 [==============================] - 106s - loss: 0.9332 - acc: 0.6715 - val_loss: 0.7898 - val_acc: 0.7246
Epoch 32/350
1563/1562 [==============================] - 106s - loss: 0.9260 - acc: 0.6734 - val_loss: 0.7786 - val_acc: 0.7334
Epoch 33/350
1563/1562 [==============================] - 106s - loss: 0.9128 - acc: 0.6795 - val_loss: 0.7982 - val_acc: 0.7214
Epoch 34/350
1563/1562 [==============================] - 106s - loss: 0.9074 - acc: 0.6825 - val_loss: 0.7505 - val_acc: 0.7361
Epoch 35/350
1563/1562 [==============================] - 106s - loss: 0.8946 - acc: 0.6872 - val_loss: 0.7615 - val_acc: 0.7358
...
Epoch 50/350
1563/1562 [==============================] - 106s - loss: 0.7696 - acc: 0.7297 - val_loss: 0.6761 - val_acc: 0.7724
...
Epoch 100/350
1563/1562 [==============================] - 106s - loss: 0.5504 - acc: 0.8082 - val_loss: 0.4913 - val_acc: 0.8376
...
Epoch 150/350
1563/1562 [==============================] - 106s - loss: 0.4361 - acc: 0.8465 - val_loss: 0.4598 - val_acc: 0.8523
...
Epoch 200/350
1563/1562 [==============================] - 105s - loss: 0.3764 - acc: 0.8675 - val_loss: 0.4285 - val_acc: 0.8650
...
Epoch 250/350
1563/1562 [==============================] - 105s - loss: 0.3332 - acc: 0.8806 - val_loss: 0.4150 - val_acc: 0.8734
...
Epoch 300/350
1563/1562 [==============================] - 105s - loss: 0.3036 - acc: 0.8917 - val_loss: 0.4032 - val_acc: 0.8757
...
Epoch 312/350
1563/1562 [==============================] - 105s - loss: 0.2956 - acc: 0.8960 - val_loss: 0.4032 - val_acc: 0.8775
Epoch 313/350
1563/1562 [==============================] - 105s - loss: 0.2893 - acc: 0.8966 - val_loss: 0.4205 - val_acc: 0.8744
Epoch 314/350
1563/1562 [==============================] - 105s - loss: 0.2930 - acc: 0.8971 - val_loss: 0.4083 - val_acc: 0.8781
Epoch 315/350
1563/1562 [==============================] - 105s - loss: 0.2960 - acc: 0.8961 - val_loss: 0.4095 - val_acc: 0.8757
Epoch 316/350
1563/1562 [==============================] - 105s - loss: 0.2956 - acc: 0.8960 - val_loss: 0.4079 - val_acc: 0.8764
Epoch 317/350
1563/1562 [==============================] - 105s - loss: 0.2934 - acc: 0.8964 - val_loss: 0.4111 - val_acc: 0.8750
Epoch 318/350
1563/1562 [==============================] - 105s - loss: 0.2931 - acc: 0.8966 - val_loss: 0.4439 - val_acc: 0.8681
Epoch 319/350
1563/1562 [==============================] - 105s - loss: 0.2939 - acc: 0.8957 - val_loss: 0.4227 - val_acc: 0.8692
Epoch 320/350
1563/1562 [==============================] - 105s - loss: 0.2929 - acc: 0.8969 - val_loss: 0.4244 - val_acc: 0.8786
Epoch 321/350
1563/1562 [==============================] - 105s - loss: 0.2902 - acc: 0.8980 - val_loss: 0.4408 - val_acc: 0.8707
Epoch 322/350
1563/1562 [==============================] - 105s - loss: 0.2910 - acc: 0.8964 - val_loss: 0.4418 - val_acc: 0.8702
Epoch 323/350
1563/1562 [==============================] - 105s - loss: 0.2924 - acc: 0.8960 - val_loss: 0.4137 - val_acc: 0.8776
Epoch 324/350
1563/1562 [==============================] - 105s - loss: 0.2875 - acc: 0.8985 - val_loss: 0.4455 - val_acc: 0.8679
Epoch 325/350
1563/1562 [==============================] - 105s - loss: 0.2902 - acc: 0.8971 - val_loss: 0.4118 - val_acc: 0.8781
Epoch 326/350
1563/1562 [==============================] - 105s - loss: 0.2868 - acc: 0.8987 - val_loss: 0.4291 - val_acc: 0.8734
Epoch 327/350
1563/1562 [==============================] - 105s - loss: 0.2853 - acc: 0.8977 - val_loss: 0.4115 - val_acc: 0.8759
Epoch 328/350
1563/1562 [==============================] - 105s - loss: 0.2876 - acc: 0.8973 - val_loss: 0.4282 - val_acc: 0.8748
Epoch 329/350
1563/1562 [==============================] - 105s - loss: 0.2875 - acc: 0.8993 - val_loss: 0.4202 - val_acc: 0.8742
Epoch 330/350
1563/1562 [==============================] - 105s - loss: 0.2848 - acc: 0.8999 - val_loss: 0.4235 - val_acc: 0.8710
Epoch 331/350
1563/1562 [==============================] - 105s - loss: 0.2881 - acc: 0.8980 - val_loss: 0.4144 - val_acc: 0.8785
Epoch 332/350
1563/1562 [==============================] - 105s - loss: 0.2855 - acc: 0.8984 - val_loss: 0.4383 - val_acc: 0.8726
Epoch 333/350
1563/1562 [==============================] - 105s - loss: 0.2868 - acc: 0.8989 - val_loss: 0.4320 - val_acc: 0.8709
Epoch 334/350
1563/1562 [==============================] - 105s - loss: 0.2861 - acc: 0.8985 - val_loss: 0.4165 - val_acc: 0.8714
Epoch 335/350
1563/1562 [==============================] - 105s - loss: 0.2829 - acc: 0.8993 - val_loss: 0.4358 - val_acc: 0.8712
Epoch 336/350
1563/1562 [==============================] - 105s - loss: 0.2837 - acc: 0.8982 - val_loss: 0.4180 - val_acc: 0.8717
Epoch 337/350
1563/1562 [==============================] - 105s - loss: 0.2847 - acc: 0.8981 - val_loss: 0.4331 - val_acc: 0.8681
Epoch 338/350
1563/1562 [==============================] - 105s - loss: 0.2821 - acc: 0.9014 - val_loss: 0.4294 - val_acc: 0.8753
Epoch 339/350
1563/1562 [==============================] - 105s - loss: 0.2801 - acc: 0.8994 - val_loss: 0.4193 - val_acc: 0.8774
Epoch 340/350
1563/1562 [==============================] - 105s - loss: 0.2785 - acc: 0.9013 - val_loss: 0.4315 - val_acc: 0.8734
Epoch 341/350
1563/1562 [==============================] - 105s - loss: 0.2780 - acc: 0.9022 - val_loss: 0.4006 - val_acc: 0.8793
Epoch 342/350
1563/1562 [==============================] - 105s - loss: 0.2855 - acc: 0.8991 - val_loss: 0.4078 - val_acc: 0.8775
Epoch 343/350
1563/1562 [==============================] - 105s - loss: 0.2858 - acc: 0.9004 - val_loss: 0.3926 - val_acc: 0.8797
Epoch 344/350
1563/1562 [==============================] - 105s - loss: 0.2776 - acc: 0.9013 - val_loss: 0.4074 - val_acc: 0.8750
Epoch 345/350
1563/1562 [==============================] - 105s - loss: 0.2818 - acc: 0.9007 - val_loss: 0.4279 - val_acc: 0.8755
Epoch 346/350
1563/1562 [==============================] - 105s - loss: 0.2834 - acc: 0.8986 - val_loss: 0.4003 - val_acc: 0.8760
Epoch 347/350
1563/1562 [==============================] - 105s - loss: 0.2812 - acc: 0.9013 - val_loss: 0.4039 - val_acc: 0.8771
Epoch 348/350
1563/1562 [==============================] - 105s - loss: 0.2780 - acc: 0.9021 - val_loss: 0.4160 - val_acc: 0.8751
Epoch 349/350
1563/1562 [==============================] - 105s - loss: 0.2835 - acc: 0.9000 - val_loss: 0.4493 - val_acc: 0.8697
Epoch 350/350
1563/1562 [==============================] - 105s - loss: 0.2795 - acc: 0.9029 - val_loss: 0.4150 - val_acc: 0.8794

Evaluate the model


In [24]:
scores = augmented_model.evaluate(images_test, class_test, verbose=0)
print("Accuracy: %.2f%%" % (scores[1]*100))


Accuracy: 87.94%

Model accuracy and loss plots


In [25]:
plot_model(augmented_model_details)


To improve the model further, train for more epochs and apply additional augmentations such as ZCA whitening.
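ZCA whitening decorrelates pixel values while keeping the result as close as possible to the original image (unlike PCA whitening, which rotates into eigenspace). Keras's ImageDataGenerator supports it via `zca_whitening=True`, but the transform itself is easy to sketch in plain NumPy. This is a minimal illustration, not the exact transform Keras applies; `zca_whiten` and `epsilon` are names chosen here for clarity.

```python
import numpy as np

def zca_whiten(images, epsilon=1e-5):
    """ZCA-whiten a batch of images of shape (N, H, W, C)."""
    flat = images.reshape(images.shape[0], -1).astype(np.float64)
    flat -= flat.mean(axis=0)                            # zero-center each feature
    cov = np.cov(flat, rowvar=False)                     # (D, D) covariance matrix
    U, S, _ = np.linalg.svd(cov)                         # eigendecomposition via SVD
    W = U @ np.diag(1.0 / np.sqrt(S + epsilon)) @ U.T    # ZCA whitening matrix
    return (flat @ W).reshape(images.shape)

# Tiny demo on random data standing in for image batches
batch = np.random.rand(200, 4, 4, 3)
whitened = zca_whiten(batch)
print(whitened.shape)
```

After whitening, the empirical covariance of the batch is close to the identity, which removes the strong correlations between neighbouring pixels before training.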

Predictions

Predict class for test set images


In [38]:
correct, labels_pred = predict_classes(augmented_model, images_test, labels_test)

Calculate accuracy manually


In [39]:
num_images = len(correct)
print("Accuracy: %.2f%%" % ((sum(correct)*100)/num_images))


Accuracy: 87.94%
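The manual calculation above reduces to counting matches between predicted and true labels. A self-contained sketch of the same computation (the helper `predict_classes` is assumed to return integer predictions; `manual_accuracy` is a name chosen here for illustration):

```python
import numpy as np

def manual_accuracy(labels_pred, labels_true):
    """Percentage of predictions that match the true labels."""
    correct = np.asarray(labels_pred) == np.asarray(labels_true)
    return correct.sum() * 100.0 / len(correct)

# 3 of 4 predictions match -> 75.00%
print("Accuracy: %.2f%%" % manual_accuracy([1, 2, 3, 0], [1, 2, 0, 0]))
```

This matches `model.evaluate` (87.94% above) exactly, since both simply average 0/1 correctness over the test set.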

Show some misclassifications

Plot the first nine misclassified images


In [40]:
visualize_errors(images_test, labels_test, class_names, labels_pred, correct)
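The selection step inside `visualize_errors` presumably filters the test set down to examples the model got wrong and plots the first nine. The index-selection logic can be sketched like this (the helper's real implementation may differ; `first_misclassified` is a hypothetical name):

```python
import numpy as np

def first_misclassified(correct, n=9):
    """Indices of the first n examples where the prediction was wrong."""
    wrong = np.flatnonzero(~np.asarray(correct, dtype=bool))
    return wrong[:n]

# Examples 1, 3 and 4 were wrong; take the first two of them
print(first_misclassified([True, False, True, False, False], n=2))
```

The resulting indices can then be passed to a plotting helper such as `plot_images` to display the images alongside their true and predicted class names.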