Real-Life Example: Classifying Speed Limit Signs


In [1]:
import warnings
warnings.filterwarnings('ignore')

In [2]:
%matplotlib inline
%pylab inline


Populating the interactive namespace from numpy and matplotlib

In [3]:
import matplotlib.pylab as plt
import numpy as np

In [4]:
from distutils.version import StrictVersion

In [5]:
import sklearn
print(sklearn.__version__)

assert StrictVersion(sklearn.__version__) >= StrictVersion('0.18.1')


0.18.1

In [6]:
import tensorflow as tf
tf.logging.set_verbosity(tf.logging.ERROR)
print(tf.__version__)

assert StrictVersion(tf.__version__) >= StrictVersion('1.1.0')


1.2.1

In [7]:
import keras
print(keras.__version__)

assert StrictVersion(keras.__version__) >= StrictVersion('2.0.6')


Using TensorFlow backend.
2.0.8

Loading and Preparing Data


In [8]:
!ls -l speed-limit-signs


total 28
drwxrwxr-x 2 ubuntu ubuntu 4096 Sep 27 15:25 0
drwxrwxr-x 2 ubuntu ubuntu 4096 Sep 27 15:25 1
drwxrwxr-x 2 ubuntu ubuntu 4096 Sep 27 15:25 2
drwxrwxr-x 2 ubuntu ubuntu 4096 Sep 27 15:25 3
drwxrwxr-x 2 ubuntu ubuntu 4096 Sep 27 15:25 4
drwxrwxr-x 2 ubuntu ubuntu 4096 Sep 27 15:25 5
-rw-rw-r-- 1 ubuntu ubuntu  380 Oct  1 08:09 README.md

In [9]:
!cat speed-limit-signs/README.md


Data extracted from http://benchmark.ini.rub.de/?section=gtsdb&subsection=dataset

From http://benchmark.ini.rub.de/Dataset_GTSDB/FullIJCNN2013.zip just the samples for the speed limit signs 

## Format

https://en.wikipedia.org/wiki/Netpbm_format

Can be previewed on a Mac and be processed by http://scikit-image.org/

## Labels
- 0: 30
- 1: 50
- 2: 70
- 3: 80
- 4: 100
- 5: 120

Big Kudos to Waleed Abdulla for providing the initial idea and many of the functions used to prepare and display the images: https://medium.com/@waleedka/traffic-sign-recognition-with-tensorflow-629dffc391a6#.i728o84ib


In [10]:
import os
import skimage.data
import skimage.transform
from keras.utils.np_utils import to_categorical
import numpy as np

def load_data(data_dir, ext=".ppm"):

    # Get all subdirectories of data_dir. Each represents a label.
    directories = [d for d in os.listdir(data_dir) 
                   if os.path.isdir(os.path.join(data_dir, d))]
    # Loop through the label directories and collect the data in
    # two lists, labels and images.
    labels = []
    images = []
    for d in directories:
        label_dir = os.path.join(data_dir, d)
        file_names = [os.path.join(label_dir, f) for f in os.listdir(label_dir) if f.endswith(ext)]
        # For each label, load its images and add them to the images list.
        # Also add the label number (i.e. the directory name) to the labels list.
        for f in file_names:
            images.append(skimage.data.imread(f))
            labels.append(int(d))
    images64 = [skimage.transform.resize(image, (64, 64)) for image in images]
    return images64, labels

In [11]:
# Load datasets.
ROOT_PATH = "./"
original_dir = os.path.join(ROOT_PATH, "speed-limit-signs")
images, labels = load_data(original_dir, ext=".ppm")

In [12]:
import matplotlib
import matplotlib.pyplot as plt

def display_images_and_labels(images, labels):
    """Display the first image of each label."""
    unique_labels = set(labels)
    plt.figure(figsize=(15, 15))
    i = 1
    for label in unique_labels:
        # Pick the first image for each label.
        image = images[labels.index(label)]
        plt.subplot(8, 8, i)  # A grid of 8 rows x 8 columns
        plt.axis('off')
        plt.title("Label {0} ({1})".format(label, labels.count(label)))
        i += 1
        _ = plt.imshow(image)

In [13]:
display_images_and_labels(images, labels)



In [14]:
# again a little bit of feature engineering

y = np.array(labels)
X = np.array(images)
from keras.utils.np_utils import to_categorical

num_categories = 6

y = to_categorical(y, num_categories)
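
For illustration, to_categorical turns each integer label into a one-hot vector of length num_categories:

to_categorical([2], 6)
# -> array([[ 0.,  0.,  1.,  0.,  0.,  0.]])  # label 2 ("70") becomes the third position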

Let's start by creating a minimal model that overfits a very small training set

http://cs231n.github.io/neural-networks-3/#sanitycheck


In [15]:
from sklearn.model_selection import train_test_split
# deliberately keep only 10% of the data for training, so the tiny set is easy to overfit
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.9, random_state=42, stratify=y)

In [16]:
X_train.shape, y_train.shape


Out[16]:
((37, 64, 64, 3), (37, 6))

This is what overfitting looks like in the metrics:

[Plot: Accuracy]

[Plot: Validation Accuracy]


In [17]:
# full architecture
# %load https://djcordhose.github.io/ai/fragments/vgg_style_no_dropout.py

# my sample minimized architecture
# %load https://djcordhose.github.io/ai/fragments/vgg_style_no_dropout_overfitting.py
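
Since the %load cells are shown commented out here, this is a minimal sketch that reproduces the summary below; the activations are an assumption, as the summary only records shapes and parameter counts:

from keras.layers import Input, Conv2D, MaxPooling2D, Flatten, Dense
from keras.models import Model

inputs = Input(shape=(64, 64, 3))
# one convolutional block, then aggressive pooling down to 8x8
x = Conv2D(64, (3, 3), activation='relu', padding='same')(inputs)
x = MaxPooling2D(pool_size=(2, 2))(x)
x = MaxPooling2D(pool_size=(2, 2))(x)
x = MaxPooling2D(pool_size=(2, 2))(x)
x = Flatten()(x)
x = Dense(64, activation='relu')(x)
predictions = Dense(6, activation='softmax')(x)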

In [18]:
model = Model(inputs=inputs, outputs=predictions)
model.summary()


_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_1 (InputLayer)         (None, 64, 64, 3)         0         
_________________________________________________________________
conv2d_1 (Conv2D)            (None, 64, 64, 64)        1792      
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 32, 32, 64)        0         
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 16, 16, 64)        0         
_________________________________________________________________
max_pooling2d_3 (MaxPooling2 (None, 8, 8, 64)          0         
_________________________________________________________________
flatten_1 (Flatten)          (None, 4096)              0         
_________________________________________________________________
dense_1 (Dense)              (None, 64)                262208    
_________________________________________________________________
dense_2 (Dense)              (None, 6)                 390       
=================================================================
Total params: 264,390
Trainable params: 264,390
Non-trainable params: 0
_________________________________________________________________

In [19]:
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

In [20]:
# Determines how many samples are used for training in one batch
# Depends on the GPU hardware; set as high as possible (this works well on a K80)
# Note: with only 29 training samples above, each epoch runs as a single batch
BATCH_SIZE = 500

In [21]:
%time model.fit(X_train, y_train, epochs=100, validation_split=0.2, batch_size=BATCH_SIZE)


Train on 29 samples, validate on 8 samples
Epoch 1/100
29/29 [==============================] - 3s - loss: 1.7860 - acc: 0.1724 - val_loss: 1.7473 - val_acc: 0.1250
Epoch 2/100
29/29 [==============================] - 0s - loss: 1.6542 - acc: 0.2414 - val_loss: 1.8353 - val_acc: 0.1250
Epoch 3/100
29/29 [==============================] - 0s - loss: 1.6456 - acc: 0.2414 - val_loss: 1.7074 - val_acc: 0.1250
Epoch 4/100
29/29 [==============================] - 0s - loss: 1.6047 - acc: 0.2414 - val_loss: 1.6178 - val_acc: 0.2500
Epoch 5/100
29/29 [==============================] - 0s - loss: 1.5856 - acc: 0.3448 - val_loss: 1.6116 - val_acc: 0.2500
Epoch 6/100
29/29 [==============================] - 0s - loss: 1.5599 - acc: 0.3793 - val_loss: 1.6550 - val_acc: 0.2500
Epoch 7/100
29/29 [==============================] - 0s - loss: 1.5202 - acc: 0.3448 - val_loss: 1.7536 - val_acc: 0.1250
Epoch 8/100
29/29 [==============================] - 0s - loss: 1.4941 - acc: 0.2414 - val_loss: 1.8083 - val_acc: 0.1250
Epoch 9/100
29/29 [==============================] - 0s - loss: 1.4720 - acc: 0.2414 - val_loss: 1.7727 - val_acc: 0.1250
Epoch 10/100
29/29 [==============================] - 0s - loss: 1.4365 - acc: 0.2759 - val_loss: 1.7011 - val_acc: 0.1250
Epoch 11/100
29/29 [==============================] - 0s - loss: 1.4058 - acc: 0.4828 - val_loss: 1.6494 - val_acc: 0.1250
Epoch 12/100
29/29 [==============================] - 0s - loss: 1.3778 - acc: 0.5517 - val_loss: 1.6351 - val_acc: 0.3750
Epoch 13/100
29/29 [==============================] - 0s - loss: 1.3436 - acc: 0.5862 - val_loss: 1.6606 - val_acc: 0.3750
Epoch 14/100
29/29 [==============================] - 0s - loss: 1.3139 - acc: 0.6207 - val_loss: 1.6695 - val_acc: 0.3750
Epoch 15/100
29/29 [==============================] - 0s - loss: 1.2886 - acc: 0.5517 - val_loss: 1.6163 - val_acc: 0.3750
Epoch 16/100
29/29 [==============================] - 0s - loss: 1.2563 - acc: 0.5517 - val_loss: 1.5506 - val_acc: 0.3750
Epoch 17/100
29/29 [==============================] - 0s - loss: 1.2261 - acc: 0.6207 - val_loss: 1.5317 - val_acc: 0.3750
Epoch 18/100
29/29 [==============================] - 0s - loss: 1.1973 - acc: 0.6897 - val_loss: 1.5708 - val_acc: 0.3750
Epoch 19/100
29/29 [==============================] - 0s - loss: 1.1642 - acc: 0.6552 - val_loss: 1.6221 - val_acc: 0.2500
Epoch 20/100
29/29 [==============================] - 0s - loss: 1.1359 - acc: 0.6897 - val_loss: 1.6025 - val_acc: 0.3750
Epoch 21/100
29/29 [==============================] - 0s - loss: 1.1037 - acc: 0.6897 - val_loss: 1.5390 - val_acc: 0.3750
Epoch 22/100
29/29 [==============================] - 0s - loss: 1.0709 - acc: 0.7931 - val_loss: 1.5016 - val_acc: 0.3750
Epoch 23/100
29/29 [==============================] - 0s - loss: 1.0397 - acc: 0.7931 - val_loss: 1.5255 - val_acc: 0.6250
Epoch 24/100
29/29 [==============================] - 0s - loss: 1.0078 - acc: 0.7586 - val_loss: 1.5580 - val_acc: 0.6250
Epoch 25/100
29/29 [==============================] - 0s - loss: 0.9788 - acc: 0.7931 - val_loss: 1.4956 - val_acc: 0.5000
Epoch 26/100
29/29 [==============================] - 0s - loss: 0.9448 - acc: 0.7931 - val_loss: 1.4869 - val_acc: 0.5000
Epoch 27/100
29/29 [==============================] - 0s - loss: 0.9144 - acc: 0.7931 - val_loss: 1.5509 - val_acc: 0.5000
Epoch 28/100
29/29 [==============================] - 0s - loss: 0.8817 - acc: 0.8621 - val_loss: 1.5645 - val_acc: 0.5000
Epoch 29/100
29/29 [==============================] - 0s - loss: 0.8512 - acc: 0.8621 - val_loss: 1.5056 - val_acc: 0.5000
Epoch 30/100
29/29 [==============================] - 0s - loss: 0.8205 - acc: 0.8621 - val_loss: 1.5111 - val_acc: 0.5000
Epoch 31/100
29/29 [==============================] - 0s - loss: 0.7901 - acc: 0.8966 - val_loss: 1.5708 - val_acc: 0.5000
Epoch 32/100
29/29 [==============================] - 0s - loss: 0.7613 - acc: 0.8276 - val_loss: 1.5815 - val_acc: 0.5000
Epoch 33/100
29/29 [==============================] - 0s - loss: 0.7323 - acc: 0.8276 - val_loss: 1.5396 - val_acc: 0.5000
Epoch 34/100
29/29 [==============================] - 0s - loss: 0.7038 - acc: 0.8276 - val_loss: 1.5515 - val_acc: 0.5000
Epoch 35/100
29/29 [==============================] - 0s - loss: 0.6764 - acc: 0.8276 - val_loss: 1.6329 - val_acc: 0.5000
Epoch 36/100
29/29 [==============================] - 0s - loss: 0.6489 - acc: 0.8276 - val_loss: 1.6899 - val_acc: 0.5000
Epoch 37/100
29/29 [==============================] - 0s - loss: 0.6228 - acc: 0.8276 - val_loss: 1.6933 - val_acc: 0.5000
Epoch 38/100
29/29 [==============================] - 0s - loss: 0.5971 - acc: 0.8966 - val_loss: 1.7185 - val_acc: 0.5000
Epoch 39/100
29/29 [==============================] - 0s - loss: 0.5721 - acc: 0.8966 - val_loss: 1.7845 - val_acc: 0.5000
Epoch 40/100
29/29 [==============================] - 0s - loss: 0.5471 - acc: 0.8966 - val_loss: 1.8448 - val_acc: 0.5000
Epoch 41/100
29/29 [==============================] - 0s - loss: 0.5229 - acc: 0.8966 - val_loss: 1.8789 - val_acc: 0.5000
Epoch 42/100
29/29 [==============================] - 0s - loss: 0.4987 - acc: 0.8966 - val_loss: 1.9114 - val_acc: 0.5000
Epoch 43/100
29/29 [==============================] - 0s - loss: 0.4746 - acc: 0.8966 - val_loss: 1.9800 - val_acc: 0.5000
Epoch 44/100
29/29 [==============================] - 0s - loss: 0.4509 - acc: 0.8966 - val_loss: 2.0781 - val_acc: 0.5000
Epoch 45/100
29/29 [==============================] - 0s - loss: 0.4277 - acc: 0.8966 - val_loss: 2.1741 - val_acc: 0.5000
Epoch 46/100
29/29 [==============================] - 0s - loss: 0.4060 - acc: 0.8966 - val_loss: 2.2035 - val_acc: 0.5000
Epoch 47/100
29/29 [==============================] - 0s - loss: 0.3856 - acc: 0.8621 - val_loss: 2.2525 - val_acc: 0.3750
Epoch 48/100
29/29 [==============================] - 0s - loss: 0.3655 - acc: 0.8966 - val_loss: 2.3531 - val_acc: 0.3750
Epoch 49/100
29/29 [==============================] - 0s - loss: 0.3458 - acc: 0.8966 - val_loss: 2.4157 - val_acc: 0.3750
Epoch 50/100
29/29 [==============================] - 0s - loss: 0.3271 - acc: 0.8966 - val_loss: 2.4287 - val_acc: 0.3750
Epoch 51/100
29/29 [==============================] - 0s - loss: 0.3093 - acc: 0.9310 - val_loss: 2.4971 - val_acc: 0.3750
Epoch 52/100
29/29 [==============================] - 0s - loss: 0.2923 - acc: 0.9310 - val_loss: 2.6036 - val_acc: 0.3750
Epoch 53/100
29/29 [==============================] - 0s - loss: 0.2760 - acc: 0.9310 - val_loss: 2.6911 - val_acc: 0.3750
Epoch 54/100
29/29 [==============================] - 0s - loss: 0.2608 - acc: 0.9310 - val_loss: 2.7609 - val_acc: 0.3750
Epoch 55/100
29/29 [==============================] - 0s - loss: 0.2463 - acc: 0.9310 - val_loss: 2.8088 - val_acc: 0.3750
Epoch 56/100
29/29 [==============================] - 0s - loss: 0.2323 - acc: 0.9310 - val_loss: 2.8386 - val_acc: 0.3750
Epoch 57/100
29/29 [==============================] - 0s - loss: 0.2193 - acc: 0.9310 - val_loss: 2.8719 - val_acc: 0.3750
Epoch 58/100
29/29 [==============================] - 0s - loss: 0.2074 - acc: 0.9655 - val_loss: 2.9438 - val_acc: 0.3750
Epoch 59/100
29/29 [==============================] - 0s - loss: 0.1962 - acc: 0.9655 - val_loss: 3.0297 - val_acc: 0.3750
Epoch 60/100
29/29 [==============================] - 0s - loss: 0.1855 - acc: 0.9655 - val_loss: 3.0864 - val_acc: 0.3750
Epoch 61/100
29/29 [==============================] - 0s - loss: 0.1757 - acc: 0.9655 - val_loss: 3.1221 - val_acc: 0.3750
Epoch 62/100
29/29 [==============================] - 0s - loss: 0.1666 - acc: 0.9655 - val_loss: 3.1810 - val_acc: 0.3750
Epoch 63/100
29/29 [==============================] - 0s - loss: 0.1580 - acc: 0.9655 - val_loss: 3.2575 - val_acc: 0.3750
Epoch 64/100
29/29 [==============================] - 0s - loss: 0.1498 - acc: 0.9655 - val_loss: 3.3388 - val_acc: 0.3750
Epoch 65/100
29/29 [==============================] - 0s - loss: 0.1420 - acc: 0.9655 - val_loss: 3.3992 - val_acc: 0.3750
Epoch 66/100
29/29 [==============================] - 0s - loss: 0.1354 - acc: 0.9655 - val_loss: 3.4219 - val_acc: 0.3750
Epoch 67/100
29/29 [==============================] - 0s - loss: 0.1280 - acc: 1.0000 - val_loss: 3.4744 - val_acc: 0.3750
Epoch 68/100
29/29 [==============================] - 0s - loss: 0.1220 - acc: 1.0000 - val_loss: 3.5733 - val_acc: 0.3750
Epoch 69/100
29/29 [==============================] - 0s - loss: 0.1157 - acc: 1.0000 - val_loss: 3.6973 - val_acc: 0.3750
Epoch 70/100
29/29 [==============================] - 0s - loss: 0.1100 - acc: 1.0000 - val_loss: 3.8220 - val_acc: 0.3750
Epoch 71/100
29/29 [==============================] - 0s - loss: 0.1044 - acc: 1.0000 - val_loss: 3.9156 - val_acc: 0.3750
Epoch 72/100
29/29 [==============================] - 0s - loss: 0.0993 - acc: 1.0000 - val_loss: 3.9784 - val_acc: 0.3750
Epoch 73/100
29/29 [==============================] - 0s - loss: 0.0943 - acc: 1.0000 - val_loss: 4.0077 - val_acc: 0.3750
Epoch 74/100
29/29 [==============================] - 0s - loss: 0.0897 - acc: 1.0000 - val_loss: 4.0379 - val_acc: 0.2500
Epoch 75/100
29/29 [==============================] - 0s - loss: 0.0854 - acc: 1.0000 - val_loss: 4.0862 - val_acc: 0.2500
Epoch 76/100
29/29 [==============================] - 0s - loss: 0.0812 - acc: 1.0000 - val_loss: 4.0998 - val_acc: 0.2500
Epoch 77/100
29/29 [==============================] - 0s - loss: 0.0772 - acc: 1.0000 - val_loss: 4.1106 - val_acc: 0.2500
Epoch 78/100
29/29 [==============================] - 0s - loss: 0.0734 - acc: 1.0000 - val_loss: 4.1455 - val_acc: 0.2500
Epoch 79/100
29/29 [==============================] - 0s - loss: 0.0698 - acc: 1.0000 - val_loss: 4.2061 - val_acc: 0.2500
Epoch 80/100
29/29 [==============================] - 0s - loss: 0.0664 - acc: 1.0000 - val_loss: 4.2761 - val_acc: 0.2500
Epoch 81/100
29/29 [==============================] - 0s - loss: 0.0632 - acc: 1.0000 - val_loss: 4.3343 - val_acc: 0.2500
Epoch 82/100
29/29 [==============================] - 0s - loss: 0.0601 - acc: 1.0000 - val_loss: 4.3594 - val_acc: 0.2500
Epoch 83/100
29/29 [==============================] - 0s - loss: 0.0572 - acc: 1.0000 - val_loss: 4.3634 - val_acc: 0.2500
Epoch 84/100
29/29 [==============================] - 0s - loss: 0.0544 - acc: 1.0000 - val_loss: 4.3472 - val_acc: 0.2500
Epoch 85/100
29/29 [==============================] - 0s - loss: 0.0517 - acc: 1.0000 - val_loss: 4.3410 - val_acc: 0.2500
Epoch 86/100
29/29 [==============================] - 0s - loss: 0.0492 - acc: 1.0000 - val_loss: 4.3626 - val_acc: 0.2500
Epoch 87/100
29/29 [==============================] - 0s - loss: 0.0469 - acc: 1.0000 - val_loss: 4.3989 - val_acc: 0.2500
Epoch 88/100
29/29 [==============================] - 0s - loss: 0.0447 - acc: 1.0000 - val_loss: 4.4337 - val_acc: 0.2500
Epoch 89/100
29/29 [==============================] - 0s - loss: 0.0425 - acc: 1.0000 - val_loss: 4.4737 - val_acc: 0.2500
Epoch 90/100
29/29 [==============================] - 0s - loss: 0.0405 - acc: 1.0000 - val_loss: 4.5096 - val_acc: 0.2500
Epoch 91/100
29/29 [==============================] - 0s - loss: 0.0387 - acc: 1.0000 - val_loss: 4.5324 - val_acc: 0.2500
Epoch 92/100
29/29 [==============================] - 0s - loss: 0.0368 - acc: 1.0000 - val_loss: 4.5398 - val_acc: 0.2500
Epoch 93/100
29/29 [==============================] - 0s - loss: 0.0352 - acc: 1.0000 - val_loss: 4.5522 - val_acc: 0.2500
Epoch 94/100
29/29 [==============================] - 0s - loss: 0.0336 - acc: 1.0000 - val_loss: 4.5662 - val_acc: 0.2500
Epoch 95/100
29/29 [==============================] - 0s - loss: 0.0321 - acc: 1.0000 - val_loss: 4.5820 - val_acc: 0.2500
Epoch 96/100
29/29 [==============================] - 0s - loss: 0.0307 - acc: 1.0000 - val_loss: 4.6129 - val_acc: 0.2500
Epoch 97/100
29/29 [==============================] - 0s - loss: 0.0293 - acc: 1.0000 - val_loss: 4.6469 - val_acc: 0.2500
Epoch 98/100
29/29 [==============================] - 0s - loss: 0.0281 - acc: 1.0000 - val_loss: 4.6748 - val_acc: 0.2500
Epoch 99/100
29/29 [==============================] - 0s - loss: 0.0268 - acc: 1.0000 - val_loss: 4.6913 - val_acc: 0.2500
Epoch 100/100
29/29 [==============================] - 0s - loss: 0.0257 - acc: 1.0000 - val_loss: 4.6924 - val_acc: 0.2500
CPU times: user 2.65 s, sys: 1.44 s, total: 4.09 s
Wall time: 5.61 s
Out[21]:
<keras.callbacks.History at 0x7ff39d853f60>
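
Training accuracy reaches 1.0 while validation accuracy stalls and validation loss climbs steadily. A sketch of how one might visualize this from the History object that fit() returns (re-running fit like this trains the model further; shown only for illustration):

history = model.fit(X_train, y_train, epochs=100, validation_split=0.2, batch_size=BATCH_SIZE)
plt.plot(history.history['acc'], label='acc')
plt.plot(history.history['val_acc'], label='val_acc')
plt.xlabel('epoch')
plt.legend()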

Hands-On: Create a minimal model

Step #1: Simplify the given architecture until you can no longer overfit on the small training set

  • reduce the number of epochs during training to 50 or even fewer to get quick experimentation cycles
  • reduce the number of layers
  • reduce the number of feature channels
  • make sure your model actually has fewer parameters than the original one (was 4,788,358); a quick check follows below
  • if you need a special challenge, you can write your model from scratch (you can always reload the original one using the prepared %load)
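
A quick sanity check on model size (count_params() is part of the Keras Model API):

assert model.count_params() < 4788358  # parameter count of the original architecture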

Now that we see the model at least has the basic capacity for the task, we have to get rid of the overfitting

How to avoid Overfitting using Dropout

  • A Dropout layer blacks out a certain percentage of its input neurons
  • With each update of the weights during training, a different set of neurons is chosen
  • The hope is to train different parts of the network with each iteration, avoiding overfitting
  • Dropout rate is typically between 40% and 75%
  • VGG adds Dropout after each convolutional block and after the fully connected layer (see the sketch below)
    • x = Dropout(0.5)(x)
  • this only applies to the training phase; at prediction time there is no such layer
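
A sketch of such a block in the Keras functional API:

from keras.layers import Input, Conv2D, MaxPooling2D, Dropout

inputs = Input(shape=(64, 64, 3))
x = Conv2D(64, (3, 3), activation='relu', padding='same')(inputs)
x = MaxPooling2D(pool_size=(2, 2))(x)
x = Dropout(0.5)(x)  # active during training only; a pass-through at prediction time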

Step #2: Train on the complete training set and make sure to still avoid overfitting by optimizing for val_acc

  • train on the complete training set
  • add dropout of 50% as described above
  • gradually make your model more complex while keeping overfitting in check
  • 90% and more of validation accuracy is possible
  • again, reduce the number of epochs to make the model trainable on your hardware (100 might work well)
  • if it does not show signs of converging early on, it is likely not complex enough
  • you can also start with a pre-defined architecture and make it less complex (again using the prepared %load)
  • save the trained model for later comparison (a sketch follows below)
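
A sketch of saving the trained model; the filename is an assumption:

model.save('conv-vgg-dropout.hdf5')  # stores architecture, weights, and optimizer state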


In [22]:
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42, stratify=y)

In [23]:
# https://keras.io/callbacks/#tensorboard
tb_callback = keras.callbacks.TensorBoard(log_dir='./tf_log')
# To start tensorboard
# tensorboard --logdir=/mnt/c/Users/olive/Development/ml/tf_log
# open http://localhost:6006

In [24]:
early_stopping_callback = keras.callbacks.EarlyStopping(monitor='val_loss', patience=50, verbose=1)

In [25]:
checkpoint_callback = keras.callbacks.ModelCheckpoint('./model-checkpoints/weights.epoch-{epoch:02d}-val_loss-{val_loss:.2f}.hdf5');
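
Checkpoints written this way can be reloaded later; a sketch with a hypothetical filename following the pattern above:

from keras.models import load_model
model = load_model('./model-checkpoints/weights.epoch-42-val_loss-0.25.hdf5')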

In [26]:
keras.layers.Dropout?

In [27]:
# full architecture with dropout
# %load https://djcordhose.github.io/ai/fragments/vgg_style_dropout.py

# my sample minimized architecture
# %load https://djcordhose.github.io/ai/fragments/vgg_style_dropout_minmal.py
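
As above, a sketch that reproduces the summary below, with Dropout after the first convolutional block and after the fully connected layer; the activations are again an assumption:

from keras.layers import Input, Conv2D, MaxPooling2D, Dropout, Flatten, Dense
from keras.models import Model

inputs = Input(shape=(64, 64, 3))
x = Conv2D(64, (3, 3), activation='relu', padding='same')(inputs)
x = Conv2D(64, (3, 3), activation='relu', padding='same')(x)
x = MaxPooling2D(pool_size=(2, 2))(x)
x = Dropout(0.5)(x)
x = Conv2D(128, (3, 3), activation='relu', padding='same')(x)
x = MaxPooling2D(pool_size=(2, 2))(x)
x = MaxPooling2D(pool_size=(2, 2))(x)
x = Flatten()(x)
x = Dense(256, activation='relu')(x)
x = Dropout(0.5)(x)
predictions = Dense(6, activation='softmax')(x)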

In [28]:
model = Model(inputs=inputs, outputs=predictions)
model.summary()


_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_2 (InputLayer)         (None, 64, 64, 3)         0         
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 64, 64, 64)        1792      
_________________________________________________________________
conv2d_3 (Conv2D)            (None, 64, 64, 64)        36928     
_________________________________________________________________
max_pooling2d_4 (MaxPooling2 (None, 32, 32, 64)        0         
_________________________________________________________________
dropout_1 (Dropout)          (None, 32, 32, 64)        0         
_________________________________________________________________
conv2d_4 (Conv2D)            (None, 32, 32, 128)       73856     
_________________________________________________________________
max_pooling2d_5 (MaxPooling2 (None, 16, 16, 128)       0         
_________________________________________________________________
max_pooling2d_6 (MaxPooling2 (None, 8, 8, 128)         0         
_________________________________________________________________
flatten_2 (Flatten)          (None, 8192)              0         
_________________________________________________________________
dense_3 (Dense)              (None, 256)               2097408   
_________________________________________________________________
dropout_2 (Dropout)          (None, 256)               0         
_________________________________________________________________
dense_4 (Dense)              (None, 6)                 1542      
=================================================================
Total params: 2,211,526
Trainable params: 2,211,526
Non-trainable params: 0
_________________________________________________________________

In [29]:
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

In [30]:
!rm -r tf_log

In [31]:
%time model.fit(X_train, y_train, epochs=500, batch_size=BATCH_SIZE, validation_split=0.2, callbacks=[tb_callback, early_stopping_callback])
# %time model.fit(X_train, y_train, epochs=500, batch_size=BATCH_SIZE, validation_split=0.2, callbacks=[tb_callback])
# %time model.fit(X_train, y_train, epochs=500, batch_size=BATCH_SIZE, validation_split=0.2)


Train on 242 samples, validate on 61 samples
Epoch 1/500
242/242 [==============================] - 2s - loss: 1.7566 - acc: 0.2314 - val_loss: 1.8777 - val_acc: 0.0984
Epoch 2/500
242/242 [==============================] - 0s - loss: 2.7072 - acc: 0.2273 - val_loss: 1.6280 - val_acc: 0.2787
Epoch 3/500
242/242 [==============================] - 0s - loss: 2.4442 - acc: 0.2025 - val_loss: 1.7096 - val_acc: 0.3279
Epoch 4/500
242/242 [==============================] - 0s - loss: 1.9441 - acc: 0.1818 - val_loss: 1.7659 - val_acc: 0.2787
Epoch 5/500
242/242 [==============================] - 0s - loss: 1.8391 - acc: 0.1860 - val_loss: 1.7847 - val_acc: 0.2787
Epoch 6/500
242/242 [==============================] - 0s - loss: 1.8113 - acc: 0.1653 - val_loss: 1.7901 - val_acc: 0.2787
Epoch 7/500
242/242 [==============================] - 0s - loss: 1.7910 - acc: 0.1653 - val_loss: 1.7914 - val_acc: 0.2623
Epoch 8/500
242/242 [==============================] - 0s - loss: 1.7854 - acc: 0.2107 - val_loss: 1.7914 - val_acc: 0.1639
Epoch 9/500
242/242 [==============================] - 0s - loss: 1.7767 - acc: 0.2149 - val_loss: 1.7909 - val_acc: 0.1475
Epoch 10/500
242/242 [==============================] - 0s - loss: 1.7642 - acc: 0.2149 - val_loss: 1.7890 - val_acc: 0.1148
Epoch 11/500
242/242 [==============================] - 0s - loss: 1.7502 - acc: 0.2231 - val_loss: 1.7855 - val_acc: 0.0984
Epoch 12/500
242/242 [==============================] - 0s - loss: 1.7232 - acc: 0.2397 - val_loss: 1.7802 - val_acc: 0.0984
Epoch 13/500
242/242 [==============================] - 0s - loss: 1.6977 - acc: 0.2273 - val_loss: 1.7719 - val_acc: 0.0984
Epoch 14/500
242/242 [==============================] - 0s - loss: 1.7056 - acc: 0.2355 - val_loss: 1.7635 - val_acc: 0.0984
Epoch 15/500
242/242 [==============================] - 0s - loss: 1.6974 - acc: 0.2479 - val_loss: 1.7592 - val_acc: 0.0984
Epoch 16/500
242/242 [==============================] - 0s - loss: 1.6835 - acc: 0.2190 - val_loss: 1.7603 - val_acc: 0.1311
Epoch 17/500
242/242 [==============================] - 0s - loss: 1.6781 - acc: 0.2066 - val_loss: 1.7632 - val_acc: 0.1803
Epoch 18/500
242/242 [==============================] - 0s - loss: 1.6759 - acc: 0.2149 - val_loss: 1.7600 - val_acc: 0.3443
Epoch 19/500
242/242 [==============================] - 0s - loss: 1.6807 - acc: 0.2273 - val_loss: 1.7533 - val_acc: 0.3443
Epoch 20/500
242/242 [==============================] - 0s - loss: 1.6634 - acc: 0.2107 - val_loss: 1.7457 - val_acc: 0.3443
Epoch 21/500
242/242 [==============================] - 0s - loss: 1.6819 - acc: 0.2397 - val_loss: 1.7435 - val_acc: 0.3607
Epoch 22/500
242/242 [==============================] - 0s - loss: 1.6492 - acc: 0.2107 - val_loss: 1.7429 - val_acc: 0.3607
Epoch 23/500
242/242 [==============================] - 0s - loss: 1.6418 - acc: 0.2810 - val_loss: 1.7445 - val_acc: 0.3607
Epoch 24/500
242/242 [==============================] - 0s - loss: 1.6312 - acc: 0.2686 - val_loss: 1.7429 - val_acc: 0.2787
Epoch 25/500
242/242 [==============================] - 0s - loss: 1.6400 - acc: 0.2562 - val_loss: 1.7452 - val_acc: 0.2295
Epoch 26/500
242/242 [==============================] - 0s - loss: 1.6594 - acc: 0.2851 - val_loss: 1.7550 - val_acc: 0.2295
Epoch 27/500
242/242 [==============================] - 0s - loss: 1.6277 - acc: 0.3264 - val_loss: 1.7517 - val_acc: 0.2131
Epoch 28/500
242/242 [==============================] - 0s - loss: 1.6330 - acc: 0.2893 - val_loss: 1.7457 - val_acc: 0.2131
Epoch 29/500
242/242 [==============================] - 0s - loss: 1.6367 - acc: 0.2851 - val_loss: 1.7480 - val_acc: 0.2131
Epoch 30/500
242/242 [==============================] - 0s - loss: 1.6393 - acc: 0.2934 - val_loss: 1.7503 - val_acc: 0.2131
Epoch 31/500
242/242 [==============================] - 0s - loss: 1.6406 - acc: 0.3140 - val_loss: 1.7497 - val_acc: 0.2131
Epoch 32/500
242/242 [==============================] - 0s - loss: 1.6288 - acc: 0.3182 - val_loss: 1.7415 - val_acc: 0.2295
Epoch 33/500
242/242 [==============================] - 0s - loss: 1.6390 - acc: 0.3099 - val_loss: 1.7334 - val_acc: 0.2295
Epoch 34/500
242/242 [==============================] - 0s - loss: 1.6321 - acc: 0.3099 - val_loss: 1.7341 - val_acc: 0.2459
Epoch 35/500
242/242 [==============================] - 0s - loss: 1.6085 - acc: 0.2975 - val_loss: 1.7388 - val_acc: 0.2295
Epoch 36/500
242/242 [==============================] - 0s - loss: 1.6185 - acc: 0.3430 - val_loss: 1.7392 - val_acc: 0.2459
Epoch 37/500
242/242 [==============================] - 0s - loss: 1.6108 - acc: 0.3099 - val_loss: 1.7252 - val_acc: 0.2295
Epoch 38/500
242/242 [==============================] - 0s - loss: 1.6042 - acc: 0.3430 - val_loss: 1.7108 - val_acc: 0.2295
Epoch 39/500
242/242 [==============================] - 0s - loss: 1.6156 - acc: 0.3182 - val_loss: 1.7274 - val_acc: 0.2787
Epoch 40/500
242/242 [==============================] - 0s - loss: 1.6149 - acc: 0.3430 - val_loss: 1.7457 - val_acc: 0.2787
Epoch 41/500
242/242 [==============================] - 0s - loss: 1.6038 - acc: 0.3099 - val_loss: 1.7288 - val_acc: 0.2459
Epoch 42/500
242/242 [==============================] - 0s - loss: 1.6147 - acc: 0.3017 - val_loss: 1.7127 - val_acc: 0.2787
Epoch 43/500
242/242 [==============================] - 0s - loss: 1.6009 - acc: 0.3017 - val_loss: 1.7138 - val_acc: 0.2787
Epoch 44/500
242/242 [==============================] - 0s - loss: 1.5985 - acc: 0.3471 - val_loss: 1.7257 - val_acc: 0.2951
Epoch 45/500
242/242 [==============================] - 0s - loss: 1.6010 - acc: 0.3306 - val_loss: 1.7188 - val_acc: 0.3443
Epoch 46/500
242/242 [==============================] - 0s - loss: 1.5581 - acc: 0.3843 - val_loss: 1.6839 - val_acc: 0.3607
Epoch 47/500
242/242 [==============================] - 0s - loss: 1.5605 - acc: 0.3554 - val_loss: 1.6786 - val_acc: 0.3607
Epoch 48/500
242/242 [==============================] - 0s - loss: 1.5663 - acc: 0.3223 - val_loss: 1.7108 - val_acc: 0.3443
Epoch 49/500
242/242 [==============================] - 0s - loss: 1.5667 - acc: 0.3430 - val_loss: 1.7187 - val_acc: 0.2459
Epoch 50/500
242/242 [==============================] - 0s - loss: 1.5485 - acc: 0.3140 - val_loss: 1.6563 - val_acc: 0.3443
Epoch 51/500
242/242 [==============================] - 0s - loss: 1.5233 - acc: 0.3843 - val_loss: 1.6247 - val_acc: 0.3279
Epoch 52/500
242/242 [==============================] - 0s - loss: 1.5234 - acc: 0.4050 - val_loss: 1.6264 - val_acc: 0.3443
Epoch 53/500
242/242 [==============================] - 0s - loss: 1.5170 - acc: 0.3719 - val_loss: 1.6682 - val_acc: 0.2951
Epoch 54/500
242/242 [==============================] - 0s - loss: 1.5385 - acc: 0.3099 - val_loss: 1.6598 - val_acc: 0.3443
Epoch 55/500
242/242 [==============================] - 0s - loss: 1.5138 - acc: 0.3388 - val_loss: 1.6087 - val_acc: 0.3770
Epoch 56/500
242/242 [==============================] - 0s - loss: 1.4530 - acc: 0.3926 - val_loss: 1.5906 - val_acc: 0.3443
Epoch 57/500
242/242 [==============================] - 0s - loss: 1.4678 - acc: 0.3678 - val_loss: 1.5907 - val_acc: 0.3770
Epoch 58/500
242/242 [==============================] - 0s - loss: 1.4597 - acc: 0.3678 - val_loss: 1.5975 - val_acc: 0.4098
Epoch 59/500
242/242 [==============================] - 0s - loss: 1.4358 - acc: 0.3884 - val_loss: 1.6068 - val_acc: 0.4098
Epoch 60/500
242/242 [==============================] - 0s - loss: 1.4602 - acc: 0.3430 - val_loss: 1.5869 - val_acc: 0.3770
Epoch 61/500
242/242 [==============================] - 0s - loss: 1.4566 - acc: 0.3678 - val_loss: 1.5701 - val_acc: 0.3279
Epoch 62/500
242/242 [==============================] - 0s - loss: 1.4046 - acc: 0.4256 - val_loss: 1.5549 - val_acc: 0.3115
Epoch 63/500
242/242 [==============================] - 0s - loss: 1.4028 - acc: 0.4298 - val_loss: 1.5363 - val_acc: 0.3115
Epoch 64/500
242/242 [==============================] - 0s - loss: 1.3551 - acc: 0.4380 - val_loss: 1.5243 - val_acc: 0.3934
Epoch 65/500
242/242 [==============================] - 0s - loss: 1.3744 - acc: 0.4504 - val_loss: 1.5136 - val_acc: 0.4098
Epoch 66/500
242/242 [==============================] - 0s - loss: 1.3866 - acc: 0.4298 - val_loss: 1.5092 - val_acc: 0.4098
Epoch 67/500
242/242 [==============================] - 0s - loss: 1.3544 - acc: 0.4463 - val_loss: 1.5112 - val_acc: 0.3607
Epoch 68/500
242/242 [==============================] - 0s - loss: 1.3426 - acc: 0.4587 - val_loss: 1.5094 - val_acc: 0.4262
Epoch 69/500
242/242 [==============================] - 0s - loss: 1.3628 - acc: 0.4380 - val_loss: 1.5145 - val_acc: 0.4098
Epoch 70/500
242/242 [==============================] - 0s - loss: 1.3330 - acc: 0.4711 - val_loss: 1.5027 - val_acc: 0.3934
Epoch 71/500
242/242 [==============================] - 0s - loss: 1.3339 - acc: 0.4050 - val_loss: 1.4998 - val_acc: 0.4262
Epoch 72/500
242/242 [==============================] - 0s - loss: 1.2524 - acc: 0.5000 - val_loss: 1.4846 - val_acc: 0.3934
Epoch 73/500
242/242 [==============================] - 0s - loss: 1.2782 - acc: 0.4876 - val_loss: 1.4499 - val_acc: 0.4426
Epoch 74/500
242/242 [==============================] - 0s - loss: 1.3100 - acc: 0.4463 - val_loss: 1.4085 - val_acc: 0.4426
Epoch 75/500
242/242 [==============================] - 0s - loss: 1.3108 - acc: 0.4504 - val_loss: 1.4131 - val_acc: 0.5246
Epoch 76/500
242/242 [==============================] - 0s - loss: 1.2606 - acc: 0.4628 - val_loss: 1.4386 - val_acc: 0.5082
Epoch 77/500
242/242 [==============================] - 0s - loss: 1.2257 - acc: 0.5000 - val_loss: 1.4440 - val_acc: 0.5082
Epoch 78/500
242/242 [==============================] - 0s - loss: 1.2195 - acc: 0.5331 - val_loss: 1.4235 - val_acc: 0.5246
Epoch 79/500
242/242 [==============================] - 0s - loss: 1.1640 - acc: 0.5496 - val_loss: 1.4040 - val_acc: 0.5246
Epoch 80/500
242/242 [==============================] - 0s - loss: 1.1804 - acc: 0.5207 - val_loss: 1.3818 - val_acc: 0.5738
Epoch 81/500
242/242 [==============================] - 0s - loss: 1.1353 - acc: 0.5289 - val_loss: 1.3764 - val_acc: 0.5410
Epoch 82/500
242/242 [==============================] - 0s - loss: 1.1352 - acc: 0.5661 - val_loss: 1.3490 - val_acc: 0.5246
Epoch 83/500
242/242 [==============================] - 0s - loss: 1.1571 - acc: 0.5372 - val_loss: 1.3140 - val_acc: 0.6066
Epoch 84/500
242/242 [==============================] - 0s - loss: 1.0645 - acc: 0.6116 - val_loss: 1.3042 - val_acc: 0.6393
Epoch 85/500
242/242 [==============================] - 0s - loss: 1.0690 - acc: 0.5909 - val_loss: 1.2975 - val_acc: 0.6393
Epoch 86/500
242/242 [==============================] - 0s - loss: 1.0527 - acc: 0.5744 - val_loss: 1.2993 - val_acc: 0.5902
Epoch 87/500
242/242 [==============================] - 0s - loss: 1.0257 - acc: 0.6281 - val_loss: 1.3284 - val_acc: 0.5738
Epoch 88/500
242/242 [==============================] - 0s - loss: 1.0246 - acc: 0.6322 - val_loss: 1.2865 - val_acc: 0.6557
Epoch 89/500
242/242 [==============================] - 0s - loss: 1.0130 - acc: 0.6364 - val_loss: 1.2661 - val_acc: 0.6393
Epoch 90/500
242/242 [==============================] - 0s - loss: 0.9997 - acc: 0.5661 - val_loss: 1.2479 - val_acc: 0.6393
Epoch 91/500
242/242 [==============================] - 0s - loss: 1.0042 - acc: 0.6322 - val_loss: 1.1993 - val_acc: 0.6393
Epoch 92/500
242/242 [==============================] - 0s - loss: 0.9258 - acc: 0.6653 - val_loss: 1.1848 - val_acc: 0.6066
Epoch 93/500
242/242 [==============================] - 0s - loss: 0.9043 - acc: 0.6570 - val_loss: 1.1678 - val_acc: 0.6393
Epoch 94/500
242/242 [==============================] - 0s - loss: 0.8662 - acc: 0.7066 - val_loss: 1.1536 - val_acc: 0.6557
Epoch 95/500
242/242 [==============================] - 0s - loss: 0.8781 - acc: 0.6818 - val_loss: 1.1195 - val_acc: 0.6393
Epoch 96/500
242/242 [==============================] - 0s - loss: 0.7857 - acc: 0.7479 - val_loss: 1.0520 - val_acc: 0.6393
Epoch 97/500
242/242 [==============================] - 0s - loss: 0.8084 - acc: 0.7231 - val_loss: 1.0739 - val_acc: 0.6393
Epoch 98/500
242/242 [==============================] - 0s - loss: 0.8175 - acc: 0.6860 - val_loss: 1.0050 - val_acc: 0.7213
Epoch 99/500
242/242 [==============================] - 0s - loss: 0.7582 - acc: 0.7397 - val_loss: 1.0143 - val_acc: 0.7213
Epoch 100/500
242/242 [==============================] - 0s - loss: 0.7223 - acc: 0.7769 - val_loss: 1.0211 - val_acc: 0.7377
Epoch 101/500
242/242 [==============================] - 0s - loss: 0.7147 - acc: 0.7645 - val_loss: 0.9477 - val_acc: 0.7541
Epoch 102/500
242/242 [==============================] - 0s - loss: 0.6949 - acc: 0.7603 - val_loss: 0.9377 - val_acc: 0.7377
Epoch 103/500
242/242 [==============================] - 0s - loss: 0.6768 - acc: 0.7190 - val_loss: 0.9064 - val_acc: 0.7213
Epoch 104/500
242/242 [==============================] - 0s - loss: 0.6937 - acc: 0.7562 - val_loss: 0.9111 - val_acc: 0.7049
Epoch 105/500
242/242 [==============================] - 0s - loss: 0.5907 - acc: 0.7934 - val_loss: 0.9344 - val_acc: 0.7541
Epoch 106/500
242/242 [==============================] - 0s - loss: 0.5703 - acc: 0.8140 - val_loss: 0.8922 - val_acc: 0.7705
Epoch 107/500
242/242 [==============================] - 0s - loss: 0.6179 - acc: 0.7893 - val_loss: 0.8952 - val_acc: 0.7705
Epoch 108/500
242/242 [==============================] - 0s - loss: 0.5622 - acc: 0.8099 - val_loss: 0.8486 - val_acc: 0.8033
Epoch 109/500
242/242 [==============================] - 0s - loss: 0.5089 - acc: 0.8388 - val_loss: 0.7936 - val_acc: 0.8033
Epoch 110/500
242/242 [==============================] - 0s - loss: 0.5145 - acc: 0.8430 - val_loss: 0.8017 - val_acc: 0.7705
Epoch 111/500
242/242 [==============================] - 0s - loss: 0.4904 - acc: 0.8306 - val_loss: 0.7665 - val_acc: 0.7705
Epoch 112/500
242/242 [==============================] - 0s - loss: 0.4895 - acc: 0.8388 - val_loss: 0.7746 - val_acc: 0.7869
Epoch 113/500
242/242 [==============================] - 0s - loss: 0.4758 - acc: 0.8430 - val_loss: 0.7903 - val_acc: 0.7869
Epoch 114/500
242/242 [==============================] - 0s - loss: 0.4756 - acc: 0.8306 - val_loss: 0.7138 - val_acc: 0.8525
Epoch 115/500
242/242 [==============================] - 0s - loss: 0.4269 - acc: 0.8678 - val_loss: 0.6764 - val_acc: 0.8689
Epoch 116/500
242/242 [==============================] - 0s - loss: 0.4917 - acc: 0.8595 - val_loss: 0.6653 - val_acc: 0.8525
Epoch 117/500
242/242 [==============================] - 0s - loss: 0.4167 - acc: 0.8678 - val_loss: 0.6916 - val_acc: 0.8033
Epoch 118/500
242/242 [==============================] - 0s - loss: 0.4360 - acc: 0.8843 - val_loss: 0.6693 - val_acc: 0.8689
Epoch 119/500
242/242 [==============================] - 0s - loss: 0.4204 - acc: 0.8678 - val_loss: 0.6634 - val_acc: 0.8361
Epoch 120/500
242/242 [==============================] - 0s - loss: 0.3601 - acc: 0.8678 - val_loss: 0.6416 - val_acc: 0.8361
Epoch 121/500
242/242 [==============================] - 0s - loss: 0.3630 - acc: 0.8926 - val_loss: 0.6578 - val_acc: 0.8197
Epoch 122/500
242/242 [==============================] - 0s - loss: 0.3320 - acc: 0.8719 - val_loss: 0.6398 - val_acc: 0.8033
Epoch 123/500
242/242 [==============================] - 0s - loss: 0.4339 - acc: 0.8678 - val_loss: 0.5674 - val_acc: 0.8689
Epoch 124/500
242/242 [==============================] - 0s - loss: 0.3252 - acc: 0.9008 - val_loss: 0.6097 - val_acc: 0.8361
Epoch 125/500
242/242 [==============================] - 0s - loss: 0.3628 - acc: 0.8802 - val_loss: 0.5793 - val_acc: 0.8852
Epoch 126/500
242/242 [==============================] - 0s - loss: 0.3632 - acc: 0.8719 - val_loss: 0.5700 - val_acc: 0.8852
Epoch 127/500
242/242 [==============================] - 0s - loss: 0.3227 - acc: 0.9050 - val_loss: 0.5709 - val_acc: 0.8852
Epoch 128/500
242/242 [==============================] - 0s - loss: 0.3382 - acc: 0.8802 - val_loss: 0.5372 - val_acc: 0.8852
Epoch 129/500
242/242 [==============================] - 0s - loss: 0.3181 - acc: 0.8843 - val_loss: 0.5661 - val_acc: 0.8689
Epoch 130/500
242/242 [==============================] - 0s - loss: 0.3578 - acc: 0.8884 - val_loss: 0.5327 - val_acc: 0.8852
Epoch 131/500
242/242 [==============================] - 0s - loss: 0.3070 - acc: 0.9008 - val_loss: 0.5437 - val_acc: 0.8689
Epoch 132/500
242/242 [==============================] - 0s - loss: 0.3111 - acc: 0.9256 - val_loss: 0.6165 - val_acc: 0.8689
Epoch 133/500
242/242 [==============================] - 0s - loss: 0.3036 - acc: 0.9008 - val_loss: 0.6290 - val_acc: 0.8525
Epoch 134/500
242/242 [==============================] - 0s - loss: 0.2760 - acc: 0.9215 - val_loss: 0.5637 - val_acc: 0.8689
Epoch 135/500
242/242 [==============================] - 0s - loss: 0.3009 - acc: 0.9132 - val_loss: 0.4879 - val_acc: 0.8689
Epoch 136/500
242/242 [==============================] - 0s - loss: 0.2640 - acc: 0.9132 - val_loss: 0.4396 - val_acc: 0.8852
Epoch 137/500
242/242 [==============================] - 0s - loss: 0.2461 - acc: 0.9091 - val_loss: 0.4379 - val_acc: 0.8852
Epoch 138/500
242/242 [==============================] - 0s - loss: 0.1995 - acc: 0.9339 - val_loss: 0.4471 - val_acc: 0.8852
Epoch 139/500
242/242 [==============================] - 0s - loss: 0.2530 - acc: 0.9339 - val_loss: 0.4645 - val_acc: 0.8852
Epoch 140/500
242/242 [==============================] - 0s - loss: 0.2966 - acc: 0.9008 - val_loss: 0.4724 - val_acc: 0.8852
Epoch 141/500
242/242 [==============================] - 0s - loss: 0.2397 - acc: 0.9256 - val_loss: 0.4833 - val_acc: 0.8525
Epoch 142/500
242/242 [==============================] - 0s - loss: 0.1868 - acc: 0.9545 - val_loss: 0.5012 - val_acc: 0.8361
Epoch 143/500
242/242 [==============================] - 0s - loss: 0.2239 - acc: 0.9256 - val_loss: 0.4803 - val_acc: 0.8525
Epoch 144/500
242/242 [==============================] - 0s - loss: 0.1909 - acc: 0.9256 - val_loss: 0.4508 - val_acc: 0.9016
Epoch 145/500
242/242 [==============================] - 0s - loss: 0.1818 - acc: 0.9421 - val_loss: 0.4464 - val_acc: 0.8689
Epoch 146/500
242/242 [==============================] - 0s - loss: 0.1723 - acc: 0.9504 - val_loss: 0.4403 - val_acc: 0.8852
Epoch 147/500
242/242 [==============================] - 0s - loss: 0.1854 - acc: 0.9463 - val_loss: 0.4426 - val_acc: 0.8852
Epoch 148/500
242/242 [==============================] - 0s - loss: 0.1794 - acc: 0.9421 - val_loss: 0.4483 - val_acc: 0.8689
Epoch 149/500
242/242 [==============================] - 0s - loss: 0.2283 - acc: 0.9132 - val_loss: 0.4226 - val_acc: 0.8852
Epoch 150/500
242/242 [==============================] - 0s - loss: 0.1324 - acc: 0.9752 - val_loss: 0.3987 - val_acc: 0.8852
Epoch 151/500
242/242 [==============================] - 0s - loss: 0.1704 - acc: 0.9504 - val_loss: 0.3778 - val_acc: 0.8852
Epoch 152/500
242/242 [==============================] - 0s - loss: 0.1611 - acc: 0.9628 - val_loss: 0.3686 - val_acc: 0.8689
Epoch 153/500
242/242 [==============================] - 0s - loss: 0.2229 - acc: 0.9050 - val_loss: 0.4062 - val_acc: 0.8852
Epoch 154/500
242/242 [==============================] - 0s - loss: 0.1631 - acc: 0.9463 - val_loss: 0.4320 - val_acc: 0.8852
Epoch 155/500
242/242 [==============================] - 0s - loss: 0.1644 - acc: 0.9339 - val_loss: 0.4232 - val_acc: 0.8852
Epoch 156/500
242/242 [==============================] - 0s - loss: 0.1582 - acc: 0.9421 - val_loss: 0.4180 - val_acc: 0.8852
Epoch 157/500
242/242 [==============================] - 0s - loss: 0.1593 - acc: 0.9545 - val_loss: 0.3903 - val_acc: 0.8852
Epoch 158/500
242/242 [==============================] - 0s - loss: 0.2100 - acc: 0.9298 - val_loss: 0.3901 - val_acc: 0.8689
Epoch 159/500
242/242 [==============================] - 0s - loss: 0.1621 - acc: 0.9545 - val_loss: 0.3946 - val_acc: 0.8525
Epoch 160/500
242/242 [==============================] - 0s - loss: 0.1500 - acc: 0.9587 - val_loss: 0.3892 - val_acc: 0.8689
Epoch 161/500
242/242 [==============================] - 0s - loss: 0.1529 - acc: 0.9504 - val_loss: 0.3613 - val_acc: 0.9016
Epoch 162/500
242/242 [==============================] - 0s - loss: 0.1543 - acc: 0.9421 - val_loss: 0.3530 - val_acc: 0.9016
Epoch 163/500
242/242 [==============================] - 0s - loss: 0.1444 - acc: 0.9545 - val_loss: 0.3575 - val_acc: 0.9016
Epoch 164/500
242/242 [==============================] - 0s - loss: 0.1684 - acc: 0.9587 - val_loss: 0.3500 - val_acc: 0.9016
Epoch 165/500
242/242 [==============================] - 0s - loss: 0.1159 - acc: 0.9669 - val_loss: 0.3681 - val_acc: 0.8689
Epoch 166/500
242/242 [==============================] - 0s - loss: 0.0952 - acc: 0.9669 - val_loss: 0.3968 - val_acc: 0.8689
Epoch 167/500
242/242 [==============================] - 0s - loss: 0.1331 - acc: 0.9463 - val_loss: 0.4206 - val_acc: 0.8852
Epoch 168/500
242/242 [==============================] - 0s - loss: 0.1660 - acc: 0.9339 - val_loss: 0.4094 - val_acc: 0.8852
Epoch 169/500
242/242 [==============================] - 0s - loss: 0.1380 - acc: 0.9380 - val_loss: 0.3893 - val_acc: 0.8852
Epoch 170/500
242/242 [==============================] - 0s - loss: 0.1281 - acc: 0.9545 - val_loss: 0.3574 - val_acc: 0.8689
Epoch 171/500
242/242 [==============================] - 0s - loss: 0.1773 - acc: 0.9298 - val_loss: 0.3414 - val_acc: 0.9016
Epoch 172/500
242/242 [==============================] - 0s - loss: 0.0905 - acc: 0.9752 - val_loss: 0.3399 - val_acc: 0.9016
Epoch 173/500
242/242 [==============================] - 0s - loss: 0.1139 - acc: 0.9711 - val_loss: 0.3331 - val_acc: 0.9016
Epoch 174/500
242/242 [==============================] - 0s - loss: 0.1651 - acc: 0.9628 - val_loss: 0.3367 - val_acc: 0.9016
Epoch 175/500
242/242 [==============================] - 0s - loss: 0.1062 - acc: 0.9587 - val_loss: 0.3510 - val_acc: 0.8852
Epoch 176/500
242/242 [==============================] - 0s - loss: 0.1065 - acc: 0.9628 - val_loss: 0.3733 - val_acc: 0.9016
Epoch 177/500
242/242 [==============================] - 0s - loss: 0.0936 - acc: 0.9752 - val_loss: 0.3771 - val_acc: 0.8852
Epoch 178/500
242/242 [==============================] - 0s - loss: 0.1292 - acc: 0.9587 - val_loss: 0.3424 - val_acc: 0.9016
Epoch 179/500
242/242 [==============================] - 0s - loss: 0.1031 - acc: 0.9711 - val_loss: 0.3121 - val_acc: 0.9344
Epoch 180/500
242/242 [==============================] - 0s - loss: 0.0960 - acc: 0.9752 - val_loss: 0.3128 - val_acc: 0.9016
Epoch 181/500
242/242 [==============================] - 0s - loss: 0.1013 - acc: 0.9752 - val_loss: 0.3197 - val_acc: 0.9016
Epoch 182/500
242/242 [==============================] - 0s - loss: 0.1176 - acc: 0.9545 - val_loss: 0.3118 - val_acc: 0.9016
Epoch 183/500
242/242 [==============================] - 0s - loss: 0.0892 - acc: 0.9876 - val_loss: 0.3128 - val_acc: 0.9016
Epoch 184/500
242/242 [==============================] - 0s - loss: 0.0898 - acc: 0.9876 - val_loss: 0.3386 - val_acc: 0.8689
Epoch 185/500
242/242 [==============================] - 0s - loss: 0.1148 - acc: 0.9669 - val_loss: 0.3993 - val_acc: 0.8525
Epoch 186/500
242/242 [==============================] - 0s - loss: 0.0820 - acc: 0.9711 - val_loss: 0.4561 - val_acc: 0.8689
Epoch 187/500
242/242 [==============================] - 0s - loss: 0.0545 - acc: 0.9876 - val_loss: 0.4999 - val_acc: 0.8689
Epoch 188/500
242/242 [==============================] - 0s - loss: 0.0939 - acc: 0.9628 - val_loss: 0.4968 - val_acc: 0.8689
Epoch 189/500
242/242 [==============================] - 0s - loss: 0.0995 - acc: 0.9628 - val_loss: 0.4465 - val_acc: 0.8689
Epoch 190/500
242/242 [==============================] - 0s - loss: 0.0711 - acc: 0.9835 - val_loss: 0.3783 - val_acc: 0.8689
Epoch 191/500
242/242 [==============================] - 0s - loss: 0.1010 - acc: 0.9752 - val_loss: 0.3380 - val_acc: 0.8852
Epoch 192/500
242/242 [==============================] - 0s - loss: 0.1172 - acc: 0.9628 - val_loss: 0.3307 - val_acc: 0.9016
Epoch 193/500
242/242 [==============================] - 0s - loss: 0.0860 - acc: 0.9752 - val_loss: 0.3135 - val_acc: 0.9016
Epoch 194/500
242/242 [==============================] - 0s - loss: 0.0745 - acc: 0.9711 - val_loss: 0.3093 - val_acc: 0.9016
Epoch 195/500
242/242 [==============================] - 0s - loss: 0.0714 - acc: 0.9835 - val_loss: 0.2969 - val_acc: 0.9016
Epoch 196/500
242/242 [==============================] - 0s - loss: 0.0743 - acc: 0.9752 - val_loss: 0.2871 - val_acc: 0.9016
Epoch 197/500
242/242 [==============================] - 0s - loss: 0.0661 - acc: 0.9752 - val_loss: 0.2832 - val_acc: 0.8689
Epoch 198/500
242/242 [==============================] - 0s - loss: 0.0863 - acc: 0.9669 - val_loss: 0.2641 - val_acc: 0.9344
Epoch 199/500
242/242 [==============================] - 0s - loss: 0.0902 - acc: 0.9793 - val_loss: 0.2629 - val_acc: 0.9180
Epoch 200/500
242/242 [==============================] - 0s - loss: 0.0875 - acc: 0.9711 - val_loss: 0.2774 - val_acc: 0.9344
Epoch 201/500
242/242 [==============================] - 0s - loss: 0.0739 - acc: 0.9711 - val_loss: 0.2884 - val_acc: 0.9180
Epoch 202/500
242/242 [==============================] - 0s - loss: 0.0787 - acc: 0.9711 - val_loss: 0.2913 - val_acc: 0.8852
Epoch 203/500
242/242 [==============================] - 0s - loss: 0.0452 - acc: 0.9959 - val_loss: 0.3097 - val_acc: 0.8852
Epoch 204/500
242/242 [==============================] - 0s - loss: 0.0722 - acc: 0.9876 - val_loss: 0.3201 - val_acc: 0.8852
Epoch 205/500
242/242 [==============================] - 0s - loss: 0.0882 - acc: 0.9711 - val_loss: 0.3043 - val_acc: 0.9016
Epoch 206/500
242/242 [==============================] - 0s - loss: 0.0934 - acc: 0.9669 - val_loss: 0.2896 - val_acc: 0.9016
Epoch 207/500
242/242 [==============================] - 0s - loss: 0.0814 - acc: 0.9669 - val_loss: 0.2908 - val_acc: 0.9016
Epoch 208/500
242/242 [==============================] - 0s - loss: 0.0932 - acc: 0.9793 - val_loss: 0.2767 - val_acc: 0.9016
Epoch 209/500
242/242 [==============================] - 0s - loss: 0.0788 - acc: 0.9835 - val_loss: 0.2590 - val_acc: 0.9180
Epoch 210/500
242/242 [==============================] - 0s - loss: 0.0564 - acc: 0.9876 - val_loss: 0.2442 - val_acc: 0.9344
Epoch 211/500
242/242 [==============================] - 0s - loss: 0.0412 - acc: 0.9959 - val_loss: 0.2336 - val_acc: 0.9344
Epoch 212/500
242/242 [==============================] - 0s - loss: 0.0434 - acc: 0.9917 - val_loss: 0.2251 - val_acc: 0.9344
Epoch 213/500
242/242 [==============================] - 0s - loss: 0.0468 - acc: 0.9876 - val_loss: 0.2200 - val_acc: 0.9344
Epoch 214/500
242/242 [==============================] - 0s - loss: 0.0605 - acc: 0.9793 - val_loss: 0.2199 - val_acc: 0.9344
Epoch 215/500
242/242 [==============================] - 0s - loss: 0.0482 - acc: 0.9917 - val_loss: 0.2242 - val_acc: 0.9344
Epoch 216/500
242/242 [==============================] - 0s - loss: 0.0663 - acc: 0.9711 - val_loss: 0.2371 - val_acc: 0.9180
Epoch 217/500
242/242 [==============================] - 0s - loss: 0.0513 - acc: 0.9876 - val_loss: 0.2490 - val_acc: 0.9180
Epoch 218/500
242/242 [==============================] - 0s - loss: 0.0662 - acc: 0.9876 - val_loss: 0.2597 - val_acc: 0.9180
Epoch 219/500
242/242 [==============================] - 0s - loss: 0.0579 - acc: 0.9793 - val_loss: 0.2580 - val_acc: 0.9180
Epoch 220/500
242/242 [==============================] - 0s - loss: 0.0914 - acc: 0.9628 - val_loss: 0.2582 - val_acc: 0.9180
Epoch 221/500
242/242 [==============================] - 0s - loss: 0.0429 - acc: 0.9876 - val_loss: 0.2665 - val_acc: 0.9180
Epoch 222/500
242/242 [==============================] - 0s - loss: 0.0386 - acc: 0.9917 - val_loss: 0.2723 - val_acc: 0.9180
Epoch 223/500
242/242 [==============================] - 0s - loss: 0.0440 - acc: 0.9917 - val_loss: 0.2731 - val_acc: 0.9180
Epoch 224/500
242/242 [==============================] - 0s - loss: 0.0550 - acc: 0.9876 - val_loss: 0.2635 - val_acc: 0.9180
Epoch 225/500
242/242 [==============================] - 0s - loss: 0.0488 - acc: 0.9835 - val_loss: 0.2534 - val_acc: 0.9180
Epoch 226/500
242/242 [==============================] - 0s - loss: 0.0386 - acc: 0.9917 - val_loss: 0.2558 - val_acc: 0.9180
Epoch 227/500
242/242 [==============================] - 0s - loss: 0.0600 - acc: 0.9917 - val_loss: 0.2651 - val_acc: 0.9180
Epoch 228/500
242/242 [==============================] - 0s - loss: 0.0724 - acc: 0.9752 - val_loss: 0.2690 - val_acc: 0.9180
Epoch 229/500
242/242 [==============================] - 0s - loss: 0.0731 - acc: 0.9793 - val_loss: 0.2611 - val_acc: 0.9180
Epoch 230/500
242/242 [==============================] - 0s - loss: 0.0440 - acc: 0.9876 - val_loss: 0.2556 - val_acc: 0.9016
Epoch 231/500
242/242 [==============================] - 0s - loss: 0.0483 - acc: 0.9876 - val_loss: 0.2580 - val_acc: 0.9180
Epoch 232/500
242/242 [==============================] - 0s - loss: 0.0295 - acc: 0.9917 - val_loss: 0.2664 - val_acc: 0.9180
Epoch 233/500
242/242 [==============================] - 0s - loss: 0.0564 - acc: 0.9876 - val_loss: 0.2654 - val_acc: 0.9180
[... Epochs 234-394 elided: training loss settles into the 0.01-0.09 range with training accuracy frequently reaching 1.0, while validation accuracy only oscillates between 0.885 and 0.951 ...]
Epoch 395/500
242/242 [==============================] - 0s - loss: 0.0162 - acc: 0.9917 - val_loss: 0.3018 - val_acc: 0.9180
Epoch 396/500
242/242 [==============================] - 0s - loss: 0.0242 - acc: 0.9917 - val_loss: 0.3080 - val_acc: 0.9180
Epoch 00395: early stopping
CPU times: user 1min 46s, sys: 18.5 s, total: 2min 4s
Wall time: 2min 59s
Out[31]:
<keras.callbacks.History at 0x7ff39da36898>
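
The run stops itself at epoch 395 instead of completing all 500 epochs, which points to an EarlyStopping callback passed to fit. For reference, a minimal sketch of the kind of callback configuration that produces this behavior (the monitored metric, patience value, and validation_split here are assumptions, not taken from this notebook):

from keras.callbacks import EarlyStopping

# stop once the monitored metric has not improved for `patience` epochs
early_stopping = EarlyStopping(monitor='val_loss', patience=50, verbose=1)

# hypothetical fit call mirroring the run above
# model.fit(X_train, y_train, epochs=500, batch_size=BATCH_SIZE,
#           validation_split=0.2, callbacks=[early_stopping])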

In [36]:
train_loss, train_accuracy = model.evaluate(X_train, y_train, batch_size=BATCH_SIZE)
train_loss, train_accuracy


303/303 [==============================] - 0s
Out[36]:
(0.066526636481285095, 0.98349833488464355)
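
Note that evaluation runs over all 303 training images: the 242 samples used for fitting plus what appears to be a held-out validation set of 61 images (242 + 61 = 303; the val_acc steps above are consistent with multiples of 1/61).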

In [37]:
test_loss, test_accuracy = model.evaluate(X_test, y_test, batch_size=BATCH_SIZE)
test_loss, test_accuracy


76/76 [==============================] - 0s
Out[37]:
(0.45080634951591492, 0.89473682641983032)
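
The gap between training accuracy (~0.98) and test accuracy (~0.89) suggests the model overfits the small training set, which matches the plateauing validation accuracy observed during training.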

How the metrics might look when training the full model for 500 epochs

The small training set makes these curves a little hard to interpret; they might look different for a different random split.

Accuracy

Validation Accuracy
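
Such plots can be reproduced from the History object returned by fit above; a minimal sketch, assuming the fit call was bound to a variable named history (it is not in this notebook):

# history = model.fit(...)  # hypothetical binding of the training cell's result
plt.plot(history.history['acc'], label='acc')
plt.plot(history.history['val_acc'], label='val_acc')
plt.xlabel('epoch')
plt.ylabel('accuracy')
plt.legend()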


In [38]:
# model.save('conv-vgg.hdf5')
model.save('conv-simple.hdf5')
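
model.save writes the complete model (architecture, weights, and optimizer state) to a single HDF5 file, so it can later be restored without rebuilding the network; a minimal sketch:

from keras.models import load_model

restored_model = load_model('conv-simple.hdf5')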

In [39]:
!ls -lh


total 972M
-rw-rw-r-- 1 ubuntu ubuntu  44K Oct  1 08:04 440px-Beagle_Upsy.jpg
drwxrwxr-x 8 ubuntu ubuntu 4.0K Oct  1 08:10 augmented-signs
-rw-rw-r-- 1 ubuntu ubuntu  17M Oct  1 08:10 augmented-signs.zip
-rw-rw-r-- 1 ubuntu ubuntu 303K Sep 27 15:22 Black_New_York_stuy_town_squirrel_amanda_ernlund.jpeg
-rw-rw-r-- 1 ubuntu ubuntu 844K Oct  1 08:04 cat-bonkers.png
-rw-rw-r-- 1 ubuntu ubuntu 140K Oct  1 13:50 cnn-augmentation.ipynb
-rw-rw-r-- 1 ubuntu ubuntu 1.6M Oct  2 13:22 cnn-comparing-all-models.ipynb
-rw-rw-r-- 1 ubuntu ubuntu 136K Oct  2 09:42 cnn-imagenet-retrain.ipynb
-rw-rw-r-- 1 ubuntu ubuntu 158K Oct  2 08:41 cnn-intro.ipynb
-rw-rw-r-- 1 ubuntu ubuntu 107K Oct  2 09:41 cnn-standard-architectures.ipynb
-rw-rw-r-- 1 ubuntu ubuntu 199K Oct  2 07:51 cnn-train-augmented.ipynb
-rw-rw-r-- 1 ubuntu ubuntu 3.1M Oct  2 13:07 conv-minimal.hdf5
-rw-rw-r-- 1 ubuntu ubuntu  26M Oct  2 13:36 conv-simple.hdf5
-rw-rw-r-- 1 ubuntu ubuntu  55M Oct  1 15:13 conv-vgg-augmented.hdf5
-rw-rw-r-- 1 ubuntu ubuntu  55M Oct  1 14:54 conv-vgg.hdf5
-rw-rw-r-- 1 ubuntu ubuntu 6.3M Oct  1 14:34 conv-vgg-simple.hdf5
-rw-rw-r-- 1 ubuntu ubuntu 495K Sep 27 15:22 london.jpg
drwxrwxr-x 3 ubuntu ubuntu 4.0K Sep 27 15:25 __MACOSX
-rw-rw-r-- 1 ubuntu ubuntu 127K Sep 27 15:22 Michigan-MSU-raschka.jpg
-rw-rw-r-- 1 ubuntu ubuntu 520K Oct  2 09:45 ml-intro.ipynb
-rw-rw-r-- 1 ubuntu ubuntu 286K Oct  2 09:41 nn-intro.ipynb
-rw------- 1 ubuntu ubuntu 3.3M Oct  1 15:29 nohup.out
-rw-rw-r-- 1 ubuntu ubuntu   63 Sep 27 15:22 README.html
-rw-rw-r-- 1 ubuntu ubuntu 271M Oct  2 09:22 resnet-augmented.hdf5
-rw-rw-r-- 1 ubuntu ubuntu   36 Oct  1 08:04 sample_iris.json
drwxrwxr-x 8 ubuntu ubuntu 4.0K Sep 27 15:25 speed-limit-signs
-rw-rw-r-- 1 ubuntu ubuntu 169K Oct  2 13:33 speed-limit-signs.ipynb
-rw-rw-r-- 1 ubuntu ubuntu 1.8M Oct  1 08:09 speed-limit-signs.zip
drwxr-xr-x 2 ubuntu ubuntu 4.0K Oct  2 13:30 tf_log
-rw-rw-r-- 1 ubuntu ubuntu  88M Oct  1 10:22 vgg16-augmented-retrained-fine-tuned.hdf5
-rw-rw-r-- 1 ubuntu ubuntu  59M Oct  1 10:00 vgg16-augmented-retrained.hdf5
-rw-rw-r-- 1 ubuntu ubuntu  88M Oct  1 09:56 vgg16-retrained-fine-tuned.hdf5
-rw-rw-r-- 1 ubuntu ubuntu  59M Oct  1 09:26 vgg16-retrained.hdf5
-rw-rw-r-- 1 ubuntu ubuntu  15K Oct  2 09:40 workshop.ipynb
-rw-rw-r-- 1 ubuntu ubuntu 239M Oct  1 14:32 xception-augmented.hdf5

In [ ]:
# https://transfer.sh/
# Saved for 14 days
# !curl --upload-file conv-vgg.hdf5 https://transfer.sh
!curl --upload-file conv-simple.hdf5 https://transfer.sh

# pre-trained model
# acc: 0.98 - val_acc: 0.89
# https://transfer.sh/DuZA7/conv-simple.hdf5
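
# To pull the model back down onto another machine, the link above can be
# fetched with curl as well (a sketch):
# !curl -O https://transfer.sh/DuZA7/conv-simple.hdf5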

What images does it work well on?


In [45]:
import random

# Pick 10 random images from the test set
random.seed(42)  # make the sample deterministic
sample_indexes = random.sample(range(len(X_test)), 10)
sample_images = [X_test[i] for i in sample_indexes]
sample_labels = [y_test[i] for i in sample_indexes]

In [46]:
# labels are one-hot encoded, so argmax recovers the integer class index
ground_truth = np.argmax(sample_labels, axis=1)
ground_truth


Out[46]:
array([1, 3, 5, 3, 3, 1, 4, 4, 0, 2])

In [47]:
X_sample = np.array(sample_images)
# predict returns one score vector per image; argmax picks the most likely class
prediction = model.predict(X_sample)
predicted_categories = np.argmax(prediction, axis=1)
predicted_categories


Out[47]:
array([1, 3, 4, 3, 3, 1, 4, 4, 0, 2])
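
Nine of the ten samples are classified correctly; the single miss confuses label 5 with the neighboring label 4, i.e. two adjacent speed-limit classes.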

In [48]:
# Display the predictions and the ground truth visually.
def display_prediction(images, true_labels, predicted_labels):
    plt.figure(figsize=(10, 10))
    for i in range(len(true_labels)):
        truth = true_labels[i]
        prediction = predicted_labels[i]
        plt.subplot(5, 2, 1 + i)  # a grid of 5 rows x 2 columns
        plt.axis('off')
        # green for a correct prediction, red for a wrong one
        color = 'green' if truth == prediction else 'red'
        plt.text(80, 10, "Truth:        {0}\nPrediction: {1}".format(truth, prediction),
                 fontsize=12, color=color)
        plt.imshow(images[i])

In [49]:
display_prediction(sample_images, ground_truth, predicted_categories)



In [ ]: