Introduction to Neural Networks with Keras


In [1]:
import warnings
warnings.filterwarnings('ignore')

In [2]:
%matplotlib inline
# %pylab also imports numpy and matplotlib into the interactive namespace
%pylab inline


Populating the interactive namespace from numpy and matplotlib

In [3]:
import matplotlib.pylab as plt
import numpy as np

In [4]:
from distutils.version import StrictVersion

In [5]:
import sklearn
print(sklearn.__version__)

assert StrictVersion(sklearn.__version__) >= StrictVersion('0.18.1')


0.19.0

In [6]:
import tensorflow as tf
tf.logging.set_verbosity(tf.logging.ERROR)
print(tf.__version__)

assert StrictVersion(tf.__version__) >= StrictVersion('1.1.0')


1.2.1

In [7]:
import keras
print(keras.__version__)

assert StrictVersion(keras.__version__) >= StrictVersion('2.0.0')


Using TensorFlow backend.
2.0.8

In [8]:
import pandas as pd
print(pd.__version__)

assert StrictVersion(pd.__version__) >= StrictVersion('0.20.0')


0.20.3

Solving Iris with Neural Networks


In [9]:
from sklearn.datasets import load_iris
iris = load_iris()
iris.data[0]


Out[9]:
array([ 5.1,  3.5,  1.4,  0.2])

In [10]:
print(iris.DESCR)


Iris Plants Database
====================

Notes
-----
Data Set Characteristics:
    :Number of Instances: 150 (50 in each of three classes)
    :Number of Attributes: 4 numeric, predictive attributes and the class
    :Attribute Information:
        - sepal length in cm
        - sepal width in cm
        - petal length in cm
        - petal width in cm
        - class:
                - Iris-Setosa
                - Iris-Versicolour
                - Iris-Virginica
    :Summary Statistics:

    ============== ==== ==== ======= ===== ====================
                    Min  Max   Mean    SD   Class Correlation
    ============== ==== ==== ======= ===== ====================
    sepal length:   4.3  7.9   5.84   0.83    0.7826
    sepal width:    2.0  4.4   3.05   0.43   -0.4194
    petal length:   1.0  6.9   3.76   1.76    0.9490  (high!)
    petal width:    0.1  2.5   1.20  0.76     0.9565  (high!)
    ============== ==== ==== ======= ===== ====================

    :Missing Attribute Values: None
    :Class Distribution: 33.3% for each of 3 classes.
    :Creator: R.A. Fisher
    :Donor: Michael Marshall (MARSHALL%PLU@io.arc.nasa.gov)
    :Date: July, 1988

This is a copy of UCI ML iris datasets.
http://archive.ics.uci.edu/ml/datasets/Iris

The famous Iris database, first used by Sir R.A Fisher

This is perhaps the best known database to be found in the
pattern recognition literature.  Fisher's paper is a classic in the field and
is referenced frequently to this day.  (See Duda & Hart, for example.)  The
data set contains 3 classes of 50 instances each, where each class refers to a
type of iris plant.  One class is linearly separable from the other 2; the
latter are NOT linearly separable from each other.

References
----------
   - Fisher,R.A. "The use of multiple measurements in taxonomic problems"
     Annual Eugenics, 7, Part II, 179-188 (1936); also in "Contributions to
     Mathematical Statistics" (John Wiley, NY, 1950).
   - Duda,R.O., & Hart,P.E. (1973) Pattern Classification and Scene Analysis.
     (Q327.D83) John Wiley & Sons.  ISBN 0-471-22361-1.  See page 218.
   - Dasarathy, B.V. (1980) "Nosing Around the Neighborhood: A New System
     Structure and Classification Rule for Recognition in Partially Exposed
     Environments".  IEEE Transactions on Pattern Analysis and Machine
     Intelligence, Vol. PAMI-2, No. 1, 67-71.
   - Gates, G.W. (1972) "The Reduced Nearest Neighbor Rule".  IEEE Transactions
     on Information Theory, May 1972, 431-433.
   - See also: 1988 MLC Proceedings, 54-64.  Cheeseman et al"s AUTOCLASS II
     conceptual clustering system finds 3 classes in the data.
   - Many, many more ...


In [11]:
import matplotlib.pyplot as plt
from matplotlib.colors import ListedColormap

iris_df = pd.DataFrame(iris.data, columns=iris.feature_names)
CMAP = ListedColormap(['#FF0000', '#00FF00', '#0000FF'])
pd.plotting.scatter_matrix(iris_df, c=iris.target, edgecolor='black', figsize=(15, 15), cmap=CMAP)
plt.show()


The Artificial Neuron

Our First Neural Network with Keras


In [12]:
# keras.layers.Input?

In [13]:
from keras.layers import Input
inputs = Input(shape=(4, ))

In [14]:
# keras.layers.Dense?

In [15]:
from keras.layers import Dense
# just a linear activation (i.e., no activation function at all)
fc = Dense(3)(inputs)
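
Under the hood, a Dense layer is just an affine map. Here is a minimal numpy sketch (illustration only, not an executed cell) of what Dense(3) computes for one sample with linear activation; W and b are made-up stand-ins for the weights the layer would learn:

import numpy as np

x = np.array([5.1, 3.5, 1.4, 0.2])  # one iris sample, 4 features
W = np.random.randn(4, 3)           # hypothetical weight matrix (4 inputs x 3 neurons)
b = np.zeros(3)                     # hypothetical biases, one per neuron
y = x @ W + b                       # linear activation: the output is just x.W + b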

In [16]:
from keras.models import Model
model = Model(inputs=inputs, outputs=fc)

In [17]:
model.summary()


_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_1 (InputLayer)         (None, 4)                 0         
_________________________________________________________________
dense_1 (Dense)              (None, 3)                 15        
=================================================================
Total params: 15
Trainable params: 15
Non-trainable params: 0
_________________________________________________________________
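
The 15 parameters check out by hand: a Dense layer with 3 neurons on 4 inputs learns a 4 x 3 weight matrix plus one bias per neuron, i.e. 4·3 + 3 = 15.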

In [18]:
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

In [19]:
# the weights are still randomly initialized, no training has taken place so far
model.predict(np.array([[ 5.1,  3.5,  1.4,  0.2]]))


Out[19]:
array([[ 5.7693758 ,  1.25018001,  3.50801897]], dtype=float32)

This is the raw output of all 3 neurons, but what we really want is a prediction of the iris category

  • a softmax activation turns each output into a percentage between 0 and 1, with all outputs adding up to 1 (see the numpy sketch below)
  • the interpretation is the likelihood of each category
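
A quick numpy sketch of softmax (illustration only, not part of the model): exponentiate each raw output, then normalize so they sum to 1.

import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))  # subtract the max for numerical stability
    return e / e.sum()

raw = np.array([5.7693758, 1.25018001, 3.50801897])  # the raw outputs from Out[19]
print(softmax(raw))  # -> roughly [0.90, 0.01, 0.09], summing to 1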


In [20]:
inputs = Input(shape=(4, ))
fc = Dense(3)(inputs)
predictions = Dense(3, activation='softmax')(fc)
model = Model(inputs=inputs, outputs=predictions)

In [21]:
model.summary()


_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_2 (InputLayer)         (None, 4)                 0         
_________________________________________________________________
dense_2 (Dense)              (None, 3)                 15        
_________________________________________________________________
dense_3 (Dense)              (None, 3)                 12        
=================================================================
Total params: 27
Trainable params: 27
Non-trainable params: 0
_________________________________________________________________
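
Same arithmetic as before: dense_2 contributes 4·3 + 3 = 15 parameters, and the new softmax layer dense_3 takes those 3 outputs as its inputs, adding 3·3 + 3 = 12, for 27 in total.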

In [22]:
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

In [23]:
model.predict(np.array([[ 5.1,  3.5,  1.4,  0.2]]))


Out[23]:
array([[ 0.90537059,  0.00527649,  0.0893529 ]], dtype=float32)

Now we have likelihoods for the categories, but the model is still completely untrained, so its weights are random

Training

  • training is performed using backpropagation
  • each pair of ground-truth input and output is passed through the network
  • the difference between the expected output (ground truth) and the actual result is aggregated over the samples and forms the loss function (a tiny numeric sketch follows below)
  • the loss function is what training minimizes
  • the optimizer defines the strategy used to minimize the loss

Optimizers: Adam and RMSprop are usually good default choices

http://cs231n.github.io/neural-networks-3/#ada
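
A tiny numeric sketch of the categorical cross-entropy loss for a single sample (illustration only): the lower the predicted probability of the true class, the larger the loss.

import numpy as np

def categorical_crossentropy(y_true, y_pred):
    # y_true is one-hot, y_pred are softmax probabilities
    return -np.sum(y_true * np.log(y_pred))

y_true = np.array([1., 0., 0.])
print(categorical_crossentropy(y_true, np.array([0.9, 0.05, 0.05])))  # ~0.11, confident and right
print(categorical_crossentropy(y_true, np.array([0.1, 0.6, 0.3])))    # ~2.30, confident and wrong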


In [24]:
X = np.array(iris.data)
y = np.array(iris.target)
X.shape, y.shape


Out[24]:
((150, 4), (150,))

In [25]:
y[100]


Out[25]:
2

In [26]:
# a tiny little piece of feature engineering
from keras.utils.np_utils import to_categorical

num_categories = 3

y = to_categorical(y, num_categories)

In [27]:
y[100]


Out[27]:
array([ 0.,  0.,  1.])
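
to_categorical simply one-hot encodes the integer labels; the same thing can be sketched with plain numpy by indexing into an identity matrix:

import numpy as np

labels = np.array([0, 1, 2, 2])
np.eye(3)[labels]
# -> array([[ 1.,  0.,  0.],
#           [ 0.,  1.,  0.],
#           [ 0.,  0.,  1.],
#           [ 0.,  0.,  1.]])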

In [28]:
from sklearn.model_selection import train_test_split
# stratify=y keeps the class proportions identical in the train and test splits
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.20, random_state=42, stratify=y)

In [29]:
X_train.shape, X_test.shape, y_train.shape, y_test.shape


Out[29]:
((120, 4), (30, 4), (120, 3), (30, 3))

In [30]:
# clear logs from previous runs (-f so the first run does not fail)
!rm -rf tf_log
tb_callback = keras.callbacks.TensorBoard(log_dir='./tf_log')

# https://keras.io/callbacks/#tensorboard
# To start tensorboard
# tensorboard --logdir=/mnt/c/Users/olive/Development/ml/tf_log
# open http://localhost:6006

In [36]:
%time model.fit(X_train, y_train, epochs=500, validation_split=0.2, callbacks=[tb_callback])
# %time model.fit(X_train, y_train, epochs=500, validation_split=0.2)


Train on 96 samples, validate on 24 samples
Epoch 1/500
96/96 [==============================] - 0s - loss: 0.4120 - acc: 0.9062 - val_loss: 0.5058 - val_acc: 0.8333
Epoch 2/500
96/96 [==============================] - 0s - loss: 0.4112 - acc: 0.9062 - val_loss: 0.5047 - val_acc: 0.8333
Epoch 3/500
96/96 [==============================] - 0s - loss: 0.4105 - acc: 0.9062 - val_loss: 0.5040 - val_acc: 0.8333
Epoch 4/500
96/96 [==============================] - 0s - loss: 0.4098 - acc: 0.9062 - val_loss: 0.5032 - val_acc: 0.8333
Epoch 5/500
96/96 [==============================] - 0s - loss: 0.4091 - acc: 0.9062 - val_loss: 0.5020 - val_acc: 0.8333
[... epochs 6-414 elided for brevity: loss falls steadily from 0.41 to 0.19, training accuracy climbs from 0.91 to 0.97, and validation accuracy from 0.83 to 1.00 ...]
Epoch 415/500
96/96 [==============================] - 0s - loss: 0.1887 - acc: 0.9688 - val_loss: 0.2301 - val_acc: 1.0000
Epoch 416/500
96/96 [==============================] - 0s - loss: 0.1886 - acc: 0.9688 - val_loss: 0.2298 - val_acc: 1.0000
Epoch 417/500
96/96 [==============================] - 0s - loss: 0.1880 - acc: 0.9688 - val_loss: 0.2293 - val_acc: 1.0000
Epoch 418/500
96/96 [==============================] - 0s - loss: 0.1877 - acc: 0.9688 - val_loss: 0.2288 - val_acc: 1.0000
Epoch 419/500
96/96 [==============================] - 0s - loss: 0.1873 - acc: 0.9688 - val_loss: 0.2284 - val_acc: 1.0000
Epoch 420/500
96/96 [==============================] - 0s - loss: 0.1870 - acc: 0.9688 - val_loss: 0.2279 - val_acc: 1.0000
Epoch 421/500
96/96 [==============================] - 0s - loss: 0.1867 - acc: 0.9688 - val_loss: 0.2274 - val_acc: 1.0000
Epoch 422/500
96/96 [==============================] - 0s - loss: 0.1863 - acc: 0.9688 - val_loss: 0.2270 - val_acc: 1.0000
Epoch 423/500
96/96 [==============================] - ETA: 0s - loss: 0.1632 - acc: 0.968 - 0s - loss: 0.1860 - acc: 0.9688 - val_loss: 0.2265 - val_acc: 1.0000
Epoch 424/500
96/96 [==============================] - 0s - loss: 0.1857 - acc: 0.9688 - val_loss: 0.2261 - val_acc: 1.0000
Epoch 425/500
96/96 [==============================] - 0s - loss: 0.1853 - acc: 0.9688 - val_loss: 0.2258 - val_acc: 1.0000
Epoch 426/500
96/96 [==============================] - 0s - loss: 0.1851 - acc: 0.9688 - val_loss: 0.2256 - val_acc: 1.0000
Epoch 427/500
96/96 [==============================] - 0s - loss: 0.1853 - acc: 0.9688 - val_loss: 0.2249 - val_acc: 1.0000
Epoch 428/500
96/96 [==============================] - 0s - loss: 0.1843 - acc: 0.9688 - val_loss: 0.2247 - val_acc: 1.0000
Epoch 429/500
96/96 [==============================] - 0s - loss: 0.1846 - acc: 0.9688 - val_loss: 0.2247 - val_acc: 1.0000
Epoch 430/500
96/96 [==============================] - 0s - loss: 0.1837 - acc: 0.9688 - val_loss: 0.2243 - val_acc: 1.0000
Epoch 431/500
96/96 [==============================] - 0s - loss: 0.1833 - acc: 0.9688 - val_loss: 0.2238 - val_acc: 1.0000
Epoch 432/500
96/96 [==============================] - 0s - loss: 0.1830 - acc: 0.9688 - val_loss: 0.2232 - val_acc: 1.0000
Epoch 433/500
96/96 [==============================] - 0s - loss: 0.1826 - acc: 0.9688 - val_loss: 0.2227 - val_acc: 1.0000
Epoch 434/500
96/96 [==============================] - 0s - loss: 0.1824 - acc: 0.9688 - val_loss: 0.2223 - val_acc: 1.0000
Epoch 435/500
96/96 [==============================] - 0s - loss: 0.1822 - acc: 0.9688 - val_loss: 0.2216 - val_acc: 1.0000
Epoch 436/500
96/96 [==============================] - 0s - loss: 0.1817 - acc: 0.9688 - val_loss: 0.2213 - val_acc: 1.0000
Epoch 437/500
96/96 [==============================] - 0s - loss: 0.1815 - acc: 0.9688 - val_loss: 0.2207 - val_acc: 1.0000
Epoch 438/500
96/96 [==============================] - 0s - loss: 0.1811 - acc: 0.9688 - val_loss: 0.2202 - val_acc: 1.0000
Epoch 439/500
96/96 [==============================] - 0s - loss: 0.1807 - acc: 0.9688 - val_loss: 0.2199 - val_acc: 1.0000
Epoch 440/500
96/96 [==============================] - 0s - loss: 0.1804 - acc: 0.9688 - val_loss: 0.2195 - val_acc: 1.0000
Epoch 441/500
96/96 [==============================] - 0s - loss: 0.1801 - acc: 0.9688 - val_loss: 0.2193 - val_acc: 1.0000
Epoch 442/500
96/96 [==============================] - 0s - loss: 0.1797 - acc: 0.9688 - val_loss: 0.2189 - val_acc: 1.0000
Epoch 443/500
96/96 [==============================] - 0s - loss: 0.1795 - acc: 0.9688 - val_loss: 0.2187 - val_acc: 1.0000
Epoch 444/500
96/96 [==============================] - 0s - loss: 0.1793 - acc: 0.9688 - val_loss: 0.2182 - val_acc: 1.0000
Epoch 445/500
96/96 [==============================] - 0s - loss: 0.1788 - acc: 0.9688 - val_loss: 0.2179 - val_acc: 1.0000
Epoch 446/500
96/96 [==============================] - 0s - loss: 0.1785 - acc: 0.9688 - val_loss: 0.2176 - val_acc: 1.0000
Epoch 447/500
96/96 [==============================] - 0s - loss: 0.1782 - acc: 0.9688 - val_loss: 0.2172 - val_acc: 1.0000
Epoch 448/500
96/96 [==============================] - ETA: 0s - loss: 0.1330 - acc: 1.000 - 0s - loss: 0.1780 - acc: 0.9688 - val_loss: 0.2168 - val_acc: 1.0000
Epoch 449/500
96/96 [==============================] - 0s - loss: 0.1782 - acc: 0.9688 - val_loss: 0.2160 - val_acc: 1.0000
Epoch 450/500
96/96 [==============================] - 0s - loss: 0.1772 - acc: 0.9688 - val_loss: 0.2156 - val_acc: 1.0000
Epoch 451/500
96/96 [==============================] - 0s - loss: 0.1770 - acc: 0.9688 - val_loss: 0.2154 - val_acc: 1.0000
Epoch 452/500
96/96 [==============================] - 0s - loss: 0.1767 - acc: 0.9688 - val_loss: 0.2151 - val_acc: 1.0000
Epoch 453/500
96/96 [==============================] - 0s - loss: 0.1765 - acc: 0.9688 - val_loss: 0.2145 - val_acc: 1.0000
Epoch 454/500
96/96 [==============================] - 0s - loss: 0.1761 - acc: 0.9688 - val_loss: 0.2143 - val_acc: 1.0000
Epoch 455/500
96/96 [==============================] - 0s - loss: 0.1757 - acc: 0.9688 - val_loss: 0.2139 - val_acc: 1.0000
Epoch 456/500
96/96 [==============================] - 0s - loss: 0.1755 - acc: 0.9688 - val_loss: 0.2133 - val_acc: 1.0000
Epoch 457/500
96/96 [==============================] - 0s - loss: 0.1752 - acc: 0.9688 - val_loss: 0.2131 - val_acc: 1.0000
Epoch 458/500
96/96 [==============================] - 0s - loss: 0.1748 - acc: 0.9688 - val_loss: 0.2126 - val_acc: 1.0000
Epoch 459/500
96/96 [==============================] - 0s - loss: 0.1745 - acc: 0.9688 - val_loss: 0.2122 - val_acc: 1.0000
Epoch 460/500
96/96 [==============================] - 0s - loss: 0.1742 - acc: 0.9688 - val_loss: 0.2118 - val_acc: 1.0000
Epoch 461/500
96/96 [==============================] - 0s - loss: 0.1739 - acc: 0.9688 - val_loss: 0.2115 - val_acc: 1.0000
Epoch 462/500
96/96 [==============================] - 0s - loss: 0.1737 - acc: 0.9688 - val_loss: 0.2109 - val_acc: 1.0000
Epoch 463/500
96/96 [==============================] - 0s - loss: 0.1733 - acc: 0.9688 - val_loss: 0.2107 - val_acc: 1.0000
Epoch 464/500
96/96 [==============================] - 0s - loss: 0.1730 - acc: 0.9688 - val_loss: 0.2106 - val_acc: 1.0000
Epoch 465/500
96/96 [==============================] - 0s - loss: 0.1727 - acc: 0.9688 - val_loss: 0.2101 - val_acc: 1.0000
Epoch 466/500
96/96 [==============================] - ETA: 0s - loss: 0.2553 - acc: 0.906 - 0s - loss: 0.1726 - acc: 0.9688 - val_loss: 0.2101 - val_acc: 1.0000
Epoch 467/500
96/96 [==============================] - 0s - loss: 0.1721 - acc: 0.9688 - val_loss: 0.2096 - val_acc: 1.0000
Epoch 468/500
96/96 [==============================] - 0s - loss: 0.1718 - acc: 0.9688 - val_loss: 0.2092 - val_acc: 1.0000
Epoch 469/500
96/96 [==============================] - 0s - loss: 0.1715 - acc: 0.9688 - val_loss: 0.2088 - val_acc: 1.0000
Epoch 470/500
96/96 [==============================] - 0s - loss: 0.1712 - acc: 0.9688 - val_loss: 0.2082 - val_acc: 1.0000
Epoch 471/500
96/96 [==============================] - 0s - loss: 0.1709 - acc: 0.9688 - val_loss: 0.2077 - val_acc: 1.0000
Epoch 472/500
96/96 [==============================] - 0s - loss: 0.1706 - acc: 0.9688 - val_loss: 0.2073 - val_acc: 1.0000
Epoch 473/500
96/96 [==============================] - 0s - loss: 0.1703 - acc: 0.9688 - val_loss: 0.2068 - val_acc: 1.0000
Epoch 474/500
96/96 [==============================] - 0s - loss: 0.1700 - acc: 0.9688 - val_loss: 0.2066 - val_acc: 1.0000
Epoch 475/500
96/96 [==============================] - 0s - loss: 0.1697 - acc: 0.9688 - val_loss: 0.2063 - val_acc: 1.0000
Epoch 476/500
96/96 [==============================] - 0s - loss: 0.1695 - acc: 0.9688 - val_loss: 0.2058 - val_acc: 1.0000
Epoch 477/500
96/96 [==============================] - 0s - loss: 0.1692 - acc: 0.9688 - val_loss: 0.2056 - val_acc: 1.0000
Epoch 478/500
96/96 [==============================] - 0s - loss: 0.1690 - acc: 0.9688 - val_loss: 0.2051 - val_acc: 1.0000
Epoch 479/500
96/96 [==============================] - 0s - loss: 0.1686 - acc: 0.9688 - val_loss: 0.2048 - val_acc: 1.0000
Epoch 480/500
96/96 [==============================] - 0s - loss: 0.1684 - acc: 0.9688 - val_loss: 0.2046 - val_acc: 1.0000
Epoch 481/500
96/96 [==============================] - 0s - loss: 0.1681 - acc: 0.9688 - val_loss: 0.2042 - val_acc: 1.0000
Epoch 482/500
96/96 [==============================] - 0s - loss: 0.1678 - acc: 0.9688 - val_loss: 0.2041 - val_acc: 1.0000
Epoch 483/500
96/96 [==============================] - 0s - loss: 0.1674 - acc: 0.9688 - val_loss: 0.2037 - val_acc: 1.0000
Epoch 484/500
96/96 [==============================] - 0s - loss: 0.1671 - acc: 0.9688 - val_loss: 0.2033 - val_acc: 1.0000
Epoch 485/500
96/96 [==============================] - 0s - loss: 0.1669 - acc: 0.9688 - val_loss: 0.2029 - val_acc: 1.0000
Epoch 486/500
96/96 [==============================] - 0s - loss: 0.1668 - acc: 0.9688 - val_loss: 0.2027 - val_acc: 1.0000
Epoch 487/500
96/96 [==============================] - 0s - loss: 0.1665 - acc: 0.9688 - val_loss: 0.2020 - val_acc: 1.0000
Epoch 488/500
96/96 [==============================] - 0s - loss: 0.1660 - acc: 0.9688 - val_loss: 0.2017 - val_acc: 1.0000
Epoch 489/500
96/96 [==============================] - 0s - loss: 0.1658 - acc: 0.9688 - val_loss: 0.2015 - val_acc: 1.0000
Epoch 490/500
96/96 [==============================] - 0s - loss: 0.1655 - acc: 0.9688 - val_loss: 0.2011 - val_acc: 1.0000
Epoch 491/500
96/96 [==============================] - 0s - loss: 0.1652 - acc: 0.9688 - val_loss: 0.2008 - val_acc: 1.0000
Epoch 492/500
96/96 [==============================] - 0s - loss: 0.1650 - acc: 0.9688 - val_loss: 0.2006 - val_acc: 1.0000
Epoch 493/500
96/96 [==============================] - 0s - loss: 0.1646 - acc: 0.9688 - val_loss: 0.2002 - val_acc: 1.0000
Epoch 494/500
96/96 [==============================] - 0s - loss: 0.1644 - acc: 0.9688 - val_loss: 0.1997 - val_acc: 1.0000
Epoch 495/500
96/96 [==============================] - 0s - loss: 0.1643 - acc: 0.9688 - val_loss: 0.1991 - val_acc: 1.0000
Epoch 496/500
96/96 [==============================] - 0s - loss: 0.1639 - acc: 0.9688 - val_loss: 0.1985 - val_acc: 1.0000
Epoch 497/500
96/96 [==============================] - 0s - loss: 0.1637 - acc: 0.9688 - val_loss: 0.1980 - val_acc: 1.0000
Epoch 498/500
96/96 [==============================] - 0s - loss: 0.1633 - acc: 0.9688 - val_loss: 0.1978 - val_acc: 1.0000
Epoch 499/500
96/96 [==============================] - 0s - loss: 0.1632 - acc: 0.9688 - val_loss: 0.1980 - val_acc: 1.0000
Epoch 500/500
96/96 [==============================] - 0s - loss: 0.1629 - acc: 0.9688 - val_loss: 0.1980 - val_acc: 1.0000
CPU times: user 8.03 s, sys: 4.75 s, total: 12.8 s
Wall time: 11.5 s
Out[36]:
<keras.callbacks.History at 0x7fc05cf01550>
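
The History object returned by fit records the loss and accuracy per epoch. A minimal sketch for plotting the curves, assuming the training cell above captured the fit result in a (hypothetical) variable named history:

In [ ]:
# Sketch: visualize the training curves
# assumes the training cell was written as: history = model.fit(...)
plt.plot(history.history['loss'], label='train loss')
plt.plot(history.history['val_loss'], label='validation loss')
plt.xlabel('epoch')
plt.ylabel('loss')
plt.legend()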

Evaluation


In [32]:
model.predict(np.array([[ 5.1,  3.5,  1.4,  0.2]]))


Out[32]:
array([[ 0.93480259,  0.06045231,  0.00474514]], dtype=float32)

In [33]:
X[0], y[0]


Out[33]:
(array([ 5.1,  3.5,  1.4,  0.2]), array([ 1.,  0.,  0.]))

In [37]:
train_loss, train_accuracy = model.evaluate(X_train, y_train)
train_loss, train_accuracy


 32/120 [=======>......................] - ETA: 0s
Out[37]:
(0.16963374316692353, 0.97499999999999998)

In [38]:
test_loss, test_accuracy = model.evaluate(X_test, y_test)
test_loss, test_accuracy


30/30 [==============================] - 0s
Out[38]:
(0.16105984151363373, 0.96666663885116577)
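
To see which species get confused on the test set, a confusion matrix is more informative than a single accuracy number. A sketch using sklearn, assuming y_test is one-hot encoded as y[0] above suggests:

In [ ]:
# Sketch: confusion matrix on the test set
# assumes one-hot labels, so argmax recovers the class indices
from sklearn.metrics import confusion_matrix

y_pred = np.argmax(model.predict(X_test), axis=1)
y_true = np.argmax(y_test, axis=1)
confusion_matrix(y_true, y_pred)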

Hands-On

Execute this notebook and improve the training results

  • Are the results you see good?
  • What kind of decision boundary can our neurons draw?
  • Can you improve the results? Can you at least match kNN?
    • What is the minimum total number of neurons you need for this problem?
      • Play around with the number of neurons in the hidden layer
    • Add another hidden layer (see the sketch after this list)
      • More than one hidden layer is what people call a deep neural network
      • Hidden layers close to the input will hopefully do the feature engineering and extraction for us
    • Increase the number of epochs for training
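
A minimal sketch of one way to act on these suggestions, using the Keras Sequential API; the layer sizes here are untuned starting points, not recommended values:

In [ ]:
# Sketch: a deeper variant with two hidden layers (sizes are guesses, tune them)
from keras.models import Sequential
from keras.layers import Dense

deeper_model = Sequential()
deeper_model.add(Dense(100, input_dim=4, activation='relu'))  # first hidden layer
deeper_model.add(Dense(50, activation='relu'))                # second hidden layer
deeper_model.add(Dense(3, activation='softmax'))              # one output per class
deeper_model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
deeper_model.fit(X_train, y_train, epochs=1000, validation_split=0.2, verbose=0)  # verbose=0 avoids the log flood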

Stop Here

Save Model in Keras and TensorFlow Format


In [39]:
# Keras format
model.save('nn-iris.hdf5')
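
To verify the saved file, it can be loaded back and should produce the same predictions. A quick round-trip sketch:

In [ ]:
# Sketch: reload the saved model and re-check a prediction
from keras.models import load_model

restored_model = load_model('nn-iris.hdf5')
restored_model.predict(np.array([[ 5.1,  3.5,  1.4,  0.2]]))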

In [40]:
import os
from keras import backend as K

In [41]:
K.set_learning_phase(0)  # 0 = inference mode: disables training-only behavior such as dropout before export

In [42]:
sess = K.get_session()

In [43]:
!rm -rf tf  # -f: don't fail if the directory does not exist yet

In [44]:
tf.app.flags.DEFINE_integer('model_version', 1, 'version number of the model.')
tf.app.flags.DEFINE_string('work_dir', '/tmp', 'Working directory.')
FLAGS = tf.app.flags.FLAGS

In [45]:
export_path_base = 'tf'
export_path = os.path.join(
  tf.compat.as_bytes(export_path_base),
  tf.compat.as_bytes(str(FLAGS.model_version)))

In [46]:
classification_inputs = tf.saved_model.utils.build_tensor_info(model.input)
classification_outputs_scores = tf.saved_model.utils.build_tensor_info(model.output)

In [47]:
from tensorflow.python.saved_model.signature_def_utils_impl import build_signature_def, predict_signature_def

In [48]:
signature = predict_signature_def(inputs={'inputs': model.input},
                                  outputs={'scores': model.output})

In [49]:
builder = tf.saved_model.builder.SavedModelBuilder(export_path)

In [50]:
builder.add_meta_graph_and_variables(
    sess,
    tags=[tf.saved_model.tag_constants.SERVING],
    signature_def_map={
        tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY: signature
    })

In [51]:
builder.save()


Out[51]:
b'tf/1/saved_model.pb'

In [52]:
!ls -lhR tf


tf:
total 0
drwxrwxrwx 0 root root 512 Sep 28 21:36 1

tf/1:
total 55M
-rwxrwxrwx 1 root root 108K Sep 28 21:36 saved_model.pb
drwxrwxrwx 0 root root  512 Sep 28 21:36 variables

tf/1/variables:
total 577K
-rwxrwxrwx 1 root root 432 Sep 28 21:36 variables.data-00000-of-00001
-rwxrwxrwx 1 root root 719 Sep 28 21:36 variables.index
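
As a sanity check, the exported SavedModel can be loaded back into a fresh session via the loader API; a sketch, using the same SERVING tag as above:

In [ ]:
# Sketch: reload the SavedModel into a fresh graph and session
with tf.Graph().as_default():
    with tf.Session() as load_sess:
        tf.saved_model.loader.load(
            load_sess,
            [tf.saved_model.tag_constants.SERVING],
            'tf/1')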

This TensorFlow model can be uploaded to Google Cloud ML Engine and called via its REST interface


In [53]:
# cd tf
# gsutil cp -R 1 gs://irisnn
# create model and version at https://console.cloud.google.com/mlengine
# gcloud ml-engine predict --model=irisnn --json-instances=./sample_iris.json
# SCORES
# [0.9954029321670532, 0.004596732556819916, 3.3544753819114703e-07]
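#
# sample_iris.json is not shown here; a hypothetical one-instance version,
# keyed by the 'inputs' name from the signature above, could look like:
# {"inputs": [5.1, 3.5, 1.4, 0.2]}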

In [ ]: