Introduction to Neural Networks

How manual coding works

In contrast: the basic idea of Supervised Machine Learning

Hope: the system can generalize to previously unseen data and situations

Common Use Case: Classification

Types of Machine Learning

AI vs Machine Learning (ML)

NVIDIA Blog: What’s the Difference Between Artificial Intelligence, Machine Learning, and Deep Learning?


In [1]:
import warnings
warnings.filterwarnings('ignore')

In [2]:
%matplotlib inline
%pylab inline


Populating the interactive namespace from numpy and matplotlib

In [3]:
import matplotlib.pylab as plt
import numpy as np

In [4]:
from distutils.version import StrictVersion

In [5]:
import sklearn
print(sklearn.__version__)

assert StrictVersion(sklearn.__version__ ) >= StrictVersion('0.18.1')


0.18.1

In [6]:
import tensorflow as tf
tf.logging.set_verbosity(tf.logging.ERROR)
print(tf.__version__)

assert StrictVersion(tf.__version__) >= StrictVersion('1.1.0')


1.3.0

In [7]:
import keras
print(keras.__version__)

assert StrictVersion(keras.__version__) >= StrictVersion('2.0.0')


Using TensorFlow backend.
2.0.8

In [8]:
import pandas as pd
print(pd.__version__)

assert StrictVersion(pd.__version__) >= StrictVersion('0.20.0')


0.20.1

One of the Classics: Classify the Iris Species by the Dimensions of Its Flowers

Solving Iris with Neural Networks

First we load the data set and get a first impression of it


In [9]:
from sklearn.datasets import load_iris
iris = load_iris()
iris.data[0]


Out[9]:
array([ 5.1,  3.5,  1.4,  0.2])

In [10]:
print(iris.DESCR)


Iris Plants Database
====================

Notes
-----
Data Set Characteristics:
    :Number of Instances: 150 (50 in each of three classes)
    :Number of Attributes: 4 numeric, predictive attributes and the class
    :Attribute Information:
        - sepal length in cm
        - sepal width in cm
        - petal length in cm
        - petal width in cm
        - class:
                - Iris-Setosa
                - Iris-Versicolour
                - Iris-Virginica
    :Summary Statistics:

    ============== ==== ==== ======= ===== ====================
                    Min  Max   Mean    SD   Class Correlation
    ============== ==== ==== ======= ===== ====================
    sepal length:   4.3  7.9   5.84   0.83    0.7826
    sepal width:    2.0  4.4   3.05   0.43   -0.4194
    petal length:   1.0  6.9   3.76   1.76    0.9490  (high!)
    petal width:    0.1  2.5   1.20  0.76     0.9565  (high!)
    ============== ==== ==== ======= ===== ====================

    :Missing Attribute Values: None
    :Class Distribution: 33.3% for each of 3 classes.
    :Creator: R.A. Fisher
    :Donor: Michael Marshall (MARSHALL%PLU@io.arc.nasa.gov)
    :Date: July, 1988

This is a copy of UCI ML iris datasets.
http://archive.ics.uci.edu/ml/datasets/Iris

The famous Iris database, first used by Sir R.A Fisher

This is perhaps the best known database to be found in the
pattern recognition literature.  Fisher's paper is a classic in the field and
is referenced frequently to this day.  (See Duda & Hart, for example.)  The
data set contains 3 classes of 50 instances each, where each class refers to a
type of iris plant.  One class is linearly separable from the other 2; the
latter are NOT linearly separable from each other.

References
----------
   - Fisher,R.A. "The use of multiple measurements in taxonomic problems"
     Annual Eugenics, 7, Part II, 179-188 (1936); also in "Contributions to
     Mathematical Statistics" (John Wiley, NY, 1950).
   - Duda,R.O., & Hart,P.E. (1973) Pattern Classification and Scene Analysis.
     (Q327.D83) John Wiley & Sons.  ISBN 0-471-22361-1.  See page 218.
   - Dasarathy, B.V. (1980) "Nosing Around the Neighborhood: A New System
     Structure and Classification Rule for Recognition in Partially Exposed
     Environments".  IEEE Transactions on Pattern Analysis and Machine
     Intelligence, Vol. PAMI-2, No. 1, 67-71.
   - Gates, G.W. (1972) "The Reduced Nearest Neighbor Rule".  IEEE Transactions
     on Information Theory, May 1972, 431-433.
   - See also: 1988 MLC Proceedings, 54-64.  Cheeseman et al"s AUTOCLASS II
     conceptual clustering system finds 3 classes in the data.
   - Many, many more ...


In [11]:
import matplotlib.pyplot as plt
from matplotlib.colors import ListedColormap

iris_df = pd.DataFrame(iris.data, columns=iris.feature_names)
CMAP = ListedColormap(['#FF0000', '#00FF00', '#0000FF'])
pd.plotting.scatter_matrix(iris_df, c=iris.target, edgecolor='black', figsize=(15, 15), cmap=CMAP)
plt.show()


The Artificial Neuron

Question: What kind of equation is this? What is the graph of such a function?
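
A single artificial neuron computes a weighted sum of its inputs plus a bias, y = w · x + b — an affine (linear) equation whose graph is a line, or a hyperplane for more than one input. A minimal numpy sketch (the weights and bias below are made-up values, not trained ones):


In [ ]:
import numpy as np

x = np.array([5.1, 3.5, 1.4, 0.2])   # one iris sample (4 features)
w = np.array([0.5, -0.2, 0.1, 0.7])  # made-up weights, one per input
b = 0.3                              # made-up bias

y = np.dot(w, x) + b                 # weighted sum plus bias: w · x + b
print(y)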

The Classic: A fully connected network with a hidden layer


In [12]:
# keras.layers.Input?

In [13]:
from keras.layers import Input
inputs = Input(shape=(4, ))

In [14]:
# keras.layers.Dense?

In [16]:
from keras.layers import Dense
# just linear activation (like no activation function at all)
fc = Dense(3)(inputs)

In [17]:
from keras.models import Model
model = Model(inputs=inputs, outputs=fc)

In [18]:
model.summary()


_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_1 (InputLayer)         (None, 4)                 0         
_________________________________________________________________
dense_2 (Dense)              (None, 3)                 15        
=================================================================
Total params: 15
Trainable params: 15
Non-trainable params: 0
_________________________________________________________________
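
The parameter count checks out: the Dense layer has 4 × 3 = 12 weights plus 3 biases, i.e. 15 trainable parameters.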

In [19]:
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

In [20]:
# this is just random stuff, no training has taken place so far
model.predict(np.array([[ 5.1,  3.5,  1.4,  0.2]]))


Out[20]:
array([[ 1.06889915,  1.43573582, -0.00691152]], dtype=float32)

This is the raw output of the 3 output neurons, but what we really want is a single category for each iris sample

  • Softmax activation turns each output into a value between 0 and 1, with all outputs adding up to 1
  • interpretation: the likelihood of each category (see the sketch below)
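
A minimal numpy sketch of softmax, applied to the raw outputs from Out[20] above; note the results lie between 0 and 1 and add up to 1:


In [ ]:
import numpy as np

z = np.array([1.06889915, 1.43573582, -0.00691152])  # raw outputs from Out[20]

# numerically stable softmax: shift by the maximum before exponentiating
e = np.exp(z - np.max(z))
softmax = e / e.sum()

print(softmax)        # three values between 0 and 1
print(softmax.sum())  # 1.0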


In [54]:
inputs = Input(shape=(4, ))
fc = Dense(10)(inputs)
predictions = Dense(3, activation='softmax')(fc)
model = Model(inputs=inputs, outputs=predictions)

In [55]:
model.summary()


_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_4 (InputLayer)         (None, 4)                 0         
_________________________________________________________________
dense_7 (Dense)              (None, 10)                50        
_________________________________________________________________
dense_8 (Dense)              (None, 3)                 33        
=================================================================
Total params: 83
Trainable params: 83
Non-trainable params: 0
_________________________________________________________________
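
Again the parameter counts check out: 4 × 10 + 10 = 50 for the hidden layer and 10 × 3 + 3 = 33 for the output layer, 83 in total.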

In [56]:
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

In [57]:
model.predict(np.array([[ 5.1,  3.5,  1.4,  0.2]]))


Out[57]:
array([[ 0.04891691,  0.2536929 ,  0.6973902 ]], dtype=float32)

Question: What is the minimum number of hidden neurons needed to solve this classification task?

Now we have likelihoods for the categories, but our model is still completely untrained and its weights are random

Training

  • training is performed using Backpropagation
  • each pair of ground-truth input and output is passed through the network
  • the difference between the expected output (ground truth) and the actual result is summed up and forms the loss function (see the sketch below)
  • the loss function is to be minimized
  • the optimizer defines the strategy for minimizing the loss

Optimizers: Adam and RMSprop seem nice

http://cs231n.github.io/neural-networks-3/#ada
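
To make the loss concrete, here is a minimal numpy sketch of categorical cross-entropy for a single sample (the ground truth and prediction below are made-up values):


In [ ]:
import numpy as np

y_true = np.array([0., 0., 1.])        # one-hot ground truth: category 2
y_pred = np.array([0.05, 0.25, 0.70])  # made-up softmax output of the network

# categorical cross-entropy: -sum over categories of y_true * log(y_pred)
loss = -np.sum(y_true * np.log(y_pred))
print(loss)  # close to 0 when the likelihood of the true category is close to 1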


In [58]:
X = np.array(iris.data)
y = np.array(iris.target)
X.shape, y.shape


Out[58]:
((150, 4), (150,))

In [59]:
y[100]


Out[59]:
2

In [60]:
# a tiny bit of feature engineering
from keras.utils.np_utils import to_categorical

num_categories = 3

y = to_categorical(y, num_categories)

In [61]:
y[100]


Out[61]:
array([ 0.,  0.,  1.])
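
As a side note, to_categorical is plain one-hot encoding; a minimal numpy equivalent for integer labels 0..2 would be:


In [ ]:
import numpy as np

labels = np.array([0, 1, 2, 2])  # made-up integer class labels
one_hot = np.eye(3)[labels]      # row i of the 3x3 identity matrix one-hot encodes label i
print(one_hot)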

In [62]:
from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.20, random_state=42, stratify=y)

In [63]:
X_train.shape, X_test.shape, y_train.shape, y_test.shape


Out[63]:
((120, 4), (30, 4), (120, 3), (30, 3))

In [64]:
!rm -rf tf_log  # remove old logs; -f avoids an error if the directory does not exist yet
tb_callback = keras.callbacks.TensorBoard(log_dir='./tf_log')

# https://keras.io/callbacks/#tensorboard
# To start tensorboard
# tensorboard --logdir=/mnt/c/Users/olive/Development/ml/tf_log
# open http://localhost:6006

In [65]:
%time model.fit(X_train, y_train, epochs=500, validation_split=0.2, callbacks=[tb_callback])
# %time model.fit(X_train, y_train, epochs=500, validation_split=0.2)


Train on 96 samples, validate on 24 samples
Epoch 1/500
96/96 [==============================] - 0s - loss: 1.9953 - acc: 0.3021 - val_loss: 1.5292 - val_acc: 0.4583
Epoch 2/500
96/96 [==============================] - 0s - loss: 1.8296 - acc: 0.3021 - val_loss: 1.4286 - val_acc: 0.4583
Epoch 3/500
96/96 [==============================] - 0s - loss: 1.6816 - acc: 0.3021 - val_loss: 1.3391 - val_acc: 0.4583
...
Epoch 100/500
96/96 [==============================] - 0s - loss: 0.5095 - acc: 0.9896 - val_loss: 0.4770 - val_acc: 0.9583
...
Epoch 200/500
96/96 [==============================] - 0s - loss: 0.3347 - acc: 0.9792 - val_loss: 0.3476 - val_acc: 0.9583
...
Epoch 300/500
96/96 [==============================] - 0s - loss: 0.2219 - acc: 0.9792 - val_loss: 0.2651 - val_acc: 0.9583
...
Epoch 400/500
96/96 [==============================] - 0s - loss: 0.1501 - acc: 0.9792 - val_loss: 0.2106 - val_acc: 0.9583
...
Epoch 412/500
96/96 [==============================] - 0s - loss: 0.1437 - acc: 0.9792 - val_loss: 0.2049 - val_acc: 0.9583
Epoch 413/500
96/96 [==============================] - 0s - loss: 0.1431 - acc: 0.9792 - val_loss: 0.2036 - val_acc: 0.9583
Epoch 414/500
96/96 [==============================] - 0s - loss: 0.1426 - acc: 0.9792 - val_loss: 0.2030 - val_acc: 0.9583
Epoch 415/500
96/96 [==============================] - 0s - loss: 0.1422 - acc: 0.9792 - val_loss: 0.2009 - val_acc: 0.9583
Epoch 416/500
96/96 [==============================] - 0s - loss: 0.1416 - acc: 0.9792 - val_loss: 0.2001 - val_acc: 0.9583
Epoch 417/500
96/96 [==============================] - 0s - loss: 0.1412 - acc: 0.9792 - val_loss: 0.2004 - val_acc: 0.9583
Epoch 418/500
96/96 [==============================] - 0s - loss: 0.1405 - acc: 0.9792 - val_loss: 0.2001 - val_acc: 0.9583
Epoch 419/500
96/96 [==============================] - 0s - loss: 0.1401 - acc: 0.9792 - val_loss: 0.1994 - val_acc: 0.9583
Epoch 420/500
96/96 [==============================] - 0s - loss: 0.1397 - acc: 0.9792 - val_loss: 0.1985 - val_acc: 0.9583
Epoch 421/500
96/96 [==============================] - 0s - loss: 0.1391 - acc: 0.9792 - val_loss: 0.1986 - val_acc: 0.9583
Epoch 422/500
96/96 [==============================] - 0s - loss: 0.1387 - acc: 0.9792 - val_loss: 0.1996 - val_acc: 0.9583
Epoch 423/500
96/96 [==============================] - 0s - loss: 0.1387 - acc: 0.9792 - val_loss: 0.2013 - val_acc: 0.9583
Epoch 424/500
96/96 [==============================] - 0s - loss: 0.1378 - acc: 0.9792 - val_loss: 0.1994 - val_acc: 0.9583
Epoch 425/500
96/96 [==============================] - 0s - loss: 0.1372 - acc: 0.9792 - val_loss: 0.1982 - val_acc: 0.9583
Epoch 426/500
96/96 [==============================] - 0s - loss: 0.1370 - acc: 0.9792 - val_loss: 0.1982 - val_acc: 0.9583
Epoch 427/500
96/96 [==============================] - 0s - loss: 0.1363 - acc: 0.9792 - val_loss: 0.1960 - val_acc: 0.9583
Epoch 428/500
96/96 [==============================] - 0s - loss: 0.1357 - acc: 0.9792 - val_loss: 0.1948 - val_acc: 0.9583
Epoch 429/500
96/96 [==============================] - 0s - loss: 0.1356 - acc: 0.9792 - val_loss: 0.1929 - val_acc: 0.9583
Epoch 430/500
96/96 [==============================] - 0s - loss: 0.1350 - acc: 0.9792 - val_loss: 0.1926 - val_acc: 0.9583
Epoch 431/500
96/96 [==============================] - 0s - loss: 0.1344 - acc: 0.9792 - val_loss: 0.1933 - val_acc: 0.9583
Epoch 432/500
96/96 [==============================] - 0s - loss: 0.1339 - acc: 0.9792 - val_loss: 0.1942 - val_acc: 0.9583
Epoch 433/500
96/96 [==============================] - 0s - loss: 0.1336 - acc: 0.9792 - val_loss: 0.1938 - val_acc: 0.9583
Epoch 434/500
96/96 [==============================] - 0s - loss: 0.1332 - acc: 0.9792 - val_loss: 0.1957 - val_acc: 0.9583
Epoch 435/500
96/96 [==============================] - 0s - loss: 0.1333 - acc: 0.9792 - val_loss: 0.1977 - val_acc: 0.9583
Epoch 436/500
96/96 [==============================] - 0s - loss: 0.1328 - acc: 0.9792 - val_loss: 0.1944 - val_acc: 0.9583
Epoch 437/500
96/96 [==============================] - 0s - loss: 0.1318 - acc: 0.9792 - val_loss: 0.1931 - val_acc: 0.9583
Epoch 438/500
96/96 [==============================] - 0s - loss: 0.1313 - acc: 0.9792 - val_loss: 0.1922 - val_acc: 0.9583
Epoch 439/500
96/96 [==============================] - 0s - loss: 0.1310 - acc: 0.9792 - val_loss: 0.1906 - val_acc: 0.9583
Epoch 440/500
96/96 [==============================] - 0s - loss: 0.1306 - acc: 0.9792 - val_loss: 0.1895 - val_acc: 0.9583
Epoch 441/500
96/96 [==============================] - 0s - loss: 0.1300 - acc: 0.9792 - val_loss: 0.1896 - val_acc: 0.9583
Epoch 442/500
96/96 [==============================] - 0s - loss: 0.1295 - acc: 0.9792 - val_loss: 0.1904 - val_acc: 0.9583
Epoch 443/500
96/96 [==============================] - 0s - loss: 0.1293 - acc: 0.9792 - val_loss: 0.1920 - val_acc: 0.9583
Epoch 444/500
96/96 [==============================] - 0s - loss: 0.1289 - acc: 0.9792 - val_loss: 0.1929 - val_acc: 0.9583
Epoch 445/500
96/96 [==============================] - 0s - loss: 0.1290 - acc: 0.9792 - val_loss: 0.1940 - val_acc: 0.9583
Epoch 446/500
96/96 [==============================] - 0s - loss: 0.1280 - acc: 0.9792 - val_loss: 0.1913 - val_acc: 0.9583
Epoch 447/500
96/96 [==============================] - 0s - loss: 0.1274 - acc: 0.9792 - val_loss: 0.1890 - val_acc: 0.9583
Epoch 448/500
96/96 [==============================] - 0s - loss: 0.1271 - acc: 0.9792 - val_loss: 0.1868 - val_acc: 0.9583
Epoch 449/500
96/96 [==============================] - 0s - loss: 0.1266 - acc: 0.9792 - val_loss: 0.1866 - val_acc: 0.9583
Epoch 450/500
96/96 [==============================] - 0s - loss: 0.1262 - acc: 0.9792 - val_loss: 0.1863 - val_acc: 0.9583
Epoch 451/500
96/96 [==============================] - 0s - loss: 0.1258 - acc: 0.9792 - val_loss: 0.1861 - val_acc: 0.9583
Epoch 452/500
96/96 [==============================] - 0s - loss: 0.1256 - acc: 0.9792 - val_loss: 0.1850 - val_acc: 0.9583
Epoch 453/500
96/96 [==============================] - 0s - loss: 0.1251 - acc: 0.9792 - val_loss: 0.1865 - val_acc: 0.9583
Epoch 454/500
96/96 [==============================] - 0s - loss: 0.1246 - acc: 0.9792 - val_loss: 0.1867 - val_acc: 0.9583
Epoch 455/500
96/96 [==============================] - 0s - loss: 0.1243 - acc: 0.9792 - val_loss: 0.1876 - val_acc: 0.9583
Epoch 456/500
96/96 [==============================] - 0s - loss: 0.1239 - acc: 0.9792 - val_loss: 0.1861 - val_acc: 0.9583
Epoch 457/500
96/96 [==============================] - 0s - loss: 0.1236 - acc: 0.9792 - val_loss: 0.1845 - val_acc: 0.9583
Epoch 458/500
96/96 [==============================] - 0s - loss: 0.1230 - acc: 0.9792 - val_loss: 0.1844 - val_acc: 0.9583
Epoch 459/500
96/96 [==============================] - 0s - loss: 0.1227 - acc: 0.9792 - val_loss: 0.1854 - val_acc: 0.9583
Epoch 460/500
96/96 [==============================] - 0s - loss: 0.1223 - acc: 0.9792 - val_loss: 0.1841 - val_acc: 0.9583
Epoch 461/500
96/96 [==============================] - 0s - loss: 0.1218 - acc: 0.9792 - val_loss: 0.1841 - val_acc: 0.9583
Epoch 462/500
96/96 [==============================] - 0s - loss: 0.1214 - acc: 0.9792 - val_loss: 0.1841 - val_acc: 0.9583
Epoch 463/500
96/96 [==============================] - 0s - loss: 0.1213 - acc: 0.9792 - val_loss: 0.1841 - val_acc: 0.9583
Epoch 464/500
96/96 [==============================] - 0s - loss: 0.1208 - acc: 0.9792 - val_loss: 0.1831 - val_acc: 0.9583
Epoch 465/500
96/96 [==============================] - 0s - loss: 0.1203 - acc: 0.9792 - val_loss: 0.1822 - val_acc: 0.9583
Epoch 466/500
96/96 [==============================] - 0s - loss: 0.1199 - acc: 0.9792 - val_loss: 0.1825 - val_acc: 0.9583
Epoch 467/500
96/96 [==============================] - 0s - loss: 0.1195 - acc: 0.9792 - val_loss: 0.1818 - val_acc: 0.9583
Epoch 468/500
96/96 [==============================] - 0s - loss: 0.1192 - acc: 0.9792 - val_loss: 0.1813 - val_acc: 0.9583
Epoch 469/500
96/96 [==============================] - 0s - loss: 0.1189 - acc: 0.9792 - val_loss: 0.1817 - val_acc: 0.9583
Epoch 470/500
96/96 [==============================] - 0s - loss: 0.1184 - acc: 0.9792 - val_loss: 0.1814 - val_acc: 0.9583
Epoch 471/500
96/96 [==============================] - 0s - loss: 0.1181 - acc: 0.9792 - val_loss: 0.1803 - val_acc: 0.9583
Epoch 472/500
96/96 [==============================] - 0s - loss: 0.1177 - acc: 0.9792 - val_loss: 0.1802 - val_acc: 0.9583
Epoch 473/500
96/96 [==============================] - 0s - loss: 0.1173 - acc: 0.9792 - val_loss: 0.1798 - val_acc: 0.9583
Epoch 474/500
96/96 [==============================] - 0s - loss: 0.1170 - acc: 0.9792 - val_loss: 0.1797 - val_acc: 0.9583
Epoch 475/500
96/96 [==============================] - 0s - loss: 0.1166 - acc: 0.9792 - val_loss: 0.1793 - val_acc: 0.9583
Epoch 476/500
96/96 [==============================] - 0s - loss: 0.1163 - acc: 0.9792 - val_loss: 0.1784 - val_acc: 0.9583
Epoch 477/500
96/96 [==============================] - 0s - loss: 0.1161 - acc: 0.9792 - val_loss: 0.1775 - val_acc: 0.9583
Epoch 478/500
96/96 [==============================] - 0s - loss: 0.1157 - acc: 0.9792 - val_loss: 0.1786 - val_acc: 0.9583
Epoch 479/500
96/96 [==============================] - 0s - loss: 0.1153 - acc: 0.9792 - val_loss: 0.1793 - val_acc: 0.9583
Epoch 480/500
96/96 [==============================] - 0s - loss: 0.1151 - acc: 0.9792 - val_loss: 0.1777 - val_acc: 0.9583
Epoch 481/500
96/96 [==============================] - 0s - loss: 0.1147 - acc: 0.9792 - val_loss: 0.1765 - val_acc: 0.9583
Epoch 482/500
96/96 [==============================] - 0s - loss: 0.1142 - acc: 0.9792 - val_loss: 0.1764 - val_acc: 0.9583
Epoch 483/500
96/96 [==============================] - ETA: 0s - loss: 0.1219 - acc: 0.968 - 0s - loss: 0.1140 - acc: 0.9792 - val_loss: 0.1756 - val_acc: 0.9583
Epoch 484/500
96/96 [==============================] - 0s - loss: 0.1134 - acc: 0.9792 - val_loss: 0.1759 - val_acc: 0.9583
Epoch 485/500
96/96 [==============================] - 0s - loss: 0.1133 - acc: 0.9792 - val_loss: 0.1770 - val_acc: 0.9583
Epoch 486/500
96/96 [==============================] - 0s - loss: 0.1128 - acc: 0.9792 - val_loss: 0.1774 - val_acc: 0.9583
Epoch 487/500
96/96 [==============================] - 0s - loss: 0.1126 - acc: 0.9792 - val_loss: 0.1778 - val_acc: 0.9583
Epoch 488/500
96/96 [==============================] - 0s - loss: 0.1122 - acc: 0.9792 - val_loss: 0.1767 - val_acc: 0.9583
Epoch 489/500
96/96 [==============================] - 0s - loss: 0.1124 - acc: 0.9792 - val_loss: 0.1747 - val_acc: 0.9583
Epoch 490/500
96/96 [==============================] - ETA: 0s - loss: 0.0913 - acc: 1.000 - 0s - loss: 0.1117 - acc: 0.9792 - val_loss: 0.1732 - val_acc: 0.9583
Epoch 491/500
96/96 [==============================] - 0s - loss: 0.1112 - acc: 0.9792 - val_loss: 0.1739 - val_acc: 0.9583
Epoch 492/500
96/96 [==============================] - 0s - loss: 0.1108 - acc: 0.9792 - val_loss: 0.1734 - val_acc: 0.9583
Epoch 493/500
96/96 [==============================] - 0s - loss: 0.1107 - acc: 0.9792 - val_loss: 0.1734 - val_acc: 0.9583
Epoch 494/500
96/96 [==============================] - 0s - loss: 0.1102 - acc: 0.9792 - val_loss: 0.1729 - val_acc: 0.9583
Epoch 495/500
96/96 [==============================] - 0s - loss: 0.1099 - acc: 0.9792 - val_loss: 0.1732 - val_acc: 0.9583
Epoch 496/500
96/96 [==============================] - 0s - loss: 0.1099 - acc: 0.9792 - val_loss: 0.1743 - val_acc: 0.9583
Epoch 497/500
96/96 [==============================] - 0s - loss: 0.1095 - acc: 0.9792 - val_loss: 0.1746 - val_acc: 0.9583
Epoch 498/500
96/96 [==============================] - 0s - loss: 0.1091 - acc: 0.9792 - val_loss: 0.1723 - val_acc: 0.9583
Epoch 499/500
96/96 [==============================] - 0s - loss: 0.1086 - acc: 0.9792 - val_loss: 0.1720 - val_acc: 0.9583
Epoch 500/500
96/96 [==============================] - 0s - loss: 0.1082 - acc: 0.9792 - val_loss: 0.1712 - val_acc: 0.9583
CPU times: user 9.72 s, sys: 3.81 s, total: 13.5 s
Wall time: 12.4 s
Out[65]:
<keras.callbacks.History at 0x7f64c2329a90>
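model.fit returns this History object; it was not assigned to a variable here, but if it had been, the learning curves could be plotted. A hypothetical sketch, reusing the plt imported earlier:


In [ ]:
# Hypothetical: assumes the return value of model.fit was captured,
# e.g. history = model.fit(...)  (the actual fit call is in the cell above)
plt.plot(history.history['loss'], label='training loss')
plt.plot(history.history['val_loss'], label='validation loss')
plt.xlabel('epoch')
plt.ylabel('loss')
plt.legend()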

Evaluation


In [34]:
model.predict(np.array([[ 5.1,  3.5,  1.4,  0.2]]))


Out[34]:
array([[  9.75674212e-01,   2.37371735e-02,   5.88657393e-04]], dtype=float32)

In [35]:
X[0], y[0]


Out[35]:
(array([ 5.1,  3.5,  1.4,  0.2]), array([ 1.,  0.,  0.]))
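The prediction matches the one-hot label of the first sample. A minimal sketch (hypothetical cell, reusing model, iris, and np from above) of turning the predicted probabilities into a class name:


In [ ]:
# Hypothetical cell: map predicted probabilities to a class name
probabilities = model.predict(np.array([[5.1, 3.5, 1.4, 0.2]]))
predicted_class = np.argmax(probabilities, axis=1)[0]  # index of the largest probability
iris.target_names[predicted_class]  # -> 'setosa' for this sample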

In [66]:
train_loss, train_accuracy = model.evaluate(X_train, y_train)
train_loss, train_accuracy


 32/120 [=======>......................] - ETA: 0s
Out[66]:
(0.12065287530422211, 0.97499999602635701)

In [67]:
test_loss, test_accuracy = model.evaluate(X_test, y_test)
test_loss, test_accuracy


30/30 [==============================] - 0s
Out[67]:
(0.14061750471591949, 0.96666663885116577)

Save Model in Keras Format


In [38]:
# Keras format: saves architecture, weights, and optimizer state in one HDF5 file
model.save('nn-iris.hdf5')
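
To verify the saved file, the model can be loaded back with load_model from the Keras API; a minimal sketch:


In [ ]:
from keras.models import load_model

# Restores architecture, weights, and optimizer state from the HDF5 file
restored_model = load_model('nn-iris.hdf5')
restored_model.predict(np.array([[5.1, 3.5, 1.4, 0.2]]))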

One of the primary applications of Deep Learning

Convolutional Neural Networks for Image Recognition

Example of a Convolution

Original Image

Many convolutional filters applied over all channels

http://cs.stanford.edu/people/karpathy/convnetjs/demo/cifar10.html

Convolutional Blocks: Cascading many convolutional layers with downsampling in between

http://cs231n.github.io/convolutional-networks/#conv
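
A minimal Keras sketch of one such block (layer counts and sizes are illustrative, not taken from this notebook):


In [ ]:
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

# Illustrative convolutional block: two convolution layers, then
# downsampling via max pooling; all sizes are made up for the sketch
cnn = Sequential()
cnn.add(Conv2D(32, kernel_size=(3, 3), activation='relu', input_shape=(32, 32, 3)))
cnn.add(Conv2D(32, kernel_size=(3, 3), activation='relu'))
cnn.add(MaxPooling2D(pool_size=(2, 2)))   # downsampling between blocks
cnn.add(Flatten())
cnn.add(Dense(10, activation='softmax'))  # e.g. the 10 CIFAR-10 classes
cnn.summary()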

Processing Sequences using RNNs (recurrent neural networks)

  • RNNs also have backward (recurrent) connections, which carry state from one step of a sequence to the next

E.g. for text processing (a word only makes sense in the context of the words around it)

Using LSTMs (Long Short-Term Memory layers)

  • LSTMs can remember values over long input sequences

Example: Sentiment Analysis

https://transcranial.github.io/keras-js
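
A minimal Keras sketch of such a sentiment model (vocabulary size and layer dimensions are illustrative):


In [ ]:
from keras.models import Sequential
from keras.layers import Embedding, LSTM, Dense

# Illustrative LSTM for binary sentiment classification;
# vocabulary size and dimensions are made up for the sketch
rnn = Sequential()
rnn.add(Embedding(input_dim=10000, output_dim=32))  # word index -> vector
rnn.add(LSTM(32))                                   # keeps context across the sequence
rnn.add(Dense(1, activation='sigmoid'))             # positive vs. negative
rnn.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])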


In [ ]: