Introduction to Neural Networks


In [1]:
import warnings
warnings.filterwarnings('ignore')

In [2]:
%matplotlib inline
%pylab inline


Populating the interactive namespace from numpy and matplotlib

In [3]:
import matplotlib.pylab as plt
import numpy as np

In [4]:
from distutils.version import StrictVersion

In [5]:
import sklearn
print(sklearn.__version__)

assert StrictVersion(sklearn.__version__) >= StrictVersion('0.18.1')


0.18.1

In [6]:
import tensorflow as tf
tf.logging.set_verbosity(tf.logging.ERROR)
print(tf.__version__)

assert StrictVersion(tf.__version__) >= StrictVersion('1.1.0')


1.2.1

In [7]:
import keras
print(keras.__version__)

assert StrictVersion(keras.__version__) >= StrictVersion('2.0.0')


Using TensorFlow backend.
2.0.5

Iris with Neural Networks

The Artificial Neuron


Hands-On

Create a Python implementation of a neuron with two input variables and no activation function

  • Make up the values for w1, w2, and the bias
  • Can you sketch the graph of the function with x1 and x2 on the axes?
  • What kind of function is this?


In [ ]:
%load https://djcordhose.github.io/ai/fragments/neuron.py
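
The loaded fragment is not reproduced here. One possible solution, consistent with the call to neuron_no_activation(5.1, 3.5) further below, uses the same weights as the complete neuron later in this notebook:

w0 = 3   # bias
w1 = -4
w2 = 2

def neuron_no_activation(x1, x2):
    # plain weighted sum of the inputs, no activation function
    return w0 + x1 * w1 + x2 * w2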

We try our model on the Iris dataset


In [11]:
from sklearn.datasets import load_iris
iris = load_iris()
iris.data[0]


Out[11]:
array([ 5.1,  3.5,  1.4,  0.2])
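
For context: the four values of a sample are the flower measurements. Their names are part of the dataset:

iris.feature_names
# ['sepal length (cm)', 'sepal width (cm)', 'petal length (cm)', 'petal width (cm)']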

In [37]:
neuron_no_activation(5.1, 3.5)


Out[37]:
-10.399999999999999

How should we interpret this? We cannot do much with this value.

Activation Functions


In [16]:
def centerAxis(uses_negative=False):
    # http://matplotlib.org/api/pyplot_api.html#matplotlib.pyplot.plot
    ax = plt.gca()
    ax.spines['left'].set_position('center')
    if uses_negative:
        ax.spines['bottom'].set_position('center')
    ax.spines['right'].set_color('none')
    ax.spines['top'].set_color('none')
    ax.xaxis.set_ticks_position('bottom')
    ax.yaxis.set_ticks_position('left')

Sigmoid


In [18]:
def np_sigmoid(X):
    return 1 / (1 + np.exp(-X))

In [19]:
x = np.arange(-10,10,0.01)
y = np_sigmoid(x)

centerAxis()
plt.plot(x,y,lw=3)


Out[19]:
[<matplotlib.lines.Line2D at 0x7f53a21f9128>]

ReLU


In [20]:
def np_relu(x):
    return np.maximum(0, x)

In [21]:
x = np.arange(-10, 10, 0.01)
y = np_relu(x)

centerAxis()
plt.plot(x,y,lw=3)


Out[21]:
[<matplotlib.lines.Line2D at 0x7f53a2155b00>]

The Complete Neuron


In [24]:
w0 = 3   # bias
w1 = -4
w2 = 2

import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def neuron(x1, x2):
    # weighted sum of the inputs plus bias, squashed by the sigmoid
    weighted_sum = w0 + x1 * w1 + x2 * w2
    return sigmoid(weighted_sum)

In [25]:
neuron(5.1, 3.5)


Out[25]:
3.043155690056538e-05
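
The weighted sum is the same -10.4 as with neuron_no_activation above; the sigmoid squashes it to practically 0:

sigmoid(w0 + 5.1 * w1 + 3.5 * w2)   # sigmoid(-10.4), roughly 3.04e-05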

Our First Neural Network with Keras


In [26]:
from keras.layers import Input
inputs = Input(shape=(4, ))

In [27]:
from keras.layers import Dense
fc = Dense(3)(inputs)

In [28]:
from keras.models import Model
model = Model(inputs=inputs, outputs=fc)

In [29]:
model.summary()


_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_1 (InputLayer)         (None, 4)                 0         
_________________________________________________________________
dense_1 (Dense)              (None, 3)                 15        
=================================================================
Total params: 15
Trainable params: 15
Non-trainable params: 0
_________________________________________________________________
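
The 15 parameters are simply one weight per input/unit pair plus one bias per unit:

4 * 3 + 3   # weights + biases of dense_1 = 15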

In [30]:
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

In [31]:
model.predict(np.array([[ 5.1,  3.5,  1.4,  0.2]]))


Out[31]:
array([[ 3.23584461, -1.13255548, -3.45655632]], dtype=float32)


In [32]:
inputs = Input(shape=(4, ))
fc = Dense(3)(inputs)
predictions = Dense(3, activation='softmax')(fc)
model = Model(inputs=inputs, outputs=predictions)

In [33]:
model.summary()


_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_2 (InputLayer)         (None, 4)                 0         
_________________________________________________________________
dense_2 (Dense)              (None, 3)                 15        
_________________________________________________________________
dense_3 (Dense)              (None, 3)                 12        
=================================================================
Total params: 27
Trainable params: 27
Non-trainable params: 0
_________________________________________________________________

In [34]:
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

In [35]:
model.predict(np.array([[ 5.1,  3.5,  1.4,  0.2]]))


Out[35]:
array([[ 0.09841966,  0.87404394,  0.02753644]], dtype=float32)
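
Unlike the raw outputs above, these values sum to 1 and can be read as (still untrained) class probabilities:

model.predict(np.array([[5.1, 3.5, 1.4, 0.2]])).sum()   # roughly 1.0, thanks to the softmax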

Training


In [36]:
X = np.array(iris.data)
y = np.array(iris.target)
X.shape, y.shape


Out[36]:
((150, 4), (150,))

In [39]:
y[100]


Out[39]:
2
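
The label 2 is just a class index; the matching species name is part of the dataset:

iris.target_names[y[100]]   # 'virginica'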

In [40]:
# a tiny little piece of feature engineering: one-hot encode the labels
from keras.utils.np_utils import to_categorical

num_categories = 3

y = to_categorical(y, num_categories)

In [41]:
y[100]


Out[41]:
array([ 0.,  0.,  1.])
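
The same one-hot encoding could also be written directly in NumPy:

np.eye(num_categories)[iris.target[100]]   # array([ 0.,  0.,  1.])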

In [42]:
from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.20, random_state=42, stratify=y)

In [43]:
X_train.shape, X_test.shape, y_train.shape, y_test.shape


Out[43]:
((120, 4), (30, 4), (120, 3), (30, 3))

In [ ]:
# !rm -r tf_log
# tb_callback = keras.callbacks.TensorBoard(log_dir='./tf_log')

# https://keras.io/callbacks/#tensorboard
# To start tensorboard
# tensorboard --logdir=/mnt/c/Users/olive/Development/ml/tf_log
# open http://localhost:6006

In [44]:
# %time model.fit(X_train, y_train, epochs=500, validation_split=0.3, callbacks=[tb_callback])
%time model.fit(X_train, y_train, epochs=500, validation_split=0.3)


Train on 84 samples, validate on 36 samples
Epoch 1/500
84/84 [==============================] - 0s - loss: 1.6569 - acc: 0.3690 - val_loss: 2.0478 - val_acc: 0.2222
Epoch 2/500
84/84 [==============================] - 0s - loss: 1.6029 - acc: 0.3690 - val_loss: 1.9716 - val_acc: 0.2222
Epoch 3/500
84/84 [==============================] - 0s - loss: 1.5480 - acc: 0.3690 - val_loss: 1.8985 - val_acc: 0.2222
Epoch 4/500
84/84 [==============================] - 0s - loss: 1.4995 - acc: 0.3690 - val_loss: 1.8268 - val_acc: 0.2222
Epoch 5/500
84/84 [==============================] - 0s - loss: 1.4511 - acc: 0.3690 - val_loss: 1.7574 - val_acc: 0.2222
Epoch 6/500
84/84 [==============================] - 0s - loss: 1.4036 - acc: 0.3690 - val_loss: 1.6917 - val_acc: 0.2222
Epoch 7/500
84/84 [==============================] - 0s - loss: 1.3624 - acc: 0.3690 - val_loss: 1.6280 - val_acc: 0.2222
Epoch 8/500
84/84 [==============================] - 0s - loss: 1.3182 - acc: 0.3690 - val_loss: 1.5681 - val_acc: 0.2222
Epoch 9/500
84/84 [==============================] - 0s - loss: 1.2816 - acc: 0.3690 - val_loss: 1.5108 - val_acc: 0.2222
Epoch 10/500
84/84 [==============================] - 0s - loss: 1.2443 - acc: 0.3690 - val_loss: 1.4566 - val_acc: 0.2222
Epoch 11/500
84/84 [==============================] - 0s - loss: 1.2100 - acc: 0.3690 - val_loss: 1.4059 - val_acc: 0.2222
Epoch 12/500
84/84 [==============================] - 0s - loss: 1.1789 - acc: 0.3690 - val_loss: 1.3584 - val_acc: 0.2222
Epoch 13/500
84/84 [==============================] - 0s - loss: 1.1506 - acc: 0.3690 - val_loss: 1.3140 - val_acc: 0.2222
Epoch 14/500
84/84 [==============================] - 0s - loss: 1.1242 - acc: 0.3690 - val_loss: 1.2732 - val_acc: 0.2222
Epoch 15/500
84/84 [==============================] - 0s - loss: 1.1006 - acc: 0.3690 - val_loss: 1.2355 - val_acc: 0.2222
Epoch 16/500
84/84 [==============================] - 0s - loss: 1.0785 - acc: 0.3690 - val_loss: 1.2008 - val_acc: 0.2222
Epoch 17/500
84/84 [==============================] - 0s - loss: 1.0588 - acc: 0.3690 - val_loss: 1.1691 - val_acc: 0.2222
Epoch 18/500
84/84 [==============================] - 0s - loss: 1.0415 - acc: 0.3690 - val_loss: 1.1396 - val_acc: 0.2222
Epoch 19/500
84/84 [==============================] - 0s - loss: 1.0253 - acc: 0.3810 - val_loss: 1.1126 - val_acc: 0.2222
Epoch 20/500
84/84 [==============================] - 0s - loss: 1.0110 - acc: 0.3810 - val_loss: 1.0874 - val_acc: 0.2222
Epoch 21/500
84/84 [==============================] - 0s - loss: 0.9981 - acc: 0.3810 - val_loss: 1.0642 - val_acc: 0.2222
Epoch 22/500
84/84 [==============================] - 0s - loss: 0.9863 - acc: 0.3810 - val_loss: 1.0435 - val_acc: 0.2222
Epoch 23/500
84/84 [==============================] - 0s - loss: 0.9769 - acc: 0.3929 - val_loss: 1.0243 - val_acc: 0.2778
Epoch 24/500
84/84 [==============================] - 0s - loss: 0.9663 - acc: 0.4524 - val_loss: 1.0079 - val_acc: 0.3333
Epoch 25/500
84/84 [==============================] - 0s - loss: 0.9581 - acc: 0.5000 - val_loss: 0.9925 - val_acc: 0.3611
Epoch 26/500
84/84 [==============================] - 0s - loss: 0.9502 - acc: 0.5238 - val_loss: 0.9785 - val_acc: 0.4722
Epoch 27/500
84/84 [==============================] - 0s - loss: 0.9430 - acc: 0.5714 - val_loss: 0.9659 - val_acc: 0.4444
Epoch 28/500
84/84 [==============================] - 0s - loss: 0.9357 - acc: 0.5714 - val_loss: 0.9549 - val_acc: 0.4444
Epoch 29/500
84/84 [==============================] - 0s - loss: 0.9297 - acc: 0.5952 - val_loss: 0.9443 - val_acc: 0.5278
Epoch 30/500
84/84 [==============================] - 0s - loss: 0.9232 - acc: 0.5952 - val_loss: 0.9350 - val_acc: 0.5556
Epoch 31/500
84/84 [==============================] - 0s - loss: 0.9180 - acc: 0.5714 - val_loss: 0.9254 - val_acc: 0.5556
Epoch 32/500
84/84 [==============================] - 0s - loss: 0.9120 - acc: 0.5714 - val_loss: 0.9171 - val_acc: 0.5556
Epoch 33/500
84/84 [==============================] - 0s - loss: 0.9065 - acc: 0.5714 - val_loss: 0.9093 - val_acc: 0.5556
Epoch 34/500
84/84 [==============================] - 0s - loss: 0.9014 - acc: 0.5833 - val_loss: 0.9013 - val_acc: 0.5833
Epoch 35/500
84/84 [==============================] - 0s - loss: 0.8959 - acc: 0.5833 - val_loss: 0.8944 - val_acc: 0.5833
Epoch 36/500
84/84 [==============================] - 0s - loss: 0.8909 - acc: 0.5833 - val_loss: 0.8872 - val_acc: 0.5833
Epoch 37/500
84/84 [==============================] - ETA: 0s - loss: 0.9152 - acc: 0.500 - 0s - loss: 0.8862 - acc: 0.5833 - val_loss: 0.8806 - val_acc: 0.5556
Epoch 38/500
84/84 [==============================] - ETA: 0s - loss: 0.8614 - acc: 0.593 - 0s - loss: 0.8805 - acc: 0.5714 - val_loss: 0.8749 - val_acc: 0.5833
Epoch 39/500
84/84 [==============================] - 0s - loss: 0.8759 - acc: 0.5714 - val_loss: 0.8693 - val_acc: 0.5833
Epoch 40/500
84/84 [==============================] - 0s - loss: 0.8711 - acc: 0.5714 - val_loss: 0.8642 - val_acc: 0.5833
Epoch 41/500
84/84 [==============================] - 0s - loss: 0.8661 - acc: 0.5714 - val_loss: 0.8591 - val_acc: 0.5833
Epoch 42/500
84/84 [==============================] - 0s - loss: 0.8613 - acc: 0.5714 - val_loss: 0.8541 - val_acc: 0.5833
Epoch 43/500
84/84 [==============================] - 0s - loss: 0.8565 - acc: 0.5714 - val_loss: 0.8492 - val_acc: 0.5833
Epoch 44/500
84/84 [==============================] - 0s - loss: 0.8519 - acc: 0.5714 - val_loss: 0.8452 - val_acc: 0.5833
Epoch 45/500
84/84 [==============================] - 0s - loss: 0.8469 - acc: 0.5714 - val_loss: 0.8405 - val_acc: 0.5833
Epoch 46/500
84/84 [==============================] - 0s - loss: 0.8424 - acc: 0.5714 - val_loss: 0.8359 - val_acc: 0.5833
Epoch 47/500
84/84 [==============================] - 0s - loss: 0.8372 - acc: 0.5714 - val_loss: 0.8319 - val_acc: 0.5833
Epoch 48/500
84/84 [==============================] - 0s - loss: 0.8326 - acc: 0.5714 - val_loss: 0.8278 - val_acc: 0.6111
Epoch 49/500
84/84 [==============================] - 0s - loss: 0.8280 - acc: 0.5833 - val_loss: 0.8230 - val_acc: 0.6111
Epoch 50/500
84/84 [==============================] - ETA: 0s - loss: 0.8040 - acc: 0.687 - 0s - loss: 0.8232 - acc: 0.5833 - val_loss: 0.8189 - val_acc: 0.5833
Epoch 51/500
84/84 [==============================] - 0s - loss: 0.8188 - acc: 0.5833 - val_loss: 0.8142 - val_acc: 0.5833
Epoch 52/500
84/84 [==============================] - 0s - loss: 0.8140 - acc: 0.5833 - val_loss: 0.8090 - val_acc: 0.6111
Epoch 53/500
84/84 [==============================] - 0s - loss: 0.8092 - acc: 0.5833 - val_loss: 0.8046 - val_acc: 0.6111
Epoch 54/500
84/84 [==============================] - 0s - loss: 0.8050 - acc: 0.5833 - val_loss: 0.8002 - val_acc: 0.6111
Epoch 55/500
84/84 [==============================] - 0s - loss: 0.8002 - acc: 0.5833 - val_loss: 0.7964 - val_acc: 0.5833
Epoch 56/500
84/84 [==============================] - ETA: 0s - loss: 0.7841 - acc: 0.687 - 0s - loss: 0.7956 - acc: 0.5833 - val_loss: 0.7922 - val_acc: 0.6111
Epoch 57/500
84/84 [==============================] - 0s - loss: 0.7914 - acc: 0.5952 - val_loss: 0.7877 - val_acc: 0.6111
Epoch 58/500
84/84 [==============================] - 0s - loss: 0.7868 - acc: 0.6190 - val_loss: 0.7821 - val_acc: 0.6389
Epoch 59/500
84/84 [==============================] - 0s - loss: 0.7823 - acc: 0.6310 - val_loss: 0.7772 - val_acc: 0.6944
Epoch 60/500
84/84 [==============================] - 0s - loss: 0.7779 - acc: 0.6667 - val_loss: 0.7729 - val_acc: 0.7222
Epoch 61/500
84/84 [==============================] - 0s - loss: 0.7738 - acc: 0.7024 - val_loss: 0.7690 - val_acc: 0.7778
Epoch 62/500
84/84 [==============================] - 0s - loss: 0.7693 - acc: 0.7381 - val_loss: 0.7645 - val_acc: 0.8333
Epoch 63/500
84/84 [==============================] - ETA: 0s - loss: 0.7676 - acc: 0.843 - 0s - loss: 0.7651 - acc: 0.7619 - val_loss: 0.7601 - val_acc: 0.8333
Epoch 64/500
84/84 [==============================] - ETA: 0s - loss: 0.7493 - acc: 0.843 - 0s - loss: 0.7609 - acc: 0.7976 - val_loss: 0.7552 - val_acc: 0.8333
Epoch 65/500
84/84 [==============================] - 0s - loss: 0.7566 - acc: 0.8214 - val_loss: 0.7508 - val_acc: 0.8333
Epoch 66/500
84/84 [==============================] - 0s - loss: 0.7527 - acc: 0.8214 - val_loss: 0.7460 - val_acc: 0.8056
Epoch 67/500
84/84 [==============================] - 0s - loss: 0.7485 - acc: 0.8214 - val_loss: 0.7420 - val_acc: 0.8333
Epoch 68/500
84/84 [==============================] - 0s - loss: 0.7442 - acc: 0.8571 - val_loss: 0.7377 - val_acc: 0.8611
Epoch 69/500
84/84 [==============================] - 0s - loss: 0.7405 - acc: 0.8571 - val_loss: 0.7332 - val_acc: 0.8611
Epoch 70/500
84/84 [==============================] - 0s - loss: 0.7361 - acc: 0.8690 - val_loss: 0.7294 - val_acc: 0.8611
Epoch 71/500
84/84 [==============================] - 0s - loss: 0.7323 - acc: 0.8690 - val_loss: 0.7254 - val_acc: 0.8889
Epoch 72/500
84/84 [==============================] - 0s - loss: 0.7281 - acc: 0.8929 - val_loss: 0.7222 - val_acc: 0.9167
Epoch 73/500
84/84 [==============================] - 0s - loss: 0.7244 - acc: 0.8929 - val_loss: 0.7195 - val_acc: 0.9167
Epoch 74/500
84/84 [==============================] - 0s - loss: 0.7202 - acc: 0.9048 - val_loss: 0.7158 - val_acc: 0.9167
Epoch 75/500
84/84 [==============================] - ETA: 0s - loss: 0.7205 - acc: 0.906 - 0s - loss: 0.7165 - acc: 0.9048 - val_loss: 0.7121 - val_acc: 0.9167
Epoch 76/500
84/84 [==============================] - 0s - loss: 0.7125 - acc: 0.9048 - val_loss: 0.7077 - val_acc: 0.9167
Epoch 77/500
84/84 [==============================] - 0s - loss: 0.7086 - acc: 0.9048 - val_loss: 0.7041 - val_acc: 0.9167
Epoch 78/500
84/84 [==============================] - ETA: 0s - loss: 0.6891 - acc: 0.937 - 0s - loss: 0.7050 - acc: 0.9048 - val_loss: 0.7005 - val_acc: 0.9167
Epoch 79/500
84/84 [==============================] - 0s - loss: 0.7010 - acc: 0.9048 - val_loss: 0.6976 - val_acc: 0.9167
Epoch 80/500
84/84 [==============================] - 0s - loss: 0.6973 - acc: 0.9048 - val_loss: 0.6949 - val_acc: 0.9167
Epoch 81/500
84/84 [==============================] - 0s - loss: 0.6935 - acc: 0.9167 - val_loss: 0.6920 - val_acc: 0.9167
Epoch 82/500
84/84 [==============================] - 0s - loss: 0.6899 - acc: 0.9167 - val_loss: 0.6890 - val_acc: 0.9167
Epoch 83/500
84/84 [==============================] - 0s - loss: 0.6863 - acc: 0.9167 - val_loss: 0.6852 - val_acc: 0.9167
Epoch 84/500
84/84 [==============================] - ETA: 0s - loss: 0.6639 - acc: 0.937 - 0s - loss: 0.6824 - acc: 0.9167 - val_loss: 0.6819 - val_acc: 0.9167
Epoch 85/500
84/84 [==============================] - 0s - loss: 0.6788 - acc: 0.9167 - val_loss: 0.6786 - val_acc: 0.9167
Epoch 86/500
84/84 [==============================] - 0s - loss: 0.6752 - acc: 0.9167 - val_loss: 0.6753 - val_acc: 0.9167
Epoch 87/500
84/84 [==============================] - 0s - loss: 0.6717 - acc: 0.9167 - val_loss: 0.6722 - val_acc: 0.9167
Epoch 88/500
84/84 [==============================] - 0s - loss: 0.6685 - acc: 0.9167 - val_loss: 0.6679 - val_acc: 0.9167
Epoch 89/500
84/84 [==============================] - 0s - loss: 0.6648 - acc: 0.9167 - val_loss: 0.6638 - val_acc: 0.9167
Epoch 90/500
84/84 [==============================] - ETA: 0s - loss: 0.6654 - acc: 0.906 - 0s - loss: 0.6611 - acc: 0.9286 - val_loss: 0.6608 - val_acc: 0.9167
Epoch 91/500
84/84 [==============================] - 0s - loss: 0.6576 - acc: 0.9286 - val_loss: 0.6581 - val_acc: 0.9167
Epoch 92/500
84/84 [==============================] - 0s - loss: 0.6541 - acc: 0.9286 - val_loss: 0.6550 - val_acc: 0.9167
Epoch 93/500
84/84 [==============================] - 0s - loss: 0.6508 - acc: 0.9286 - val_loss: 0.6513 - val_acc: 0.9167
Epoch 94/500
84/84 [==============================] - 0s - loss: 0.6473 - acc: 0.9286 - val_loss: 0.6476 - val_acc: 0.9167
Epoch 95/500
84/84 [==============================] - 0s - loss: 0.6440 - acc: 0.9286 - val_loss: 0.6438 - val_acc: 0.9167
Epoch 96/500
84/84 [==============================] - 0s - loss: 0.6407 - acc: 0.9286 - val_loss: 0.6397 - val_acc: 0.9444
Epoch 97/500
84/84 [==============================] - 0s - loss: 0.6373 - acc: 0.9286 - val_loss: 0.6366 - val_acc: 0.9444
Epoch 98/500
84/84 [==============================] - ETA: 0s - loss: 0.6078 - acc: 0.937 - 0s - loss: 0.6341 - acc: 0.9286 - val_loss: 0.6332 - val_acc: 0.9444
Epoch 99/500
84/84 [==============================] - 0s - loss: 0.6308 - acc: 0.9286 - val_loss: 0.6310 - val_acc: 0.9167
Epoch 100/500
84/84 [==============================] - ETA: 0s - loss: 0.6982 - acc: 0.843 - 0s - loss: 0.6279 - acc: 0.9286 - val_loss: 0.6290 - val_acc: 0.9167
Epoch 101/500
84/84 [==============================] - 0s - loss: 0.6243 - acc: 0.9286 - val_loss: 0.6255 - val_acc: 0.9167
Epoch 102/500
84/84 [==============================] - 0s - loss: 0.6213 - acc: 0.9286 - val_loss: 0.6224 - val_acc: 0.9167
Epoch 103/500
84/84 [==============================] - ETA: 0s - loss: 0.6260 - acc: 0.906 - 0s - loss: 0.6182 - acc: 0.9286 - val_loss: 0.6202 - val_acc: 0.9167
Epoch 104/500
84/84 [==============================] - 0s - loss: 0.6150 - acc: 0.9286 - val_loss: 0.6168 - val_acc: 0.9167
Epoch 105/500
84/84 [==============================] - 0s - loss: 0.6119 - acc: 0.9286 - val_loss: 0.6136 - val_acc: 0.9167
Epoch 106/500
84/84 [==============================] - 0s - loss: 0.6087 - acc: 0.9286 - val_loss: 0.6107 - val_acc: 0.9167
Epoch 107/500
84/84 [==============================] - 0s - loss: 0.6057 - acc: 0.9286 - val_loss: 0.6081 - val_acc: 0.9167
Epoch 108/500
84/84 [==============================] - 0s - loss: 0.6028 - acc: 0.9286 - val_loss: 0.6054 - val_acc: 0.9167
Epoch 109/500
84/84 [==============================] - ETA: 0s - loss: 0.6026 - acc: 0.875 - 0s - loss: 0.5999 - acc: 0.9286 - val_loss: 0.6025 - val_acc: 0.9167
Epoch 110/500
84/84 [==============================] - 0s - loss: 0.5967 - acc: 0.9286 - val_loss: 0.5990 - val_acc: 0.9167
Epoch 111/500
84/84 [==============================] - 0s - loss: 0.5939 - acc: 0.9286 - val_loss: 0.5950 - val_acc: 0.9444
Epoch 112/500
84/84 [==============================] - 0s - loss: 0.5910 - acc: 0.9286 - val_loss: 0.5920 - val_acc: 0.9444
Epoch 113/500
84/84 [==============================] - 0s - loss: 0.5880 - acc: 0.9286 - val_loss: 0.5890 - val_acc: 0.9444
Epoch 114/500
84/84 [==============================] - 0s - loss: 0.5851 - acc: 0.9286 - val_loss: 0.5864 - val_acc: 0.9444
Epoch 115/500
84/84 [==============================] - 0s - loss: 0.5823 - acc: 0.9286 - val_loss: 0.5833 - val_acc: 0.9444
Epoch 116/500
84/84 [==============================] - 0s - loss: 0.5795 - acc: 0.9286 - val_loss: 0.5803 - val_acc: 0.9444
Epoch 117/500
84/84 [==============================] - 0s - loss: 0.5769 - acc: 0.9286 - val_loss: 0.5770 - val_acc: 0.9444
Epoch 118/500
84/84 [==============================] - 0s - loss: 0.5740 - acc: 0.9286 - val_loss: 0.5745 - val_acc: 0.9444
Epoch 119/500
84/84 [==============================] - ETA: 0s - loss: 0.5774 - acc: 0.968 - 0s - loss: 0.5713 - acc: 0.9286 - val_loss: 0.5725 - val_acc: 0.9444
Epoch 120/500
84/84 [==============================] - 0s - loss: 0.5686 - acc: 0.9286 - val_loss: 0.5703 - val_acc: 0.9444
Epoch 121/500
84/84 [==============================] - ETA: 0s - loss: 0.5552 - acc: 0.906 - 0s - loss: 0.5659 - acc: 0.9286 - val_loss: 0.5680 - val_acc: 0.9444
Epoch 122/500
84/84 [==============================] - 0s - loss: 0.5634 - acc: 0.9286 - val_loss: 0.5661 - val_acc: 0.9444
Epoch 123/500
84/84 [==============================] - 0s - loss: 0.5606 - acc: 0.9286 - val_loss: 0.5631 - val_acc: 0.9444
Epoch 124/500
84/84 [==============================] - 0s - loss: 0.5580 - acc: 0.9286 - val_loss: 0.5609 - val_acc: 0.9444
Epoch 125/500
84/84 [==============================] - ETA: 0s - loss: 0.4979 - acc: 1.000 - 0s - loss: 0.5555 - acc: 0.9286 - val_loss: 0.5585 - val_acc: 0.9444
Epoch 126/500
84/84 [==============================] - 0s - loss: 0.5529 - acc: 0.9286 - val_loss: 0.5569 - val_acc: 0.9444
Epoch 127/500
84/84 [==============================] - ETA: 0s - loss: 0.5301 - acc: 0.968 - 0s - loss: 0.5502 - acc: 0.9286 - val_loss: 0.5547 - val_acc: 0.9444
Epoch 128/500
84/84 [==============================] - 0s - loss: 0.5478 - acc: 0.9286 - val_loss: 0.5531 - val_acc: 0.9167
Epoch 129/500
84/84 [==============================] - ETA: 0s - loss: 0.5464 - acc: 0.937 - 0s - loss: 0.5453 - acc: 0.9286 - val_loss: 0.5515 - val_acc: 0.9167
Epoch 130/500
84/84 [==============================] - ETA: 0s - loss: 0.5672 - acc: 0.906 - 0s - loss: 0.5427 - acc: 0.9286 - val_loss: 0.5501 - val_acc: 0.9167
Epoch 131/500
84/84 [==============================] - ETA: 0s - loss: 0.5083 - acc: 0.937 - 0s - loss: 0.5403 - acc: 0.9286 - val_loss: 0.5487 - val_acc: 0.9167
Epoch 132/500
84/84 [==============================] - 0s - loss: 0.5379 - acc: 0.9524 - val_loss: 0.5473 - val_acc: 0.9167
Epoch 133/500
84/84 [==============================] - 0s - loss: 0.5355 - acc: 0.9524 - val_loss: 0.5452 - val_acc: 0.9167
Epoch 134/500
84/84 [==============================] - 0s - loss: 0.5331 - acc: 0.9524 - val_loss: 0.5430 - val_acc: 0.9167
Epoch 135/500
84/84 [==============================] - 0s - loss: 0.5307 - acc: 0.9524 - val_loss: 0.5412 - val_acc: 0.9167
Epoch 136/500
84/84 [==============================] - 0s - loss: 0.5285 - acc: 0.9524 - val_loss: 0.5390 - val_acc: 0.9167
Epoch 137/500
84/84 [==============================] - 0s - loss: 0.5260 - acc: 0.9524 - val_loss: 0.5366 - val_acc: 0.9167
Epoch 138/500
84/84 [==============================] - 0s - loss: 0.5238 - acc: 0.9524 - val_loss: 0.5344 - val_acc: 0.9167
Epoch 139/500
84/84 [==============================] - 0s - loss: 0.5215 - acc: 0.9524 - val_loss: 0.5316 - val_acc: 0.9167
Epoch 140/500
84/84 [==============================] - 0s - loss: 0.5192 - acc: 0.9524 - val_loss: 0.5295 - val_acc: 0.9167
Epoch 141/500
84/84 [==============================] - 0s - loss: 0.5170 - acc: 0.9524 - val_loss: 0.5268 - val_acc: 0.9167
Epoch 142/500
84/84 [==============================] - 0s - loss: 0.5147 - acc: 0.9524 - val_loss: 0.5242 - val_acc: 0.9167
Epoch 143/500
84/84 [==============================] - 0s - loss: 0.5125 - acc: 0.9524 - val_loss: 0.5220 - val_acc: 0.9167
Epoch 144/500
84/84 [==============================] - 0s - loss: 0.5103 - acc: 0.9524 - val_loss: 0.5195 - val_acc: 0.9167
Epoch 145/500
84/84 [==============================] - 0s - loss: 0.5081 - acc: 0.9524 - val_loss: 0.5167 - val_acc: 0.9444
Epoch 146/500
84/84 [==============================] - ETA: 0s - loss: 0.5039 - acc: 1.000 - 0s - loss: 0.5060 - acc: 0.9524 - val_loss: 0.5143 - val_acc: 0.9444
Epoch 147/500
84/84 [==============================] - 0s - loss: 0.5038 - acc: 0.9524 - val_loss: 0.5120 - val_acc: 0.9444
Epoch 148/500
84/84 [==============================] - 0s - loss: 0.5018 - acc: 0.9286 - val_loss: 0.5098 - val_acc: 0.9444
Epoch 149/500
84/84 [==============================] - 0s - loss: 0.4996 - acc: 0.9524 - val_loss: 0.5088 - val_acc: 0.9444
Epoch 150/500
84/84 [==============================] - 0s - loss: 0.4975 - acc: 0.9524 - val_loss: 0.5072 - val_acc: 0.9444
Epoch 151/500
84/84 [==============================] - 0s - loss: 0.4953 - acc: 0.9524 - val_loss: 0.5058 - val_acc: 0.9167
Epoch 152/500
84/84 [==============================] - 0s - loss: 0.4934 - acc: 0.9524 - val_loss: 0.5044 - val_acc: 0.9167
Epoch 153/500
84/84 [==============================] - 0s - loss: 0.4913 - acc: 0.9524 - val_loss: 0.5025 - val_acc: 0.9167
Epoch 154/500
84/84 [==============================] - 0s - loss: 0.4893 - acc: 0.9524 - val_loss: 0.5010 - val_acc: 0.9167
Epoch 155/500
84/84 [==============================] - 0s - loss: 0.4873 - acc: 0.9524 - val_loss: 0.4996 - val_acc: 0.9167
Epoch 156/500
84/84 [==============================] - 0s - loss: 0.4853 - acc: 0.9524 - val_loss: 0.4975 - val_acc: 0.9167
Epoch 157/500
84/84 [==============================] - 0s - loss: 0.4834 - acc: 0.9524 - val_loss: 0.4961 - val_acc: 0.9167
Epoch 158/500
84/84 [==============================] - 0s - loss: 0.4815 - acc: 0.9524 - val_loss: 0.4947 - val_acc: 0.9167
Epoch 159/500
84/84 [==============================] - 0s - loss: 0.4795 - acc: 0.9524 - val_loss: 0.4924 - val_acc: 0.9167
Epoch 160/500
84/84 [==============================] - 0s - loss: 0.4776 - acc: 0.9524 - val_loss: 0.4904 - val_acc: 0.9167
Epoch 161/500
84/84 [==============================] - 0s - loss: 0.4757 - acc: 0.9524 - val_loss: 0.4885 - val_acc: 0.9167
Epoch 162/500
84/84 [==============================] - 0s - loss: 0.4738 - acc: 0.9524 - val_loss: 0.4874 - val_acc: 0.9167
Epoch 163/500
84/84 [==============================] - 0s - loss: 0.4719 - acc: 0.9524 - val_loss: 0.4857 - val_acc: 0.9167
Epoch 164/500
84/84 [==============================] - ETA: 0s - loss: 0.4623 - acc: 0.968 - 0s - loss: 0.4702 - acc: 0.9643 - val_loss: 0.4839 - val_acc: 0.9167
Epoch 165/500
84/84 [==============================] - 0s - loss: 0.4683 - acc: 0.9762 - val_loss: 0.4824 - val_acc: 0.9167
Epoch 166/500
84/84 [==============================] - 0s - loss: 0.4668 - acc: 0.9524 - val_loss: 0.4793 - val_acc: 0.9444
Epoch 167/500
84/84 [==============================] - 0s - loss: 0.4648 - acc: 0.9762 - val_loss: 0.4783 - val_acc: 0.9167
Epoch 168/500
84/84 [==============================] - 0s - loss: 0.4628 - acc: 0.9762 - val_loss: 0.4767 - val_acc: 0.9444
Epoch 169/500
84/84 [==============================] - 0s - loss: 0.4611 - acc: 0.9762 - val_loss: 0.4753 - val_acc: 0.9167
Epoch 170/500
84/84 [==============================] - 0s - loss: 0.4593 - acc: 0.9762 - val_loss: 0.4733 - val_acc: 0.9444
Epoch 171/500
84/84 [==============================] - 0s - loss: 0.4579 - acc: 0.9524 - val_loss: 0.4710 - val_acc: 0.9444
Epoch 172/500
84/84 [==============================] - 0s - loss: 0.4558 - acc: 0.9762 - val_loss: 0.4706 - val_acc: 0.9444
Epoch 173/500
84/84 [==============================] - 0s - loss: 0.4541 - acc: 0.9762 - val_loss: 0.4699 - val_acc: 0.9167
Epoch 174/500
84/84 [==============================] - 0s - loss: 0.4525 - acc: 0.9762 - val_loss: 0.4692 - val_acc: 0.9167
Epoch 175/500
84/84 [==============================] - 0s - loss: 0.4507 - acc: 0.9762 - val_loss: 0.4682 - val_acc: 0.9167
Epoch 176/500
84/84 [==============================] - 0s - loss: 0.4492 - acc: 0.9762 - val_loss: 0.4670 - val_acc: 0.9167
Epoch 177/500
84/84 [==============================] - 0s - loss: 0.4475 - acc: 0.9762 - val_loss: 0.4667 - val_acc: 0.9167
Epoch 178/500
84/84 [==============================] - 0s - loss: 0.4458 - acc: 0.9762 - val_loss: 0.4655 - val_acc: 0.9167
Epoch 179/500
84/84 [==============================] - 0s - loss: 0.4442 - acc: 0.9762 - val_loss: 0.4638 - val_acc: 0.9167
Epoch 180/500
84/84 [==============================] - 0s - loss: 0.4425 - acc: 0.9762 - val_loss: 0.4616 - val_acc: 0.9167
Epoch 181/500
84/84 [==============================] - 0s - loss: 0.4409 - acc: 0.9762 - val_loss: 0.4597 - val_acc: 0.9167
Epoch 182/500
84/84 [==============================] - ETA: 0s - loss: 0.4329 - acc: 0.968 - 0s - loss: 0.4393 - acc: 0.9762 - val_loss: 0.4578 - val_acc: 0.9167
Epoch 183/500
84/84 [==============================] - 0s - loss: 0.4377 - acc: 0.9762 - val_loss: 0.4565 - val_acc: 0.9167
Epoch 184/500
84/84 [==============================] - ETA: 0s - loss: 0.4332 - acc: 0.968 - 0s - loss: 0.4361 - acc: 0.9762 - val_loss: 0.4552 - val_acc: 0.9167
Epoch 185/500
84/84 [==============================] - 0s - loss: 0.4347 - acc: 0.9762 - val_loss: 0.4535 - val_acc: 0.9167
Epoch 186/500
84/84 [==============================] - 0s - loss: 0.4330 - acc: 0.9762 - val_loss: 0.4532 - val_acc: 0.9167
Epoch 187/500
84/84 [==============================] - 0s - loss: 0.4315 - acc: 0.9762 - val_loss: 0.4526 - val_acc: 0.9167
Epoch 188/500
84/84 [==============================] - 0s - loss: 0.4300 - acc: 0.9762 - val_loss: 0.4512 - val_acc: 0.9167
Epoch 189/500
84/84 [==============================] - 0s - loss: 0.4284 - acc: 0.9762 - val_loss: 0.4494 - val_acc: 0.9167
Epoch 190/500
84/84 [==============================] - 0s - loss: 0.4269 - acc: 0.9762 - val_loss: 0.4479 - val_acc: 0.9167
Epoch 191/500
84/84 [==============================] - 0s - loss: 0.4254 - acc: 0.9762 - val_loss: 0.4458 - val_acc: 0.9167
Epoch 192/500
84/84 [==============================] - 0s - loss: 0.4240 - acc: 0.9762 - val_loss: 0.4429 - val_acc: 0.9444
Epoch 193/500
84/84 [==============================] - 0s - loss: 0.4225 - acc: 0.9762 - val_loss: 0.4404 - val_acc: 0.9444
Epoch 194/500
84/84 [==============================] - 0s - loss: 0.4210 - acc: 0.9881 - val_loss: 0.4393 - val_acc: 0.9444
Epoch 195/500
84/84 [==============================] - 0s - loss: 0.4195 - acc: 0.9881 - val_loss: 0.4377 - val_acc: 0.9444
Epoch 196/500
84/84 [==============================] - 0s - loss: 0.4180 - acc: 0.9881 - val_loss: 0.4365 - val_acc: 0.9444
Epoch 197/500
84/84 [==============================] - 0s - loss: 0.4166 - acc: 0.9881 - val_loss: 0.4352 - val_acc: 0.9444
Epoch 198/500
84/84 [==============================] - 0s - loss: 0.4151 - acc: 0.9881 - val_loss: 0.4341 - val_acc: 0.9444
Epoch 199/500
84/84 [==============================] - 0s - loss: 0.4138 - acc: 0.9881 - val_loss: 0.4340 - val_acc: 0.9444
Epoch 200/500
84/84 [==============================] - 0s - loss: 0.4123 - acc: 0.9762 - val_loss: 0.4325 - val_acc: 0.9444
Epoch 201/500
84/84 [==============================] - 0s - loss: 0.4109 - acc: 0.9881 - val_loss: 0.4319 - val_acc: 0.9444
Epoch 202/500
84/84 [==============================] - 0s - loss: 0.4100 - acc: 0.9762 - val_loss: 0.4320 - val_acc: 0.9167
Epoch 203/500
84/84 [==============================] - 0s - loss: 0.4081 - acc: 0.9762 - val_loss: 0.4302 - val_acc: 0.9167
Epoch 204/500
84/84 [==============================] - 0s - loss: 0.4067 - acc: 0.9762 - val_loss: 0.4281 - val_acc: 0.9444
Epoch 205/500
84/84 [==============================] - 0s - loss: 0.4054 - acc: 0.9881 - val_loss: 0.4258 - val_acc: 0.9444
Epoch 206/500
84/84 [==============================] - 0s - loss: 0.4041 - acc: 0.9881 - val_loss: 0.4240 - val_acc: 0.9444
Epoch 207/500
84/84 [==============================] - 0s - loss: 0.4027 - acc: 0.9881 - val_loss: 0.4237 - val_acc: 0.9444
Epoch 208/500
84/84 [==============================] - 0s - loss: 0.4015 - acc: 0.9881 - val_loss: 0.4232 - val_acc: 0.9444
Epoch 209/500
84/84 [==============================] - 0s - loss: 0.3999 - acc: 0.9881 - val_loss: 0.4213 - val_acc: 0.9444
Epoch 210/500
84/84 [==============================] - 0s - loss: 0.3987 - acc: 0.9881 - val_loss: 0.4201 - val_acc: 0.9444
Epoch 211/500
84/84 [==============================] - 0s - loss: 0.3973 - acc: 0.9881 - val_loss: 0.4183 - val_acc: 0.9444
Epoch 212/500
84/84 [==============================] - 0s - loss: 0.3961 - acc: 0.9881 - val_loss: 0.4166 - val_acc: 0.9444
Epoch 213/500
84/84 [==============================] - 0s - loss: 0.3948 - acc: 0.9881 - val_loss: 0.4160 - val_acc: 0.9444
Epoch 214/500
84/84 [==============================] - 0s - loss: 0.3934 - acc: 0.9881 - val_loss: 0.4147 - val_acc: 0.9444
Epoch 215/500
84/84 [==============================] - 0s - loss: 0.3922 - acc: 0.9881 - val_loss: 0.4141 - val_acc: 0.9444
Epoch 216/500
84/84 [==============================] - 0s - loss: 0.3909 - acc: 0.9881 - val_loss: 0.4123 - val_acc: 0.9444
Epoch 217/500
84/84 [==============================] - 0s - loss: 0.3896 - acc: 0.9881 - val_loss: 0.4114 - val_acc: 0.9444
Epoch 218/500
84/84 [==============================] - 0s - loss: 0.3883 - acc: 0.9881 - val_loss: 0.4110 - val_acc: 0.9444
Epoch 219/500
84/84 [==============================] - 0s - loss: 0.3870 - acc: 0.9881 - val_loss: 0.4103 - val_acc: 0.9444
Epoch 220/500
84/84 [==============================] - ETA: 0s - loss: 0.4505 - acc: 0.968 - 0s - loss: 0.3858 - acc: 0.9881 - val_loss: 0.4099 - val_acc: 0.9444
Epoch 221/500
84/84 [==============================] - 0s - loss: 0.3845 - acc: 0.9881 - val_loss: 0.4094 - val_acc: 0.9444
Epoch 222/500
84/84 [==============================] - 0s - loss: 0.3832 - acc: 0.9881 - val_loss: 0.4101 - val_acc: 0.9167
Epoch 223/500
84/84 [==============================] - 0s - loss: 0.3820 - acc: 0.9881 - val_loss: 0.4105 - val_acc: 0.9167
Epoch 224/500
84/84 [==============================] - 0s - loss: 0.3808 - acc: 0.9762 - val_loss: 0.4103 - val_acc: 0.8889
Epoch 225/500
84/84 [==============================] - 0s - loss: 0.3798 - acc: 0.9762 - val_loss: 0.4111 - val_acc: 0.8889
Epoch 226/500
84/84 [==============================] - 0s - loss: 0.3784 - acc: 0.9762 - val_loss: 0.4100 - val_acc: 0.8889
Epoch 227/500
84/84 [==============================] - 0s - loss: 0.3772 - acc: 0.9762 - val_loss: 0.4085 - val_acc: 0.8889
Epoch 228/500
84/84 [==============================] - 0s - loss: 0.3760 - acc: 0.9762 - val_loss: 0.4069 - val_acc: 0.8889
Epoch 229/500
84/84 [==============================] - 0s - loss: 0.3747 - acc: 0.9762 - val_loss: 0.4050 - val_acc: 0.8889
Epoch 230/500
84/84 [==============================] - 0s - loss: 0.3735 - acc: 0.9762 - val_loss: 0.4029 - val_acc: 0.8889
Epoch 231/500
84/84 [==============================] - 0s - loss: 0.3723 - acc: 0.9881 - val_loss: 0.4005 - val_acc: 0.9444
Epoch 232/500
84/84 [==============================] - 0s - loss: 0.3713 - acc: 0.9881 - val_loss: 0.3979 - val_acc: 0.9444
Epoch 233/500
84/84 [==============================] - 0s - loss: 0.3699 - acc: 0.9881 - val_loss: 0.3969 - val_acc: 0.9444
Epoch 234/500
84/84 [==============================] - 0s - loss: 0.3687 - acc: 0.9881 - val_loss: 0.3961 - val_acc: 0.9444
Epoch 235/500
84/84 [==============================] - 0s - loss: 0.3676 - acc: 0.9881 - val_loss: 0.3960 - val_acc: 0.9444
Epoch 236/500
84/84 [==============================] - 0s - loss: 0.3664 - acc: 0.9881 - val_loss: 0.3954 - val_acc: 0.9444
Epoch 237/500
84/84 [==============================] - 0s - loss: 0.3652 - acc: 0.9881 - val_loss: 0.3944 - val_acc: 0.9444
Epoch 238/500
84/84 [==============================] - 0s - loss: 0.3641 - acc: 0.9881 - val_loss: 0.3936 - val_acc: 0.9444
Epoch 239/500
84/84 [==============================] - 0s - loss: 0.3630 - acc: 0.9881 - val_loss: 0.3925 - val_acc: 0.9444
Epoch 240/500
84/84 [==============================] - 0s - loss: 0.3617 - acc: 0.9881 - val_loss: 0.3913 - val_acc: 0.9444
Epoch 241/500
84/84 [==============================] - 0s - loss: 0.3607 - acc: 0.9881 - val_loss: 0.3900 - val_acc: 0.9444
Epoch 242/500
84/84 [==============================] - 0s - loss: 0.3594 - acc: 0.9881 - val_loss: 0.3894 - val_acc: 0.9444
Epoch 243/500
84/84 [==============================] - 0s - loss: 0.3583 - acc: 0.9881 - val_loss: 0.3890 - val_acc: 0.9444
Epoch 244/500
84/84 [==============================] - 0s - loss: 0.3572 - acc: 0.9881 - val_loss: 0.3882 - val_acc: 0.9167
Epoch 245/500
84/84 [==============================] - 0s - loss: 0.3561 - acc: 0.9881 - val_loss: 0.3865 - val_acc: 0.9444
Epoch 246/500
84/84 [==============================] - 0s - loss: 0.3550 - acc: 0.9881 - val_loss: 0.3855 - val_acc: 0.9444
Epoch 247/500
84/84 [==============================] - 0s - loss: 0.3540 - acc: 0.9881 - val_loss: 0.3856 - val_acc: 0.9167
Epoch 248/500
84/84 [==============================] - 0s - loss: 0.3527 - acc: 0.9881 - val_loss: 0.3841 - val_acc: 0.9167
Epoch 249/500
84/84 [==============================] - 0s - loss: 0.3516 - acc: 0.9881 - val_loss: 0.3832 - val_acc: 0.9167
Epoch 250/500
84/84 [==============================] - ETA: 0s - loss: 0.3350 - acc: 1.000 - 0s - loss: 0.3505 - acc: 0.9881 - val_loss: 0.3812 - val_acc: 0.9444
Epoch 251/500
84/84 [==============================] - ETA: 0s - loss: 0.3523 - acc: 1.000 - 0s - loss: 0.3494 - acc: 0.9881 - val_loss: 0.3797 - val_acc: 0.9444
Epoch 252/500
84/84 [==============================] - 0s - loss: 0.3483 - acc: 0.9881 - val_loss: 0.3787 - val_acc: 0.9444
Epoch 253/500
84/84 [==============================] - 0s - loss: 0.3471 - acc: 0.9881 - val_loss: 0.3782 - val_acc: 0.9444
Epoch 254/500
84/84 [==============================] - 0s - loss: 0.3460 - acc: 0.9881 - val_loss: 0.3784 - val_acc: 0.9167
Epoch 255/500
84/84 [==============================] - 0s - loss: 0.3450 - acc: 0.9881 - val_loss: 0.3794 - val_acc: 0.8889
Epoch 256/500
84/84 [==============================] - 0s - loss: 0.3440 - acc: 0.9881 - val_loss: 0.3801 - val_acc: 0.8889
Epoch 257/500
84/84 [==============================] - 0s - loss: 0.3428 - acc: 0.9881 - val_loss: 0.3793 - val_acc: 0.8889
Epoch 258/500
84/84 [==============================] - ETA: 0s - loss: 0.3816 - acc: 1.000 - 0s - loss: 0.3418 - acc: 0.9881 - val_loss: 0.3789 - val_acc: 0.8889
Epoch 259/500
84/84 [==============================] - ETA: 0s - loss: 0.3556 - acc: 1.000 - 0s - loss: 0.3408 - acc: 0.9881 - val_loss: 0.3771 - val_acc: 0.8889
Epoch 260/500
84/84 [==============================] - 0s - loss: 0.3396 - acc: 0.9881 - val_loss: 0.3759 - val_acc: 0.8889
Epoch 261/500
84/84 [==============================] - 0s - loss: 0.3386 - acc: 0.9881 - val_loss: 0.3750 - val_acc: 0.8889
Epoch 262/500
84/84 [==============================] - 0s - loss: 0.3376 - acc: 0.9881 - val_loss: 0.3725 - val_acc: 0.9167
Epoch 263/500
84/84 [==============================] - 0s - loss: 0.3364 - acc: 0.9881 - val_loss: 0.3717 - val_acc: 0.9167
Epoch 264/500
84/84 [==============================] - ETA: 0s - loss: 0.3150 - acc: 1.000 - 0s - loss: 0.3353 - acc: 0.9881 - val_loss: 0.3708 - val_acc: 0.9167
Epoch 265/500
84/84 [==============================] - ETA: 0s - loss: 0.3157 - acc: 0.968 - 0s - loss: 0.3344 - acc: 0.9881 - val_loss: 0.3701 - val_acc: 0.9167
Epoch 266/500
84/84 [==============================] - ETA: 0s - loss: 0.3483 - acc: 1.000 - 0s - loss: 0.3333 - acc: 0.9881 - val_loss: 0.3677 - val_acc: 0.9167
Epoch 267/500
84/84 [==============================] - 0s - loss: 0.3322 - acc: 0.9881 - val_loss: 0.3662 - val_acc: 0.9167
Epoch 268/500
84/84 [==============================] - 0s - loss: 0.3311 - acc: 0.9881 - val_loss: 0.3646 - val_acc: 0.9167
Epoch 269/500
84/84 [==============================] - 0s - loss: 0.3300 - acc: 0.9881 - val_loss: 0.3628 - val_acc: 0.9444
Epoch 270/500
84/84 [==============================] - 0s - loss: 0.3292 - acc: 0.9881 - val_loss: 0.3606 - val_acc: 0.9444
Epoch 271/500
84/84 [==============================] - 0s - loss: 0.3281 - acc: 0.9881 - val_loss: 0.3595 - val_acc: 0.9444
Epoch 272/500
84/84 [==============================] - 0s - loss: 0.3272 - acc: 0.9881 - val_loss: 0.3595 - val_acc: 0.9444
Epoch 273/500
84/84 [==============================] - 0s - loss: 0.3261 - acc: 0.9881 - val_loss: 0.3591 - val_acc: 0.9444
Epoch 274/500
84/84 [==============================] - 0s - loss: 0.3250 - acc: 0.9881 - val_loss: 0.3581 - val_acc: 0.9444
Epoch 275/500
84/84 [==============================] - 0s - loss: 0.3241 - acc: 0.9881 - val_loss: 0.3574 - val_acc: 0.9444
Epoch 276/500
84/84 [==============================] - 0s - loss: 0.3230 - acc: 0.9881 - val_loss: 0.3566 - val_acc: 0.9444
Epoch 277/500
84/84 [==============================] - 0s - loss: 0.3220 - acc: 0.9881 - val_loss: 0.3557 - val_acc: 0.9444
Epoch 278/500
84/84 [==============================] - 0s - loss: 0.3210 - acc: 0.9881 - val_loss: 0.3548 - val_acc: 0.9444
Epoch 279/500
84/84 [==============================] - ETA: 0s - loss: 0.3293 - acc: 1.000 - 0s - loss: 0.3200 - acc: 0.9881 - val_loss: 0.3550 - val_acc: 0.9167
Epoch 280/500
84/84 [==============================] - 0s - loss: 0.3189 - acc: 0.9881 - val_loss: 0.3551 - val_acc: 0.9167
Epoch 281/500
84/84 [==============================] - ETA: 0s - loss: 0.3197 - acc: 1.000 - 0s - loss: 0.3179 - acc: 0.9881 - val_loss: 0.3550 - val_acc: 0.9167
Epoch 282/500
84/84 [==============================] - 0s - loss: 0.3169 - acc: 0.9881 - val_loss: 0.3558 - val_acc: 0.9167
Epoch 283/500
84/84 [==============================] - 0s - loss: 0.3159 - acc: 0.9881 - val_loss: 0.3558 - val_acc: 0.9167
Epoch 284/500
84/84 [==============================] - 0s - loss: 0.3152 - acc: 0.9881 - val_loss: 0.3566 - val_acc: 0.8889
Epoch 285/500
84/84 [==============================] - 0s - loss: 0.3142 - acc: 0.9881 - val_loss: 0.3561 - val_acc: 0.8889
Epoch 286/500
84/84 [==============================] - 0s - loss: 0.3130 - acc: 0.9881 - val_loss: 0.3540 - val_acc: 0.9167
Epoch 287/500
84/84 [==============================] - 0s - loss: 0.3120 - acc: 0.9881 - val_loss: 0.3517 - val_acc: 0.9167
Epoch 288/500
84/84 [==============================] - 0s - loss: 0.3111 - acc: 0.9881 - val_loss: 0.3493 - val_acc: 0.9167
Epoch 289/500
84/84 [==============================] - 0s - loss: 0.3103 - acc: 0.9881 - val_loss: 0.3470 - val_acc: 0.9167
Epoch 290/500
84/84 [==============================] - 0s - loss: 0.3091 - acc: 0.9881 - val_loss: 0.3457 - val_acc: 0.9167
Epoch 291/500
84/84 [==============================] - 0s - loss: 0.3082 - acc: 0.9881 - val_loss: 0.3451 - val_acc: 0.9167
Epoch 292/500
84/84 [==============================] - 0s - loss: 0.3074 - acc: 0.9881 - val_loss: 0.3434 - val_acc: 0.9167
Epoch 293/500
84/84 [==============================] - 0s - loss: 0.3063 - acc: 0.9881 - val_loss: 0.3435 - val_acc: 0.9167
Epoch 294/500
84/84 [==============================] - 0s - loss: 0.3056 - acc: 0.9881 - val_loss: 0.3438 - val_acc: 0.9167
Epoch 295/500
84/84 [==============================] - 0s - loss: 0.3045 - acc: 0.9881 - val_loss: 0.3433 - val_acc: 0.9167
Epoch 296/500
84/84 [==============================] - 0s - loss: 0.3034 - acc: 0.9881 - val_loss: 0.3419 - val_acc: 0.9167
Epoch 297/500
84/84 [==============================] - 0s - loss: 0.3025 - acc: 0.9881 - val_loss: 0.3401 - val_acc: 0.9167
Epoch 298/500
84/84 [==============================] - 0s - loss: 0.3018 - acc: 0.9881 - val_loss: 0.3388 - val_acc: 0.9167
Epoch 299/500
84/84 [==============================] - 0s - loss: 0.3006 - acc: 0.9881 - val_loss: 0.3392 - val_acc: 0.9167
Epoch 300/500
84/84 [==============================] - 0s - loss: 0.2998 - acc: 0.9881 - val_loss: 0.3404 - val_acc: 0.9167
Epoch 301/500
84/84 [==============================] - 0s - loss: 0.2987 - acc: 0.9881 - val_loss: 0.3404 - val_acc: 0.9167
Epoch 302/500
84/84 [==============================] - 0s - loss: 0.2978 - acc: 0.9881 - val_loss: 0.3404 - val_acc: 0.9167
Epoch 303/500
84/84 [==============================] - 0s - loss: 0.2970 - acc: 0.9881 - val_loss: 0.3402 - val_acc: 0.9167
Epoch 304/500
84/84 [==============================] - 0s - loss: 0.2960 - acc: 0.9881 - val_loss: 0.3385 - val_acc: 0.9167
Epoch 305/500
84/84 [==============================] - 0s - loss: 0.2950 - acc: 0.9881 - val_loss: 0.3369 - val_acc: 0.9167
Epoch 306/500
84/84 [==============================] - 0s - loss: 0.2942 - acc: 0.9881 - val_loss: 0.3347 - val_acc: 0.9167
Epoch 307/500
84/84 [==============================] - 0s - loss: 0.2935 - acc: 0.9881 - val_loss: 0.3328 - val_acc: 0.9167
Epoch 308/500
84/84 [==============================] - 0s - loss: 0.2923 - acc: 0.9881 - val_loss: 0.3323 - val_acc: 0.9167
Epoch 309/500
84/84 [==============================] - 0s - loss: 0.2914 - acc: 0.9881 - val_loss: 0.3322 - val_acc: 0.9167
Epoch 310/500
84/84 [==============================] - 0s - loss: 0.2905 - acc: 0.9881 - val_loss: 0.3328 - val_acc: 0.9167
Epoch 311/500
84/84 [==============================] - 0s - loss: 0.2896 - acc: 0.9881 - val_loss: 0.3329 - val_acc: 0.9167
Epoch 312/500
84/84 [==============================] - 0s - loss: 0.2887 - acc: 0.9881 - val_loss: 0.3324 - val_acc: 0.9167
Epoch 313/500
84/84 [==============================] - 0s - loss: 0.2878 - acc: 0.9881 - val_loss: 0.3316 - val_acc: 0.9167
Epoch 314/500
84/84 [==============================] - 0s - loss: 0.2869 - acc: 0.9881 - val_loss: 0.3311 - val_acc: 0.9167
Epoch 315/500
84/84 [==============================] - 0s - loss: 0.2861 - acc: 0.9881 - val_loss: 0.3308 - val_acc: 0.9167
Epoch 316/500
84/84 [==============================] - 0s - loss: 0.2851 - acc: 0.9881 - val_loss: 0.3303 - val_acc: 0.9167
Epoch 317/500
84/84 [==============================] - 0s - loss: 0.2845 - acc: 0.9881 - val_loss: 0.3313 - val_acc: 0.9167
Epoch 318/500
84/84 [==============================] - 0s - loss: 0.2835 - acc: 0.9881 - val_loss: 0.3309 - val_acc: 0.9167
Epoch 319/500
84/84 [==============================] - 0s - loss: 0.2825 - acc: 0.9881 - val_loss: 0.3289 - val_acc: 0.9167
Epoch 320/500
84/84 [==============================] - 0s - loss: 0.2818 - acc: 0.9881 - val_loss: 0.3261 - val_acc: 0.9167
Epoch 321/500
84/84 [==============================] - 0s - loss: 0.2807 - acc: 0.9881 - val_loss: 0.3251 - val_acc: 0.9167
Epoch 322/500
84/84 [==============================] - 0s - loss: 0.2798 - acc: 0.9881 - val_loss: 0.3236 - val_acc: 0.9167
Epoch 323/500
84/84 [==============================] - 0s - loss: 0.2794 - acc: 0.9881 - val_loss: 0.3216 - val_acc: 0.9167
Epoch 324/500
84/84 [==============================] - 0s - loss: 0.2781 - acc: 0.9881 - val_loss: 0.3216 - val_acc: 0.9167
Epoch 325/500
84/84 [==============================] - 0s - loss: 0.2774 - acc: 0.9881 - val_loss: 0.3229 - val_acc: 0.9167
Epoch 326/500
84/84 [==============================] - 0s - loss: 0.2764 - acc: 0.9881 - val_loss: 0.3224 - val_acc: 0.9167
Epoch 327/500
84/84 [==============================] - 0s - loss: 0.2755 - acc: 0.9881 - val_loss: 0.3226 - val_acc: 0.9167
Epoch 328/500
84/84 [==============================] - ETA: 0s - loss: 0.2790 - acc: 1.000 - 0s - loss: 0.2748 - acc: 0.9881 - val_loss: 0.3234 - val_acc: 0.9167
Epoch 329/500
84/84 [==============================] - 0s - loss: 0.2738 - acc: 0.9881 - val_loss: 0.3235 - val_acc: 0.9167
Epoch 330/500
84/84 [==============================] - 0s - loss: 0.2730 - acc: 0.9881 - val_loss: 0.3240 - val_acc: 0.8889
Epoch 331/500
84/84 [==============================] - 0s - loss: 0.2722 - acc: 0.9881 - val_loss: 0.3235 - val_acc: 0.8889
Epoch 332/500
84/84 [==============================] - 0s - loss: 0.2714 - acc: 0.9881 - val_loss: 0.3230 - val_acc: 0.8889
Epoch 333/500
84/84 [==============================] - 0s - loss: 0.2708 - acc: 0.9881 - val_loss: 0.3215 - val_acc: 0.9167
Epoch 334/500
84/84 [==============================] - 0s - loss: 0.2697 - acc: 0.9881 - val_loss: 0.3211 - val_acc: 0.9167
Epoch 335/500
84/84 [==============================] - ETA: 0s - loss: 0.2561 - acc: 1.000 - 0s - loss: 0.2690 - acc: 0.9881 - val_loss: 0.3197 - val_acc: 0.9167
Epoch 336/500
84/84 [==============================] - ETA: 0s - loss: 0.2707 - acc: 1.000 - 0s - loss: 0.2680 - acc: 0.9881 - val_loss: 0.3186 - val_acc: 0.9167
Epoch 337/500
84/84 [==============================] - 0s - loss: 0.2672 - acc: 0.9881 - val_loss: 0.3169 - val_acc: 0.9167
Epoch 338/500
84/84 [==============================] - 0s - loss: 0.2664 - acc: 0.9881 - val_loss: 0.3147 - val_acc: 0.9167
Epoch 339/500
84/84 [==============================] - 0s - loss: 0.2655 - acc: 0.9881 - val_loss: 0.3138 - val_acc: 0.9167
Epoch 340/500
84/84 [==============================] - 0s - loss: 0.2647 - acc: 0.9881 - val_loss: 0.3118 - val_acc: 0.9167
Epoch 341/500
84/84 [==============================] - 0s - loss: 0.2638 - acc: 0.9881 - val_loss: 0.3110 - val_acc: 0.9167
Epoch 342/500
84/84 [==============================] - 0s - loss: 0.2630 - acc: 0.9881 - val_loss: 0.3102 - val_acc: 0.9167
Epoch 343/500
84/84 [==============================] - 0s - loss: 0.2622 - acc: 0.9881 - val_loss: 0.3097 - val_acc: 0.9167
Epoch 344/500
84/84 [==============================] - 0s - loss: 0.2615 - acc: 0.9881 - val_loss: 0.3078 - val_acc: 0.9167
Epoch 345/500
84/84 [==============================] - 0s - loss: 0.2605 - acc: 0.9881 - val_loss: 0.3076 - val_acc: 0.9167
Epoch 346/500
84/84 [==============================] - 0s - loss: 0.2597 - acc: 0.9881 - val_loss: 0.3073 - val_acc: 0.9167
Epoch 347/500
84/84 [==============================] - 0s - loss: 0.2591 - acc: 0.9881 - val_loss: 0.3058 - val_acc: 0.9167
Epoch 348/500
84/84 [==============================] - 0s - loss: 0.2581 - acc: 0.9881 - val_loss: 0.3053 - val_acc: 0.9167
Epoch 349/500
84/84 [==============================] - 0s - loss: 0.2573 - acc: 0.9881 - val_loss: 0.3053 - val_acc: 0.9167
Epoch 350/500
84/84 [==============================] - 0s - loss: 0.2565 - acc: 0.9881 - val_loss: 0.3054 - val_acc: 0.9167
Epoch 351/500
84/84 [==============================] - 0s - loss: 0.2557 - acc: 0.9881 - val_loss: 0.3050 - val_acc: 0.9167
Epoch 352/500
84/84 [==============================] - 0s - loss: 0.2550 - acc: 0.9881 - val_loss: 0.3056 - val_acc: 0.9167
Epoch 353/500
84/84 [==============================] - 0s - loss: 0.2542 - acc: 0.9881 - val_loss: 0.3054 - val_acc: 0.9167
Epoch 354/500
84/84 [==============================] - 0s - loss: 0.2534 - acc: 0.9881 - val_loss: 0.3052 - val_acc: 0.9167
Epoch 355/500
84/84 [==============================] - 0s - loss: 0.2526 - acc: 0.9881 - val_loss: 0.3039 - val_acc: 0.9167
Epoch 356/500
84/84 [==============================] - 0s - loss: 0.2518 - acc: 0.9881 - val_loss: 0.3036 - val_acc: 0.9167
Epoch 357/500
84/84 [==============================] - 0s - loss: 0.2511 - acc: 0.9881 - val_loss: 0.3018 - val_acc: 0.9167
Epoch 358/500
84/84 [==============================] - 0s - loss: 0.2502 - acc: 0.9881 - val_loss: 0.3009 - val_acc: 0.9167
Epoch 359/500
84/84 [==============================] - 0s - loss: 0.2495 - acc: 0.9881 - val_loss: 0.3009 - val_acc: 0.9167
Epoch 360/500
84/84 [==============================] - 0s - loss: 0.2486 - acc: 0.9881 - val_loss: 0.2997 - val_acc: 0.9167
Epoch 361/500
84/84 [==============================] - 0s - loss: 0.2479 - acc: 0.9881 - val_loss: 0.2993 - val_acc: 0.9167
Epoch 362/500
84/84 [==============================] - 0s - loss: 0.2470 - acc: 0.9881 - val_loss: 0.2977 - val_acc: 0.9167
Epoch 363/500
84/84 [==============================] - 0s - loss: 0.2464 - acc: 0.9881 - val_loss: 0.2961 - val_acc: 0.9167
Epoch 364/500
84/84 [==============================] - 0s - loss: 0.2455 - acc: 0.9881 - val_loss: 0.2957 - val_acc: 0.9167
Epoch 365/500
84/84 [==============================] - 0s - loss: 0.2448 - acc: 0.9881 - val_loss: 0.2943 - val_acc: 0.9167
Epoch 366/500
84/84 [==============================] - 0s - loss: 0.2440 - acc: 0.9881 - val_loss: 0.2938 - val_acc: 0.9167
Epoch 367/500
84/84 [==============================] - 0s - loss: 0.2432 - acc: 0.9881 - val_loss: 0.2937 - val_acc: 0.9167
Epoch 368/500
84/84 [==============================] - 0s - loss: 0.2430 - acc: 0.9881 - val_loss: 0.2948 - val_acc: 0.9167
Epoch 369/500
84/84 [==============================] - 0s - loss: 0.2417 - acc: 0.9881 - val_loss: 0.2928 - val_acc: 0.9167
Epoch 370/500
84/84 [==============================] - 0s - loss: 0.2409 - acc: 0.9881 - val_loss: 0.2913 - val_acc: 0.9167
Epoch 371/500
84/84 [==============================] - 0s - loss: 0.2403 - acc: 0.9881 - val_loss: 0.2901 - val_acc: 0.9444
[... per-epoch log for epochs 372-499 trimmed: training loss decreases steadily from about 0.24 to 0.16, validation accuracy stays in the 0.92-0.94 range ...]
Epoch 500/500
84/84 [==============================] - 0s - loss: 0.1621 - acc: 0.9881 - val_loss: 0.2281 - val_acc: 0.9444
CPU times: user 4.48 s, sys: 984 ms, total: 5.46 s
Wall time: 4.47 s
Out[44]:
<keras.callbacks.History at 0x7f535008b278>
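
The loss curves give a quicker picture of training than the raw epoch log. A small sketch, assuming the History object that Keras keeps on the model (model.history) after fit:


In [ ]:
# plot training vs. validation loss over the epochs
history = model.history.history
plt.plot(history['loss'], label='train loss')
plt.plot(history['val_loss'], label='validation loss')
plt.legend()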

Evaluation


In [45]:
# class probabilities for the three iris species, here for the first sample
model.predict(np.array([[ 5.1,  3.5,  1.4,  0.2]]))


Out[45]:
array([[  9.82259035e-01,   1.77392866e-02,   1.65961740e-06]], dtype=float32)

In [46]:
X[0], y[0]


Out[46]:
(array([ 5.1,  3.5,  1.4,  0.2]), array([ 1.,  0.,  0.]))

In [47]:
train_loss, train_accuracy = model.evaluate(X_train, y_train)
train_loss, train_accuracy


Out[47]:
(0.18136776983737946, 0.97499999602635701)

In [48]:
test_loss, test_accuracy = model.evaluate(X_test, y_test)
test_loss, test_accuracy


30/30 [==============================] - 0s
Out[48]:
(0.19082625210285187, 0.96666663885116577)

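Beyond the overall accuracy, a confusion matrix shows which of the three classes the network confuses. A minimal sketch, using the same X_test / y_test split as above (y_test is one-hot encoded, so argmax recovers the class index):


In [ ]:
from sklearn.metrics import confusion_matrix

# predicted vs. true class indices on the test set
predicted_classes = np.argmax(model.predict(X_test), axis=1)
true_classes = np.argmax(y_test, axis=1)
confusion_matrix(true_classes, predicted_classes)
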
Hands-On

Work through the notebook up to this point and experiment with some of the parameters

  • Vary the number of neurons in the hidden layer. Why does this work at all with just 3 neurons?
  • Add another layer (see the sketch after this list)
  • Can you sketch the graph of the function with x1 and x2 on the axes?
  • What kind of function is this?

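A minimal sketch for this exercise (the ReLU hidden layers and the layer sizes are just one possible choice to experiment with, not the reference solution):


In [ ]:
from keras.layers import Input, Dense
from keras.models import Model

inputs = Input(shape=(4, ))
hidden = Dense(10, activation='relu')(inputs)     # vary the number of neurons here
hidden2 = Dense(10, activation='relu')(hidden)    # an additional hidden layer
predictions = Dense(3, activation='softmax')(hidden2)

deeper_model = Model(inputs=inputs, outputs=predictions)
deeper_model.compile(optimizer='adam',
                     loss='categorical_crossentropy',
                     metrics=['accuracy'])
deeper_model.summary()
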
Stop Here

Optional part

Saving the model in Keras and TensorFlow formats


In [ ]:
model.save('nn-iris.hdf5')

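To check that the HDF5 file round-trips, it can be reloaded with keras.models.load_model (a minimal sketch):


In [ ]:
from keras.models import load_model

# reload the model saved above and inspect its architecture
restored_model = load_model('nn-iris.hdf5')
restored_model.summary()
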
In [ ]:
import os
from keras import backend as K

In [ ]:
# switch the Keras backend to inference mode before exporting
K.set_learning_phase(0)

In [ ]:
sess = K.get_session()

In [ ]:
# remove any previous export
!rm -r tf

In [ ]:
tf.app.flags.DEFINE_integer('model_version', 1, 'version number of the model.')
tf.app.flags.DEFINE_string('work_dir', '/tmp', 'Working directory.')
FLAGS = tf.app.flags.FLAGS

In [ ]:
export_path_base = 'tf'
export_path = os.path.join(
  tf.compat.as_bytes(export_path_base),
  tf.compat.as_bytes(str(FLAGS.model_version)))

In [ ]:
# TensorInfo protos describing the model's input and output tensors
classification_inputs = tf.saved_model.utils.build_tensor_info(model.input)
classification_outputs_scores = tf.saved_model.utils.build_tensor_info(model.output)

In [ ]:
from tensorflow.python.saved_model.signature_def_utils_impl import build_signature_def, predict_signature_def

In [ ]:
signature = predict_signature_def(inputs={'inputs': model.input},
                                  outputs={'scores': model.output})

In [ ]:
builder = tf.saved_model.builder.SavedModelBuilder(export_path)

In [ ]:
builder.add_meta_graph_and_variables(
      sess,
      tags=[tf.saved_model.tag_constants.SERVING],
      signature_def_map={
          tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY: signature
      })

In [ ]:
builder.save()

In [ ]:
!ls -lhR tf

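As a sanity check, the exported SavedModel can be loaded back into a fresh session with the TensorFlow 1.x loader API (a minimal sketch):


In [ ]:
# load the export into a separate graph/session to verify it is readable
with tf.Session(graph=tf.Graph()) as load_sess:
    tf.saved_model.loader.load(load_sess,
                               [tf.saved_model.tag_constants.SERVING],
                               export_path)
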
This TensorFlow model can be uploaded to Google Cloud ML and used there for predictions


In [ ]:
# cd tf
# gsutil cp -R 1 gs://irisnn
# create model and version at https://console.cloud.google.com/mlengine
# gcloud ml-engine predict --model=irisnn --json-instances=./sample_iris.json
# SCORES
# [0.9954029321670532, 0.004596732556819916, 3.3544753819114703e-07]
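
The sample_iris.json file referenced above is not part of this notebook. Given the predict signature defined earlier (input tensor named 'inputs' with the four iris features), a plausible format is one JSON object per line; the cell below is a hypothetical example, not taken from the original material:


In [ ]:
# hypothetical sample_iris.json content (an assumption about the file's format):
# one instance per line, keyed by the signature's input name 'inputs'
with open('sample_iris.json', 'w') as f:
    f.write('{"inputs": [5.1, 3.5, 1.4, 0.2]}\n')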