Neural Network Example with Keras

(C) 2018-2019 by Damir Cavar

Version: 1.1, January 2019

This is a tutorial related to the L665 course on Machine Learning for NLP focusing on Deep Learning, Spring 2018 and 2019 at Indiana University.

This material is based on Jason Brownlee's tutorial Develop Your First Neural Network in Python With Keras Step-By-Step. See that page for more details and explanations. All copyrights are his, except for a few small comments that I added.

Keras is a neural network library that runs on top of TensorFlow (among other backends). Make sure that TensorFlow is installed on your system. Go to the Keras homepage and follow the instructions to install the module in Python. This example also requires that SciPy and NumPy are installed on your system.
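
In a pip-based Python environment, the dependencies can usually be installed with a command along the following lines (run it in a terminal, or prefixed with ! in a notebook cell; adapt it to conda or your platform's package manager as needed):

pip install numpy scipy tensorflow keras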

Introduction

As explained in the above tutorial, the steps are (a compact preview of all five follows the list):

  • loading data (prepared for the process, that is, vectorized and formatted)
  • defining a model (layers)
  • compiling the model
  • fitting the model
  • evaluating the model
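
Before walking through these steps one by one, here is the whole pipeline in one compact sketch; it uses exactly the cells explained step by step below and assumes the data file data/pima-indians-diabetes.csv is present:

import numpy
from keras.models import Sequential
from keras.layers import Dense

dataset = numpy.loadtxt("data/pima-indians-diabetes.csv", delimiter=",")
X, Y = dataset[:, 0:8], dataset[:, 8]       # load data: 8 features, 1 binary label

model = Sequential()                        # define the model (layers)
model.add(Dense(12, input_dim=8, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))

model.compile(loss='binary_crossentropy',   # compile the model
              optimizer='adam',
              metrics=['accuracy'])

model.fit(X, Y, epochs=150, batch_size=4)   # fit the model
print(model.evaluate(X, Y))                 # evaluate the model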

We have to import the necessary modules from Keras:


In [1]:
from keras.models import Sequential
from keras.layers import Dense


Using TensorFlow backend.

We will use numpy as well:


In [2]:
import numpy

In his tutorial, linked above, Jason Brownlee suggests initializing the random number generator with a fixed seed so that the results are the same on every run, since the learning algorithm makes use of a stochastic process. We seed the random number generator with 7:


In [3]:
numpy.random.seed(7)
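
Note that seeding NumPy alone does not make a run fully reproducible with the TensorFlow backend, since TensorFlow keeps its own random state. With a TensorFlow 1.x backend one can additionally seed the graph-level generator, for example:

import tensorflow
tensorflow.set_random_seed(7)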

The data set suggested in Brownlee's tutorial is the Pima Indians Diabetes Data Set. A copy of the required file is available in the local data subfolder with a .csv filename extension.


In [4]:
dataset = numpy.loadtxt("data/pima-indians-diabetes.csv", delimiter=",")
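
As a quick sanity check: the file contains 768 rows with 9 comma-separated values each (8 features plus the class label), so the loaded array should have the corresponding shape:

print(dataset.shape)   # expected: (768, 9)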

The data is organized as follows: the first 8 columns per row define the features, that is, the input variables for the neural network. The last column defines the output as a binary value of $0$ or $1$. We can separate the features and the output from the dataset into two variables:


In [5]:
X = dataset[:,0:8]
Y = dataset[:,8]

Just to verify the content:


In [6]:
X


Out[6]:
array([[  6.   , 148.   ,  72.   , ...,  33.6  ,   0.627,  50.   ],
       [  1.   ,  85.   ,  66.   , ...,  26.6  ,   0.351,  31.   ],
       [  8.   , 183.   ,  64.   , ...,  23.3  ,   0.672,  32.   ],
       ...,
       [  5.   , 121.   ,  72.   , ...,  26.2  ,   0.245,  30.   ],
       [  1.   , 126.   ,  60.   , ...,  30.1  ,   0.349,  47.   ],
       [  1.   ,  93.   ,  70.   , ...,  30.4  ,   0.315,  23.   ]])

In [7]:
Y


Out[7]:
array([1., 0., 1., 0., 1., 0., 1., 0., 1., 1., 0., 1., 0., 1., 1., 1., 1.,
       1., 0., 1., 0., 0., 1., 1., 1., 1., 1., 0., 0., 0., 0., 1., 0., 0.,
       0., 0., 0., 1., 1., 1., 0., 0., 0., 1., 0., 1., 0., 0., 1., 0., 0.,
       0., 0., 1., 0., 0., 1., 0., 0., 0., 0., 1., 0., 0., 1., 0., 1., 0.,
       0., 0., 1., 0., 1., 0., 0., 0., 0., 0., 1., 0., 0., 0., 0., 0., 1.,
       0., 0., 0., 1., 0., 0., 0., 0., 1., 0., 0., 0., 0., 0., 1., 1., 0.,
       0., 0., 0., 0., 0., 0., 0., 1., 1., 1., 0., 0., 1., 1., 1., 0., 0.,
       0., 1., 0., 0., 0., 1., 1., 0., 0., 1., 1., 1., 1., 1., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 1., 0., 0., 0., 0., 0., 0., 0., 0., 1.,
       0., 1., 1., 0., 0., 0., 1., 0., 0., 0., 0., 1., 1., 0., 0., 0., 0.,
       1., 1., 0., 0., 0., 1., 0., 1., 0., 1., 0., 0., 0., 0., 0., 1., 1.,
       1., 1., 1., 0., 0., 1., 1., 0., 1., 0., 1., 1., 1., 0., 0., 0., 0.,
       0., 0., 1., 1., 0., 1., 0., 0., 0., 1., 1., 1., 1., 0., 1., 1., 1.,
       1., 0., 0., 0., 0., 0., 1., 0., 0., 1., 1., 0., 0., 0., 1., 1., 1.,
       1., 0., 0., 0., 1., 1., 0., 1., 0., 0., 0., 0., 0., 0., 0., 0., 1.,
       1., 0., 0., 0., 1., 0., 1., 0., 0., 1., 0., 1., 0., 0., 1., 1., 0.,
       0., 0., 0., 0., 1., 0., 0., 0., 1., 0., 0., 1., 1., 0., 0., 1., 0.,
       0., 0., 1., 1., 1., 0., 0., 1., 0., 1., 0., 1., 1., 0., 1., 0., 0.,
       1., 0., 1., 1., 0., 0., 1., 0., 1., 0., 0., 1., 0., 1., 0., 1., 1.,
       1., 0., 0., 1., 0., 1., 0., 0., 0., 1., 0., 0., 0., 0., 1., 1., 1.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 1., 0., 0., 0., 0., 0., 1., 1.,
       1., 0., 1., 1., 0., 0., 1., 0., 0., 1., 0., 0., 1., 1., 0., 0., 0.,
       0., 1., 0., 0., 1., 0., 0., 0., 0., 0., 0., 0., 1., 1., 1., 0., 0.,
       1., 0., 0., 1., 0., 0., 1., 0., 1., 1., 0., 1., 0., 1., 0., 1., 0.,
       1., 1., 0., 0., 0., 0., 1., 1., 0., 1., 0., 1., 0., 0., 0., 0., 1.,
       1., 0., 1., 0., 1., 0., 0., 0., 0., 0., 1., 0., 0., 0., 0., 1., 0.,
       0., 1., 1., 1., 0., 0., 1., 0., 0., 1., 0., 0., 0., 1., 0., 0., 1.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 1., 0., 0., 0., 0., 0., 0., 0.,
       1., 0., 0., 0., 1., 0., 0., 0., 1., 1., 0., 0., 0., 0., 0., 0., 0.,
       1., 0., 0., 0., 0., 1., 0., 0., 0., 1., 0., 0., 0., 1., 0., 0., 0.,
       1., 0., 0., 0., 0., 1., 1., 0., 0., 0., 0., 0., 0., 1., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 1., 0., 0., 0., 1., 1., 1., 1., 0.,
       0., 1., 1., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 1.,
       1., 0., 0., 0., 0., 0., 0., 0., 1., 0., 0., 0., 0., 0., 0., 0., 1.,
       0., 1., 1., 0., 0., 0., 1., 0., 1., 0., 1., 0., 1., 0., 1., 0., 0.,
       1., 0., 0., 1., 0., 0., 0., 0., 1., 1., 0., 1., 0., 0., 0., 0., 1.,
       1., 0., 1., 0., 0., 0., 1., 1., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 1., 0., 0., 0., 0., 1., 0., 0., 1., 0., 0., 0., 1., 0., 0., 0.,
       1., 1., 1., 0., 0., 0., 0., 0., 0., 1., 0., 0., 0., 1., 0., 1., 1.,
       1., 1., 0., 1., 1., 0., 0., 0., 0., 0., 0., 0., 1., 1., 0., 1., 0.,
       0., 1., 0., 1., 0., 0., 0., 0., 0., 1., 0., 1., 0., 1., 0., 1., 1.,
       0., 0., 0., 0., 1., 1., 0., 0., 0., 1., 0., 1., 1., 0., 0., 1., 0.,
       0., 1., 1., 0., 0., 1., 0., 0., 1., 0., 0., 0., 0., 0., 0., 0., 1.,
       1., 1., 0., 0., 0., 0., 0., 0., 1., 1., 0., 0., 1., 0., 0., 1., 0.,
       1., 1., 1., 0., 0., 1., 1., 1., 0., 1., 0., 1., 0., 1., 0., 0., 0.,
       0., 1., 0.])

We will define our model in the next step. The first layer is the input layer. It is set to expect 8 inputs for the 8 feature variables using the input_dim argument. The Dense class defines a fully connected layer; the number of neurons is specified as the first argument to the initializer. We also choose the activation function via the activation argument. This should be clear from the presentations in class and from other examples and discussions in related notebooks in this collection. The output layer consists of one neuron and uses the sigmoid activation function to return a value between $0$ and $1$ that can be read as a class probability:


In [23]:
model = Sequential()
model.add(Dense(12, input_dim=8, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
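
Keras can print the resulting architecture with the summary method. Each Dense layer holds (inputs × units) weights plus one bias per unit, i.e. 8·12+12 = 108, 12·8+8 = 104, and 8·1+1 = 9 trainable parameters, 221 in total:

model.summary()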

The defined network needs to be compiled. The compilation step creates a concrete implementation of the model using the backend (e.g. TensorFlow or Theano), decides whether a GPU or a CPU will be used, and fixes the loss function, the optimization algorithm, and the metrics to be collected during training. In this case we use binary cross-entropy as the loss function, the efficient variant of the gradient descent algorithm called Adam as the optimizer, and we record the classification accuracy for the output and analysis.
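
For reference, the binary cross-entropy loss over $N$ training instances with targets $y_i \in \{0, 1\}$ and predicted probabilities $\hat{y}_i$ is

$$L = -\frac{1}{N} \sum_{i=1}^{N} \left[ y_i \log(\hat{y}_i) + (1 - y_i) \log(1 - \hat{y}_i) \right]$$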


In [24]:
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

The training of the model is achieved by calling the fit method. The parameters specify the input matrix and the output vector in our case, as well as the number of iterations over the data set during training, called epochs. The batch size specifies the number of instances that are evaluated before an update of the model parameters is applied. With 768 training instances and a batch size of 4, each epoch therefore performs 192 weight updates.


In [26]:
model.fit(X, Y, epochs=150, batch_size=4)


Epoch 1/150
768/768 [==============================] - 0s 356us/step - loss: 0.5882 - acc: 0.7305
Epoch 2/150
768/768 [==============================] - 0s 326us/step - loss: 0.5404 - acc: 0.7435
Epoch 3/150
768/768 [==============================] - 0s 335us/step - loss: 0.5319 - acc: 0.7487
Epoch 4/150
768/768 [==============================] - 0s 342us/step - loss: 0.5148 - acc: 0.7578
Epoch 5/150
768/768 [==============================] - 0s 369us/step - loss: 0.5291 - acc: 0.7461
Epoch 6/150
768/768 [==============================] - 0s 391us/step - loss: 0.5063 - acc: 0.7708
Epoch 7/150
768/768 [==============================] - 0s 366us/step - loss: 0.5107 - acc: 0.7487
Epoch 8/150
768/768 [==============================] - 0s 338us/step - loss: 0.5117 - acc: 0.7409
Epoch 9/150
768/768 [==============================] - 0s 349us/step - loss: 0.5027 - acc: 0.7487
Epoch 10/150
768/768 [==============================] - 0s 346us/step - loss: 0.5008 - acc: 0.7604
Epoch 11/150
768/768 [==============================] - 0s 361us/step - loss: 0.5021 - acc: 0.7617
Epoch 12/150
768/768 [==============================] - 0s 353us/step - loss: 0.4994 - acc: 0.7565
Epoch 13/150
768/768 [==============================] - 0s 348us/step - loss: 0.5132 - acc: 0.7474
Epoch 14/150
768/768 [==============================] - 0s 342us/step - loss: 0.4934 - acc: 0.7747
Epoch 15/150
768/768 [==============================] - 0s 350us/step - loss: 0.4877 - acc: 0.7682
Epoch 16/150
768/768 [==============================] - 0s 328us/step - loss: 0.5093 - acc: 0.7526
Epoch 17/150
768/768 [==============================] - 0s 327us/step - loss: 0.5231 - acc: 0.7422
Epoch 18/150
768/768 [==============================] - 0s 349us/step - loss: 0.4977 - acc: 0.7669
Epoch 19/150
768/768 [==============================] - 0s 349us/step - loss: 0.4998 - acc: 0.7630
Epoch 20/150
768/768 [==============================] - 0s 344us/step - loss: 0.4934 - acc: 0.7630
Epoch 21/150
768/768 [==============================] - 0s 353us/step - loss: 0.4985 - acc: 0.7604
Epoch 22/150
768/768 [==============================] - 0s 358us/step - loss: 0.5037 - acc: 0.7695
Epoch 23/150
768/768 [==============================] - 0s 359us/step - loss: 0.4868 - acc: 0.7656
Epoch 24/150
768/768 [==============================] - 0s 368us/step - loss: 0.5001 - acc: 0.7656
Epoch 25/150
768/768 [==============================] - 0s 374us/step - loss: 0.4929 - acc: 0.7461
Epoch 26/150
768/768 [==============================] - 0s 372us/step - loss: 0.4953 - acc: 0.7721
Epoch 27/150
768/768 [==============================] - 0s 354us/step - loss: 0.4897 - acc: 0.7721
Epoch 28/150
768/768 [==============================] - 0s 367us/step - loss: 0.4877 - acc: 0.7643
Epoch 29/150
768/768 [==============================] - 0s 349us/step - loss: 0.4896 - acc: 0.7578
Epoch 30/150
768/768 [==============================] - 0s 363us/step - loss: 0.4871 - acc: 0.7786
Epoch 31/150
768/768 [==============================] - 0s 351us/step - loss: 0.4974 - acc: 0.7539
Epoch 32/150
768/768 [==============================] - 0s 342us/step - loss: 0.4804 - acc: 0.7747
Epoch 33/150
768/768 [==============================] - 0s 347us/step - loss: 0.4863 - acc: 0.7721
Epoch 34/150
768/768 [==============================] - 0s 335us/step - loss: 0.4779 - acc: 0.7721
Epoch 35/150
768/768 [==============================] - 0s 338us/step - loss: 0.4714 - acc: 0.7891
Epoch 36/150
768/768 [==============================] - 0s 345us/step - loss: 0.4895 - acc: 0.7604
Epoch 37/150
768/768 [==============================] - 0s 355us/step - loss: 0.4859 - acc: 0.7656
Epoch 38/150
768/768 [==============================] - 0s 343us/step - loss: 0.4836 - acc: 0.7682
Epoch 39/150
768/768 [==============================] - 0s 369us/step - loss: 0.5042 - acc: 0.7656
Epoch 40/150
768/768 [==============================] - 0s 371us/step - loss: 0.4709 - acc: 0.7917
Epoch 41/150
768/768 [==============================] - 0s 367us/step - loss: 0.4850 - acc: 0.7812
Epoch 42/150
768/768 [==============================] - 0s 374us/step - loss: 0.4851 - acc: 0.7604
Epoch 43/150
768/768 [==============================] - 0s 345us/step - loss: 0.4756 - acc: 0.7734
Epoch 44/150
768/768 [==============================] - 0s 353us/step - loss: 0.4765 - acc: 0.7721
Epoch 45/150
768/768 [==============================] - 0s 349us/step - loss: 0.4812 - acc: 0.7695
Epoch 46/150
768/768 [==============================] - 0s 349us/step - loss: 0.4844 - acc: 0.7630
Epoch 47/150
768/768 [==============================] - 0s 353us/step - loss: 0.4709 - acc: 0.7839
Epoch 48/150
768/768 [==============================] - 0s 348us/step - loss: 0.4860 - acc: 0.7721
Epoch 49/150
768/768 [==============================] - 0s 353us/step - loss: 0.4741 - acc: 0.7669
Epoch 50/150
768/768 [==============================] - 0s 373us/step - loss: 0.4833 - acc: 0.7747
Epoch 51/150
768/768 [==============================] - 0s 389us/step - loss: 0.4753 - acc: 0.7799
Epoch 52/150
768/768 [==============================] - 0s 356us/step - loss: 0.4740 - acc: 0.7760
Epoch 53/150
768/768 [==============================] - 0s 344us/step - loss: 0.4847 - acc: 0.7747
Epoch 54/150
768/768 [==============================] - 0s 335us/step - loss: 0.4684 - acc: 0.7747
Epoch 55/150
768/768 [==============================] - 0s 332us/step - loss: 0.4698 - acc: 0.7812
Epoch 56/150
768/768 [==============================] - 0s 341us/step - loss: 0.4764 - acc: 0.7695
Epoch 57/150
768/768 [==============================] - 0s 341us/step - loss: 0.4672 - acc: 0.7721
Epoch 58/150
768/768 [==============================] - 0s 377us/step - loss: 0.4598 - acc: 0.7747
Epoch 59/150
768/768 [==============================] - 0s 394us/step - loss: 0.4746 - acc: 0.7682
Epoch 60/150
768/768 [==============================] - 0s 344us/step - loss: 0.4762 - acc: 0.7839
Epoch 61/150
768/768 [==============================] - 0s 344us/step - loss: 0.4669 - acc: 0.7747
Epoch 62/150
768/768 [==============================] - 0s 346us/step - loss: 0.4597 - acc: 0.7852
Epoch 63/150
768/768 [==============================] - 0s 359us/step - loss: 0.4578 - acc: 0.7904
Epoch 64/150
768/768 [==============================] - 0s 363us/step - loss: 0.4617 - acc: 0.7826
Epoch 65/150
768/768 [==============================] - 0s 350us/step - loss: 0.4622 - acc: 0.7812
Epoch 66/150
768/768 [==============================] - 0s 342us/step - loss: 0.4709 - acc: 0.7839
Epoch 67/150
768/768 [==============================] - 0s 334us/step - loss: 0.4549 - acc: 0.7865
Epoch 68/150
768/768 [==============================] - 0s 345us/step - loss: 0.4578 - acc: 0.7891
Epoch 69/150
768/768 [==============================] - 0s 347us/step - loss: 0.4575 - acc: 0.7826
Epoch 70/150
768/768 [==============================] - 0s 352us/step - loss: 0.4612 - acc: 0.7826
Epoch 71/150
768/768 [==============================] - 0s 340us/step - loss: 0.4592 - acc: 0.7773
Epoch 72/150
768/768 [==============================] - 0s 357us/step - loss: 0.4614 - acc: 0.7826
Epoch 73/150
768/768 [==============================] - 0s 364us/step - loss: 0.4555 - acc: 0.7826
Epoch 74/150
768/768 [==============================] - 0s 369us/step - loss: 0.4520 - acc: 0.7904
Epoch 75/150
768/768 [==============================] - 0s 346us/step - loss: 0.4654 - acc: 0.7943
Epoch 76/150
768/768 [==============================] - 0s 376us/step - loss: 0.4558 - acc: 0.7786
Epoch 77/150
768/768 [==============================] - 0s 385us/step - loss: 0.4489 - acc: 0.7773
Epoch 78/150
768/768 [==============================] - 0s 359us/step - loss: 0.4574 - acc: 0.7917
Epoch 79/150
768/768 [==============================] - 0s 362us/step - loss: 0.4557 - acc: 0.7812
Epoch 80/150
768/768 [==============================] - 0s 342us/step - loss: 0.4463 - acc: 0.7956
Epoch 81/150
768/768 [==============================] - 0s 349us/step - loss: 0.4467 - acc: 0.7982
Epoch 82/150
768/768 [==============================] - 0s 366us/step - loss: 0.4595 - acc: 0.7865
Epoch 83/150
768/768 [==============================] - 0s 336us/step - loss: 0.4549 - acc: 0.7956
Epoch 84/150
768/768 [==============================] - 0s 350us/step - loss: 0.4509 - acc: 0.7812
Epoch 85/150
768/768 [==============================] - 0s 348us/step - loss: 0.4490 - acc: 0.7891
Epoch 86/150
768/768 [==============================] - 0s 347us/step - loss: 0.4573 - acc: 0.8047
Epoch 87/150
768/768 [==============================] - 0s 336us/step - loss: 0.4465 - acc: 0.7969
Epoch 88/150
768/768 [==============================] - 0s 329us/step - loss: 0.4503 - acc: 0.7630
Epoch 89/150
768/768 [==============================] - 0s 350us/step - loss: 0.4557 - acc: 0.7930
Epoch 90/150
768/768 [==============================] - 0s 356us/step - loss: 0.4510 - acc: 0.7891
Epoch 91/150
768/768 [==============================] - 0s 356us/step - loss: 0.4547 - acc: 0.7917
Epoch 92/150
768/768 [==============================] - 0s 373us/step - loss: 0.4506 - acc: 0.8060
Epoch 93/150
768/768 [==============================] - 0s 344us/step - loss: 0.4486 - acc: 0.7826
Epoch 94/150
768/768 [==============================] - 0s 339us/step - loss: 0.4425 - acc: 0.7956
Epoch 95/150
768/768 [==============================] - 0s 366us/step - loss: 0.4654 - acc: 0.7865
Epoch 96/150
768/768 [==============================] - 0s 356us/step - loss: 0.4474 - acc: 0.7891
Epoch 97/150
768/768 [==============================] - 0s 358us/step - loss: 0.4413 - acc: 0.7943
Epoch 98/150
768/768 [==============================] - 0s 358us/step - loss: 0.4434 - acc: 0.8034
Epoch 99/150
768/768 [==============================] - 0s 358us/step - loss: 0.4457 - acc: 0.7982
Epoch 100/150
768/768 [==============================] - 0s 370us/step - loss: 0.4394 - acc: 0.7878
Epoch 101/150
768/768 [==============================] - 0s 357us/step - loss: 0.4480 - acc: 0.7891
Epoch 102/150
768/768 [==============================] - 0s 352us/step - loss: 0.4457 - acc: 0.7930
Epoch 103/150
768/768 [==============================] - 0s 350us/step - loss: 0.4298 - acc: 0.7956
Epoch 104/150
768/768 [==============================] - 0s 356us/step - loss: 0.4470 - acc: 0.7812
Epoch 105/150
768/768 [==============================] - 0s 358us/step - loss: 0.4504 - acc: 0.7826
Epoch 106/150
768/768 [==============================] - 0s 369us/step - loss: 0.4340 - acc: 0.7956
Epoch 107/150
768/768 [==============================] - 0s 347us/step - loss: 0.4339 - acc: 0.7878
Epoch 108/150
768/768 [==============================] - 0s 346us/step - loss: 0.4543 - acc: 0.7812
Epoch 109/150
768/768 [==============================] - 0s 347us/step - loss: 0.4363 - acc: 0.7930
Epoch 110/150
768/768 [==============================] - 0s 343us/step - loss: 0.4387 - acc: 0.7786
Epoch 111/150
768/768 [==============================] - 0s 354us/step - loss: 0.4409 - acc: 0.7891
Epoch 112/150
768/768 [==============================] - 0s 366us/step - loss: 0.4557 - acc: 0.7786
Epoch 113/150
768/768 [==============================] - 0s 352us/step - loss: 0.4395 - acc: 0.7995
Epoch 114/150
768/768 [==============================] - 0s 340us/step - loss: 0.4387 - acc: 0.8047
Epoch 115/150
768/768 [==============================] - 0s 355us/step - loss: 0.4431 - acc: 0.7839
Epoch 116/150
768/768 [==============================] - 0s 350us/step - loss: 0.4425 - acc: 0.7812
Epoch 117/150
768/768 [==============================] - 0s 373us/step - loss: 0.4329 - acc: 0.7904
Epoch 118/150
768/768 [==============================] - 0s 347us/step - loss: 0.4363 - acc: 0.7891
Epoch 119/150
768/768 [==============================] - 0s 336us/step - loss: 0.4532 - acc: 0.7734
Epoch 120/150
768/768 [==============================] - 0s 362us/step - loss: 0.4482 - acc: 0.7826
Epoch 121/150
768/768 [==============================] - 0s 362us/step - loss: 0.4305 - acc: 0.8021
Epoch 122/150
768/768 [==============================] - 0s 365us/step - loss: 0.4296 - acc: 0.7982
Epoch 123/150
768/768 [==============================] - 0s 359us/step - loss: 0.4330 - acc: 0.7904
Epoch 124/150
768/768 [==============================] - 0s 354us/step - loss: 0.4318 - acc: 0.7930
Epoch 125/150
768/768 [==============================] - 0s 353us/step - loss: 0.4385 - acc: 0.7891
Epoch 126/150
768/768 [==============================] - 0s 353us/step - loss: 0.4328 - acc: 0.7904
Epoch 127/150
768/768 [==============================] - 0s 356us/step - loss: 0.4430 - acc: 0.7904
Epoch 128/150
768/768 [==============================] - 0s 381us/step - loss: 0.4525 - acc: 0.7943
Epoch 129/150
768/768 [==============================] - 0s 357us/step - loss: 0.4371 - acc: 0.7969
Epoch 130/150
768/768 [==============================] - 0s 366us/step - loss: 0.4237 - acc: 0.7982
Epoch 131/150
768/768 [==============================] - 0s 353us/step - loss: 0.4318 - acc: 0.7969
Epoch 132/150
768/768 [==============================] - 0s 345us/step - loss: 0.4267 - acc: 0.8047
Epoch 133/150
768/768 [==============================] - 0s 377us/step - loss: 0.4386 - acc: 0.7956
Epoch 134/150
768/768 [==============================] - 0s 353us/step - loss: 0.4150 - acc: 0.8242
Epoch 135/150
768/768 [==============================] - 0s 346us/step - loss: 0.4354 - acc: 0.7943
Epoch 136/150
768/768 [==============================] - 0s 361us/step - loss: 0.4342 - acc: 0.7943
Epoch 137/150
768/768 [==============================] - 0s 358us/step - loss: 0.4333 - acc: 0.7826
Epoch 138/150
768/768 [==============================] - 0s 349us/step - loss: 0.4443 - acc: 0.7878
Epoch 139/150
768/768 [==============================] - 0s 359us/step - loss: 0.4229 - acc: 0.7969
Epoch 140/150
768/768 [==============================] - 0s 358us/step - loss: 0.4382 - acc: 0.7930
Epoch 141/150
768/768 [==============================] - 0s 345us/step - loss: 0.4379 - acc: 0.7982
Epoch 142/150
768/768 [==============================] - 0s 363us/step - loss: 0.4345 - acc: 0.7839
Epoch 143/150
768/768 [==============================] - 0s 352us/step - loss: 0.4330 - acc: 0.7917
Epoch 144/150
768/768 [==============================] - 0s 372us/step - loss: 0.4351 - acc: 0.7799
Epoch 145/150
768/768 [==============================] - 0s 363us/step - loss: 0.4214 - acc: 0.8099
Epoch 146/150
768/768 [==============================] - 0s 364us/step - loss: 0.4374 - acc: 0.7865
Epoch 147/150
768/768 [==============================] - 0s 356us/step - loss: 0.4249 - acc: 0.8086
Epoch 148/150
768/768 [==============================] - 0s 355us/step - loss: 0.4271 - acc: 0.7995
Epoch 149/150
768/768 [==============================] - 0s 363us/step - loss: 0.4330 - acc: 0.7995
Epoch 150/150
768/768 [==============================] - 0s 357us/step - loss: 0.4300 - acc: 0.7904
Out[26]:
<keras.callbacks.History at 0x7f575f7ad9e8>

The evaluation is available via the evaluate method, which returns the loss and the metrics specified during compilation. In our case we print out the accuracy:


In [11]:
scores = model.evaluate(X, Y)
print("\n%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))


768/768 [==============================] - 0s 49us/step

acc: 78.52%
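
Note that this accuracy is measured on the very data the model was trained on and is therefore optimistic. Below is a minimal sketch of an evaluation on a held-out test set, assuming scikit-learn is installed (the 80/20 split and the fresh test_model are my additions, not part of Brownlee's tutorial):

from sklearn.model_selection import train_test_split

# hold out 20% of the instances for testing
X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=0.2, random_state=7)

# train a fresh copy of the same architecture on the training portion only
test_model = Sequential()
test_model.add(Dense(12, input_dim=8, activation='relu'))
test_model.add(Dense(8, activation='relu'))
test_model.add(Dense(1, activation='sigmoid'))
test_model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
test_model.fit(X_train, Y_train, epochs=150, batch_size=4, verbose=0)

# evaluate on the unseen 20%
loss, acc = test_model.evaluate(X_test, Y_test)
print("held-out accuracy: %.2f%%" % (acc * 100))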

We can now make predictions by calling the predict method with the input matrix as its argument. In this case we are using the training data to predict the output classes, which is in general not a good idea; here it just serves the purpose of showing how the methods are used:


In [12]:
predictions = model.predict(X)

In [13]:
rounded = [round(x[0]) for x in predictions]
print(rounded)


[1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 0.0, 1.0, 1.0, 1.0, 0.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 1.0, 1.0, 0.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 1.0, 0.0, 1.0, 1.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 1.0, 1.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 
1.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0]
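
As a small cross-check (my addition, not part of the original tutorial), the rounded predictions can be compared against Y directly; the fraction of matches should agree with the accuracy reported by evaluate above:

print(numpy.mean(numpy.array(rounded) == Y))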
