Intermediate Neural Network in Keras

Load dependencies


In [1]:
import keras
from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import SGD


Using TensorFlow backend.

Load data


In [2]:
(X_train, y_train), (X_test, y_test) = mnist.load_data()
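
The raw MNIST arrays hold 60,000 training and 10,000 test images of 28x28 pixels, with integer class labels; a quick shape check confirms this:


In [ ]:
# sanity-check the raw array shapes
X_train.shape, y_train.shape, X_test.shape, y_test.shape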

Preprocess data


In [3]:
# flatten each 28x28 image into a 784-element vector of 32-bit floats
X_train = X_train.reshape(60000, 784).astype('float32')
X_test = X_test.reshape(10000, 784).astype('float32')

In [4]:
# scale pixel intensities from [0, 255] down to [0, 1]
X_train /= 255
X_test /= 255

In [5]:
# one-hot encode the integer labels into 10-element vectors
n_classes = 10
y_train = keras.utils.to_categorical(y_train, n_classes)
y_test = keras.utils.to_categorical(y_test, n_classes)
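
As a quick illustration, a single label such as the digit 5 becomes a 10-element vector with a 1 at index 5:


In [ ]:
# illustrative check of the one-hot encoding
keras.utils.to_categorical(5, n_classes)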

Design neural network architecture


In [6]:
model = Sequential()
model.add(Dense(64, activation='relu', input_shape=(784,)))  # first hidden layer: 64 ReLU neurons over the 784 pixel inputs
model.add(Dense(64, activation='relu'))                      # second hidden layer: 64 ReLU neurons
model.add(Dense(10, activation='softmax'))                   # output layer: softmax over the 10 digit classes

In [7]:
model.summary()


_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_1 (Dense)              (None, 64)                50240     
_________________________________________________________________
dense_2 (Dense)              (None, 64)                4160      
_________________________________________________________________
dense_3 (Dense)              (None, 10)                650       
=================================================================
Total params: 55,050
Trainable params: 55,050
Non-trainable params: 0
_________________________________________________________________
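
Each Dense layer's parameter count is (number of inputs x number of units) weights plus one bias per unit, which the next three cells verify by hand: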

In [9]:
(64*784)+64


Out[9]:
50240

In [8]:
(64*64)+64


Out[8]:
4160

In [10]:
(10*64)+10


Out[10]:
650

Configure model


In [11]:
# cross-entropy loss, optimized with plain stochastic gradient descent at learning rate 0.01
model.compile(loss='categorical_crossentropy', optimizer=SGD(lr=0.01), metrics=['accuracy'])
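
On newer Keras/TensorFlow releases the SGD argument is learning_rate rather than lr; an equivalent configuration would be:


In [ ]:
# equivalent compile call for newer Keras versions (sketch)
model.compile(loss='categorical_crossentropy',
              optimizer=SGD(learning_rate=0.01),
              metrics=['accuracy'])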

Train


In [12]:
# train for 10 epochs in mini-batches of 128, validating on the test set after each epoch
model.fit(X_train, y_train, validation_data=(X_test, y_test), batch_size=128, epochs=10, verbose=1)


Train on 60000 samples, validate on 10000 samples
Epoch 1/10
60000/60000 [==============================] - 1s - loss: 1.3285 - acc: 0.6546 - val_loss: 0.6498 - val_acc: 0.8395
Epoch 2/10
60000/60000 [==============================] - 1s - loss: 0.5260 - acc: 0.8626 - val_loss: 0.4210 - val_acc: 0.8868
Epoch 3/10
60000/60000 [==============================] - 1s - loss: 0.4026 - acc: 0.8880 - val_loss: 0.3545 - val_acc: 0.9000
Epoch 4/10
60000/60000 [==============================] - 1s - loss: 0.3545 - acc: 0.8998 - val_loss: 0.3221 - val_acc: 0.9092
Epoch 5/10
60000/60000 [==============================] - 1s - loss: 0.3262 - acc: 0.9074 - val_loss: 0.2993 - val_acc: 0.9140
Epoch 6/10
60000/60000 [==============================] - 1s - loss: 0.3059 - acc: 0.9134 - val_loss: 0.2845 - val_acc: 0.9186
Epoch 7/10
60000/60000 [==============================] - 1s - loss: 0.2902 - acc: 0.9177 - val_loss: 0.2723 - val_acc: 0.9233
Epoch 8/10
60000/60000 [==============================] - 1s - loss: 0.2768 - acc: 0.9213 - val_loss: 0.2619 - val_acc: 0.9239
Epoch 9/10
60000/60000 [==============================] - 1s - loss: 0.2658 - acc: 0.9247 - val_loss: 0.2521 - val_acc: 0.9265
Epoch 10/10
60000/60000 [==============================] - 1s - loss: 0.2558 - acc: 0.9276 - val_loss: 0.2432 - val_acc: 0.9297
Out[12]:
<keras.callbacks.History at 0x7f51b5340da0>
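
As a natural follow-up, the trained model can also be scored on the held-out test set with model.evaluate, for example:


In [ ]:
# sketch: report loss and accuracy on the test set
test_loss, test_acc = model.evaluate(X_test, y_test, verbose=0)
print('Test loss:', test_loss)
print('Test accuracy:', test_acc)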

In [ ]: