Deep Neural Network in Keras

Load dependencies


In [1]:
import keras
from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import Dense
from keras.layers import Dropout # new
from keras.layers.normalization import BatchNormalization # new
from keras import regularizers # new (imported for reference; not used below)
from keras.optimizers import SGD # not used below; the model is compiled with Adam

Load data


In [2]:
(X_train, y_train), (X_test, y_test) = mnist.load_data()


Downloading data from https://s3.amazonaws.com/img-datasets/mnist.npz
11059200/11490434 [===========================>..] - ETA: 0s

Preprocess data


In [3]:
X_train = X_train.reshape(60000, 784).astype('float32') # flatten each 28x28 image to a 784-d vector
X_test = X_test.reshape(10000, 784).astype('float32')

In [4]:
X_train /= 255 # scale pixel intensities from [0, 255] to [0, 1]
X_test /= 255

In [5]:
n_classes = 10
y_train = keras.utils.to_categorical(y_train, n_classes)
y_test = keras.utils.to_categorical(y_test, n_classes)
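`to_categorical` converts each integer label into a one-hot vector of length `n_classes`. A minimal NumPy sketch of the same transformation (illustrative only, not the Keras implementation):

```python
import numpy as np

def to_one_hot(labels, n_classes):
    # Each integer label k becomes a length-n_classes vector with a 1 at index k.
    return np.eye(n_classes, dtype='float32')[labels]

y = np.array([5, 0, 4])  # the first three MNIST training labels
one_hot = to_one_hot(y, 10)
# one_hot[0] has a 1 only at index 5
```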

Design neural network architecture


In [6]:
model = Sequential()

model.add(Dense(64, activation='relu', input_shape=(784,)))
model.add(BatchNormalization())
model.add(Dropout(0.5))

model.add(Dense(64, activation='relu'))
model.add(BatchNormalization())
model.add(Dropout(0.5))

model.add(Dense(64, activation='relu'))
model.add(BatchNormalization())
model.add(Dropout(0.5))

model.add(Dense(10, activation='softmax'))
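During training, `Dropout(0.5)` zeroes each unit's activation with probability 0.5 and rescales the survivors by 1/(1 - 0.5), so the expected activation is unchanged ("inverted dropout"); at inference time the layer passes values through untouched. A NumPy sketch of the training-time behaviour (illustrative, not Keras internals):

```python
import numpy as np

def dropout_train(x, rate, rng):
    # Inverted dropout: zero each element with probability `rate`,
    # scale the survivors by 1 / (1 - rate) to preserve the expected value.
    keep = rng.random(x.shape) >= rate
    return np.where(keep, x / (1.0 - rate), 0.0)

rng = np.random.default_rng(0)
activations = np.ones((4, 64), dtype='float32')
dropped = dropout_train(activations, 0.5, rng)
# Surviving entries equal 2.0 (= 1 / (1 - 0.5)); the rest are 0.
```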

In [7]:
model.summary()


_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_1 (Dense)              (None, 64)                50240     
_________________________________________________________________
batch_normalization_1 (Batch (None, 64)                256       
_________________________________________________________________
dropout_1 (Dropout)          (None, 64)                0         
_________________________________________________________________
dense_2 (Dense)              (None, 64)                4160      
_________________________________________________________________
batch_normalization_2 (Batch (None, 64)                256       
_________________________________________________________________
dropout_2 (Dropout)          (None, 64)                0         
_________________________________________________________________
dense_3 (Dense)              (None, 64)                4160      
_________________________________________________________________
batch_normalization_3 (Batch (None, 64)                256       
_________________________________________________________________
dropout_3 (Dropout)          (None, 64)                0         
_________________________________________________________________
dense_4 (Dense)              (None, 10)                650       
=================================================================
Total params: 59,978
Trainable params: 59,594
Non-trainable params: 384
_________________________________________________________________
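The parameter counts above can be verified by hand: a `Dense` layer has (inputs x units) + units parameters (weights plus biases), and each `BatchNormalization` layer has 4 x units (gamma and beta are trainable; the moving mean and variance are not). A quick check:

```python
def dense_params(n_in, n_out):
    # weight matrix plus one bias per output unit
    return n_in * n_out + n_out

bn_params = 4 * 64  # gamma, beta, moving_mean, moving_variance -> 256

total = (dense_params(784, 64)        # dense_1: 50,240
         + 2 * dense_params(64, 64)   # dense_2, dense_3: 4,160 each
         + dense_params(64, 10)       # dense_4: 650
         + 3 * bn_params)             # three BatchNormalization layers
non_trainable = 3 * 2 * 64            # moving mean and variance per BN layer
# total == 59978, non_trainable == 384, trainable == 59594
```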

Configure model


In [8]:
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
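`categorical_crossentropy` compares the softmax output against the one-hot target: for a sample whose true class is c, the loss is -log(p_c), the negative log of the probability assigned to the correct class. A NumPy sketch (illustrative only):

```python
import numpy as np

def categorical_crossentropy(y_true, y_pred, eps=1e-7):
    # y_true: one-hot targets; y_pred: softmax probabilities (rows sum to 1).
    # Clipping avoids log(0) for probabilities that underflow to zero.
    p = np.clip(y_pred, eps, 1.0)
    return -np.sum(y_true * np.log(p), axis=1)

y_true = np.array([[0.0, 0.0, 1.0]])
y_pred = np.array([[0.1, 0.2, 0.7]])
loss = categorical_crossentropy(y_true, y_pred)  # -log(0.7), about 0.357
```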

Train


In [9]:
model.fit(X_train, y_train, validation_data=(X_test, y_test), batch_size=128, epochs=10, verbose=1)


Train on 60000 samples, validate on 10000 samples
Epoch 1/10
60000/60000 [==============================] - 3s - loss: 1.1909 - acc: 0.6221 - val_loss: 0.3125 - val_acc: 0.9085
Epoch 2/10
60000/60000 [==============================] - 3s - loss: 0.5716 - acc: 0.8319 - val_loss: 0.2476 - val_acc: 0.9263
Epoch 3/10
60000/60000 [==============================] - 3s - loss: 0.4562 - acc: 0.8704 - val_loss: 0.2076 - val_acc: 0.9380
Epoch 4/10
60000/60000 [==============================] - 3s - loss: 0.4044 - acc: 0.8878 - val_loss: 0.1841 - val_acc: 0.9461
Epoch 5/10
60000/60000 [==============================] - 3s - loss: 0.3685 - acc: 0.8981 - val_loss: 0.1808 - val_acc: 0.9471
Epoch 6/10
60000/60000 [==============================] - 3s - loss: 0.3438 - acc: 0.9044 - val_loss: 0.1701 - val_acc: 0.9516
Epoch 7/10
60000/60000 [==============================] - 3s - loss: 0.3288 - acc: 0.9089 - val_loss: 0.1648 - val_acc: 0.9531
Epoch 8/10
60000/60000 [==============================] - 3s - loss: 0.3096 - acc: 0.9146 - val_loss: 0.1459 - val_acc: 0.9584
Epoch 9/10
60000/60000 [==============================] - 3s - loss: 0.2981 - acc: 0.9168 - val_loss: 0.1480 - val_acc: 0.9576
Epoch 10/10
60000/60000 [==============================] - 3s - loss: 0.2913 - acc: 0.9194 - val_loss: 0.1475 - val_acc: 0.9575
Out[9]:
<keras.callbacks.History at 0x7fcfb112ee48>
