In [1]:
!wget http://archive.ics.uci.edu/ml/machine-learning-databases/ionosphere/ionosphere.data


--2018-01-02 11:50:26--  http://archive.ics.uci.edu/ml/machine-learning-databases/ionosphere/ionosphere.data
Resolving archive.ics.uci.edu... 128.195.10.249
Connecting to archive.ics.uci.edu|128.195.10.249|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 76467 (75K) [text/plain]
Saving to: ‘ionosphere.data’

ionosphere.data     100%[===================>]  74.67K   197KB/s    in 0.4s    

2018-01-02 11:50:27 (197 KB/s) - ‘ionosphere.data’ saved [76467/76467]

  • With SGD's `decay`, the learning rate changes each epoch according to:
  • LearningRate = LearningRate * 1 / (1 + decay * epoch)
  • A common rule of thumb is to set Decay = LearningRate / Epochs
  • For example, with an initial learning rate of 0.1 trained for 50 epochs, decay = 0.1 / 50 = 0.002
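
The decay rule above can be sketched in plain Python (a minimal illustration of the formula; `time_based_lr` is a hypothetical helper, not part of Keras):

```python
initial_lr = 0.1
epochs = 50
decay = initial_lr / epochs  # 0.002, the rule of thumb above

def time_based_lr(epoch, lr=initial_lr, decay=decay):
    # learning rate after `epoch` epochs: lr * 1 / (1 + decay * epoch)
    return lr / (1.0 + decay * epoch)

for e in (0, 10, 25, 49):
    print(e, round(time_based_lr(e), 4))
```

With these settings the rate decays only gently, from 0.1 at epoch 0 down to roughly 0.091 by epoch 49.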

In [39]:
from pandas import read_csv
import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import SGD
from sklearn.preprocessing import LabelEncoder
from keras.utils.vis_utils import plot_model

seed = 7
np.random.seed(seed)

# load dataset
dataframe = read_csv('ionosphere.data', header=None)
dataset = dataframe.values
X = dataset[:, 0:34].astype(float)
Y = dataset[:, 34]  # g or b

# encode class values as integers
# g or b => 0 or 1
encoder = LabelEncoder()
encoder.fit(Y)
Y = encoder.transform(Y)

# create model
model = Sequential()
model.add(Dense(34, input_dim=34, kernel_initializer='normal', activation='relu'))
model.add(Dense(1, kernel_initializer='normal', activation='sigmoid'))
#model.summary()
#plot_model(model, show_shapes=True)

# compile model
epochs = 50
learning_rate = 0.1
decay_rate = learning_rate / epochs
momentum = 0.8
sgd = SGD(lr=learning_rate, momentum=momentum, decay=decay_rate, nesterov=False)
model.compile(loss='binary_crossentropy', optimizer=sgd, metrics=['accuracy'])

model.fit(X, Y, validation_split=0.33, epochs=epochs, batch_size=28, verbose=2)


Train on 235 samples, validate on 116 samples
Epoch 1/50
0s - loss: 0.6813 - acc: 0.6468 - val_loss: 0.6373 - val_acc: 0.8621
Epoch 2/50
0s - loss: 0.6361 - acc: 0.7319 - val_loss: 0.5243 - val_acc: 0.8276
Epoch 3/50
0s - loss: 0.5557 - acc: 0.8213 - val_loss: 0.4741 - val_acc: 0.8448
Epoch 4/50
0s - loss: 0.4649 - acc: 0.8383 - val_loss: 0.4420 - val_acc: 0.9310
Epoch 5/50
0s - loss: 0.3822 - acc: 0.8681 - val_loss: 0.2777 - val_acc: 0.9483
Epoch 6/50
0s - loss: 0.3143 - acc: 0.8809 - val_loss: 0.3883 - val_acc: 0.8879
Epoch 7/50
0s - loss: 0.2749 - acc: 0.9106 - val_loss: 0.2288 - val_acc: 0.9483
Epoch 8/50
0s - loss: 0.2398 - acc: 0.9106 - val_loss: 0.1440 - val_acc: 0.9569
Epoch 9/50
0s - loss: 0.2443 - acc: 0.9064 - val_loss: 0.2047 - val_acc: 0.9569
Epoch 10/50
0s - loss: 0.1996 - acc: 0.9149 - val_loss: 0.2547 - val_acc: 0.9224
Epoch 11/50
0s - loss: 0.1910 - acc: 0.9191 - val_loss: 0.1917 - val_acc: 0.9483
Epoch 12/50
0s - loss: 0.1716 - acc: 0.9404 - val_loss: 0.1135 - val_acc: 0.9655
Epoch 13/50
0s - loss: 0.1789 - acc: 0.9319 - val_loss: 0.1030 - val_acc: 0.9741
Epoch 14/50
0s - loss: 0.1662 - acc: 0.9362 - val_loss: 0.1676 - val_acc: 0.9569
Epoch 15/50
0s - loss: 0.1422 - acc: 0.9489 - val_loss: 0.0959 - val_acc: 0.9828
Epoch 16/50
0s - loss: 0.1519 - acc: 0.9447 - val_loss: 0.1794 - val_acc: 0.9569
Epoch 17/50
0s - loss: 0.1468 - acc: 0.9489 - val_loss: 0.1461 - val_acc: 0.9741
Epoch 18/50
0s - loss: 0.1355 - acc: 0.9489 - val_loss: 0.1201 - val_acc: 0.9828
Epoch 19/50
0s - loss: 0.1276 - acc: 0.9532 - val_loss: 0.0909 - val_acc: 0.9914
Epoch 20/50
0s - loss: 0.1209 - acc: 0.9660 - val_loss: 0.1102 - val_acc: 0.9914
Epoch 21/50
0s - loss: 0.1151 - acc: 0.9660 - val_loss: 0.1045 - val_acc: 0.9914
Epoch 22/50
0s - loss: 0.1084 - acc: 0.9574 - val_loss: 0.1082 - val_acc: 0.9914
Epoch 23/50
0s - loss: 0.1092 - acc: 0.9617 - val_loss: 0.1056 - val_acc: 0.9828
Epoch 24/50
0s - loss: 0.1008 - acc: 0.9660 - val_loss: 0.0757 - val_acc: 0.9828
Epoch 25/50
0s - loss: 0.1109 - acc: 0.9617 - val_loss: 0.1068 - val_acc: 0.9914
Epoch 26/50
0s - loss: 0.0958 - acc: 0.9617 - val_loss: 0.0861 - val_acc: 0.9914
Epoch 27/50
0s - loss: 0.0951 - acc: 0.9660 - val_loss: 0.0895 - val_acc: 0.9828
Epoch 28/50
0s - loss: 0.0932 - acc: 0.9745 - val_loss: 0.0897 - val_acc: 0.9914
Epoch 29/50
0s - loss: 0.0847 - acc: 0.9787 - val_loss: 0.0885 - val_acc: 0.9914
Epoch 30/50
0s - loss: 0.0867 - acc: 0.9745 - val_loss: 0.0882 - val_acc: 0.9914
Epoch 31/50
0s - loss: 0.0845 - acc: 0.9745 - val_loss: 0.0840 - val_acc: 0.9914
Epoch 32/50
0s - loss: 0.0809 - acc: 0.9830 - val_loss: 0.0849 - val_acc: 0.9914
Epoch 33/50
0s - loss: 0.0765 - acc: 0.9830 - val_loss: 0.0872 - val_acc: 0.9914
Epoch 34/50
0s - loss: 0.0796 - acc: 0.9787 - val_loss: 0.0952 - val_acc: 0.9914
Epoch 35/50
0s - loss: 0.0753 - acc: 0.9830 - val_loss: 0.0762 - val_acc: 0.9828
Epoch 36/50
0s - loss: 0.0751 - acc: 0.9830 - val_loss: 0.0770 - val_acc: 0.9828
Epoch 37/50
0s - loss: 0.0704 - acc: 0.9830 - val_loss: 0.0884 - val_acc: 0.9828
Epoch 38/50
0s - loss: 0.0713 - acc: 0.9787 - val_loss: 0.0728 - val_acc: 0.9828
Epoch 39/50
0s - loss: 0.0740 - acc: 0.9787 - val_loss: 0.0874 - val_acc: 0.9828
Epoch 40/50
0s - loss: 0.0692 - acc: 0.9745 - val_loss: 0.0730 - val_acc: 0.9828
Epoch 41/50
0s - loss: 0.0652 - acc: 0.9787 - val_loss: 0.0836 - val_acc: 0.9828
Epoch 42/50
0s - loss: 0.0665 - acc: 0.9830 - val_loss: 0.0861 - val_acc: 0.9914
Epoch 43/50
0s - loss: 0.0663 - acc: 0.9830 - val_loss: 0.0757 - val_acc: 0.9828
Epoch 44/50
0s - loss: 0.0657 - acc: 0.9787 - val_loss: 0.0690 - val_acc: 0.9828
Epoch 45/50
0s - loss: 0.0619 - acc: 0.9830 - val_loss: 0.0916 - val_acc: 0.9828
Epoch 46/50
0s - loss: 0.0687 - acc: 0.9830 - val_loss: 0.0697 - val_acc: 0.9828
Epoch 47/50
0s - loss: 0.0666 - acc: 0.9872 - val_loss: 0.0617 - val_acc: 0.9828
Epoch 48/50
0s - loss: 0.0632 - acc: 0.9830 - val_loss: 0.0819 - val_acc: 0.9914
Epoch 49/50
0s - loss: 0.0588 - acc: 0.9830 - val_loss: 0.0776 - val_acc: 0.9828
Epoch 50/50
0s - loss: 0.0590 - acc: 0.9872 - val_loss: 0.0640 - val_acc: 0.9828
Out[39]:
<keras.callbacks.History at 0x1c1fbc9780>

In [28]:
from IPython.display import Image
Image('model.png')


Out[28]:

In [43]:
from pandas import read_csv
import numpy as np
import math
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import SGD
from sklearn.preprocessing import LabelEncoder
from keras.callbacks import LearningRateScheduler
from keras.utils.vis_utils import plot_model

def step_decay(epoch):
    # step decay: halve the learning rate every epochs_drop epochs
    initial_lrate = 0.1
    drop = 0.5
    epochs_drop = 10.0
    lrate = initial_lrate * math.pow(drop, math.floor((1 + epoch) / epochs_drop))
    return lrate

seed = 7
np.random.seed(seed)

# load dataset
dataframe = read_csv('ionosphere.data', header=None)
dataset = dataframe.values
X = dataset[:, 0:34].astype(float)
Y = dataset[:, 34]  # g or b

# encode class values as integers
# g or b => 0 or 1
encoder = LabelEncoder()
encoder.fit(Y)
Y = encoder.transform(Y)

# create model
model = Sequential()
model.add(Dense(34, input_dim=34, kernel_initializer='normal', activation='relu'))
model.add(Dense(1, kernel_initializer='normal', activation='sigmoid'))
#model.summary()
#plot_model(model, show_shapes=True)

# compile model
sgd = SGD(lr=0.0, momentum=0.9, decay=0.0, nesterov=False)  # lr is supplied each epoch by the LearningRateScheduler callback
model.compile(loss='binary_crossentropy', optimizer=sgd, metrics=['accuracy'])

# learning schedule callback
lrate = LearningRateScheduler(step_decay)

epochs = 50
model.fit(X, Y, validation_split=0.33, epochs=epochs, batch_size=28,
          callbacks=[lrate], verbose=2)


Train on 235 samples, validate on 116 samples
Epoch 1/50
0s - loss: 0.6803 - acc: 0.6468 - val_loss: 0.6189 - val_acc: 0.9138
Epoch 2/50
0s - loss: 0.6188 - acc: 0.7277 - val_loss: 0.4750 - val_acc: 0.8879
Epoch 3/50
0s - loss: 0.4954 - acc: 0.8255 - val_loss: 0.3643 - val_acc: 0.9483
Epoch 4/50
0s - loss: 0.3604 - acc: 0.8596 - val_loss: 0.3756 - val_acc: 0.8879
Epoch 5/50
0s - loss: 0.2794 - acc: 0.8809 - val_loss: 0.1522 - val_acc: 0.9655
Epoch 6/50
0s - loss: 0.2156 - acc: 0.9191 - val_loss: 0.2205 - val_acc: 0.9397
Epoch 7/50
0s - loss: 0.1791 - acc: 0.9362 - val_loss: 0.1463 - val_acc: 0.9655
Epoch 8/50
0s - loss: 0.1587 - acc: 0.9319 - val_loss: 0.0846 - val_acc: 0.9655
Epoch 9/50
0s - loss: 0.1761 - acc: 0.9362 - val_loss: 0.1572 - val_acc: 0.9655
Epoch 10/50
0s - loss: 0.1251 - acc: 0.9617 - val_loss: 0.0939 - val_acc: 0.9914
Epoch 11/50
0s - loss: 0.1123 - acc: 0.9574 - val_loss: 0.0971 - val_acc: 0.9914
Epoch 12/50
0s - loss: 0.1040 - acc: 0.9574 - val_loss: 0.0807 - val_acc: 0.9914
Epoch 13/50
0s - loss: 0.1089 - acc: 0.9617 - val_loss: 0.1002 - val_acc: 0.9914
Epoch 14/50
0s - loss: 0.0976 - acc: 0.9702 - val_loss: 0.0827 - val_acc: 0.9914
Epoch 15/50
0s - loss: 0.0952 - acc: 0.9660 - val_loss: 0.0757 - val_acc: 0.9914
Epoch 16/50
0s - loss: 0.0986 - acc: 0.9660 - val_loss: 0.0871 - val_acc: 0.9914
Epoch 17/50
0s - loss: 0.0872 - acc: 0.9702 - val_loss: 0.0875 - val_acc: 0.9914
Epoch 18/50
0s - loss: 0.0814 - acc: 0.9787 - val_loss: 0.0778 - val_acc: 0.9914
Epoch 19/50
0s - loss: 0.0780 - acc: 0.9745 - val_loss: 0.0798 - val_acc: 0.9914
Epoch 20/50
0s - loss: 0.0762 - acc: 0.9830 - val_loss: 0.0617 - val_acc: 0.9914
Epoch 21/50
0s - loss: 0.0752 - acc: 0.9787 - val_loss: 0.0832 - val_acc: 0.9914
Epoch 22/50
0s - loss: 0.0715 - acc: 0.9830 - val_loss: 0.0749 - val_acc: 0.9914
Epoch 23/50
0s - loss: 0.0710 - acc: 0.9830 - val_loss: 0.0676 - val_acc: 0.9914
Epoch 24/50
0s - loss: 0.0691 - acc: 0.9830 - val_loss: 0.0801 - val_acc: 0.9914
Epoch 25/50
0s - loss: 0.0667 - acc: 0.9830 - val_loss: 0.0678 - val_acc: 0.9914
Epoch 26/50
0s - loss: 0.0663 - acc: 0.9830 - val_loss: 0.0742 - val_acc: 0.9914
Epoch 27/50
0s - loss: 0.0638 - acc: 0.9830 - val_loss: 0.0713 - val_acc: 0.9914
Epoch 28/50
0s - loss: 0.0637 - acc: 0.9830 - val_loss: 0.0701 - val_acc: 0.9914
Epoch 29/50
0s - loss: 0.0622 - acc: 0.9830 - val_loss: 0.0718 - val_acc: 0.9914
Epoch 30/50
0s - loss: 0.0613 - acc: 0.9830 - val_loss: 0.0672 - val_acc: 0.9914
Epoch 31/50
0s - loss: 0.0607 - acc: 0.9872 - val_loss: 0.0719 - val_acc: 0.9914
Epoch 32/50
0s - loss: 0.0598 - acc: 0.9872 - val_loss: 0.0709 - val_acc: 0.9914
Epoch 33/50
0s - loss: 0.0591 - acc: 0.9830 - val_loss: 0.0679 - val_acc: 0.9914
Epoch 34/50
0s - loss: 0.0591 - acc: 0.9830 - val_loss: 0.0689 - val_acc: 0.9914
Epoch 35/50
0s - loss: 0.0590 - acc: 0.9872 - val_loss: 0.0738 - val_acc: 0.9914
Epoch 36/50
0s - loss: 0.0577 - acc: 0.9872 - val_loss: 0.0695 - val_acc: 0.9914
Epoch 37/50
0s - loss: 0.0574 - acc: 0.9830 - val_loss: 0.0662 - val_acc: 0.9914
Epoch 38/50
0s - loss: 0.0575 - acc: 0.9830 - val_loss: 0.0684 - val_acc: 0.9914
Epoch 39/50
0s - loss: 0.0570 - acc: 0.9830 - val_loss: 0.0635 - val_acc: 0.9914
Epoch 40/50
0s - loss: 0.0562 - acc: 0.9830 - val_loss: 0.0654 - val_acc: 0.9914
Epoch 41/50
0s - loss: 0.0555 - acc: 0.9872 - val_loss: 0.0700 - val_acc: 0.9914
Epoch 42/50
0s - loss: 0.0556 - acc: 0.9872 - val_loss: 0.0685 - val_acc: 0.9914
Epoch 43/50
0s - loss: 0.0554 - acc: 0.9872 - val_loss: 0.0670 - val_acc: 0.9914
Epoch 44/50
0s - loss: 0.0553 - acc: 0.9872 - val_loss: 0.0670 - val_acc: 0.9914
Epoch 45/50
0s - loss: 0.0552 - acc: 0.9872 - val_loss: 0.0647 - val_acc: 0.9914
Epoch 46/50
0s - loss: 0.0551 - acc: 0.9872 - val_loss: 0.0646 - val_acc: 0.9914
Epoch 47/50
0s - loss: 0.0555 - acc: 0.9872 - val_loss: 0.0696 - val_acc: 0.9914
Epoch 48/50
0s - loss: 0.0544 - acc: 0.9872 - val_loss: 0.0690 - val_acc: 0.9914
Epoch 49/50
0s - loss: 0.0543 - acc: 0.9872 - val_loss: 0.0654 - val_acc: 0.9914
Epoch 50/50
0s - loss: 0.0541 - acc: 0.9872 - val_loss: 0.0685 - val_acc: 0.9914
Out[43]:
<keras.callbacks.History at 0x1c22c78a58>
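
Evaluating the step schedule on its own makes the drops visible: the rate stays flat for roughly ten epochs, then halves (a standalone re-statement of the `step_decay` function above):

```python
import math

def step_decay(epoch):
    # step decay: halve the learning rate every epochs_drop epochs
    initial_lrate = 0.1
    drop = 0.5
    epochs_drop = 10.0
    return initial_lrate * math.pow(drop, math.floor((1 + epoch) / epochs_drop))

# epochs 0-8 -> 0.1, epoch 9 -> 0.05, epoch 19 -> 0.025, ..., epoch 49 -> 0.003125
schedule = [step_decay(e) for e in range(50)]
```

Compared with the time-based decay of the first run (0.1 down to about 0.091), this schedule ends two orders of magnitude lower, which is why the later epochs change so little.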

In [ ]: