In [0]:
# generate circles
from sklearn.datasets import make_circles
X, y = make_circles(n_samples=1000, noise=0.1, random_state=1)

We can create a graph of the dataset, plotting the two input variables in X as the x and y coordinates and coloring each point by its class value (0 or 1).


In [3]:
# scatter plot of the circles dataset with points colored by class
from sklearn.datasets import make_circles
from numpy import where
from matplotlib import pyplot
# generate circles
X, y = make_circles(n_samples=1000, noise=0.1, random_state=1)
# select indices of points with each class label
for i in range(2):
	samples_ix = where(y == i)
	pyplot.scatter(X[samples_ix, 0], X[samples_ix, 1], label=str(i))
pyplot.legend()
pyplot.show()


Normally, we would prepare the data scaling using a training dataset and apply it to a test dataset. To keep things simple in this tutorial, we will scale all of the data together before splitting it into train and test sets.


In [0]:
# generate 2d classification dataset
from sklearn.datasets import make_circles
from sklearn.preprocessing import MinMaxScaler
X, y = make_circles(n_samples=1000, noise=0.1, random_state=1)
# scale input data to [-1,1]
scaler = MinMaxScaler(feature_range=(-1, 1))
X = scaler.fit_transform(X)
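
As a quick sanity check (an added aside, not part of the original listing), we can confirm that the transform mapped each input variable onto the range [-1,1]:


In [0]:
# confirm the per-column min and max after scaling
print(X.min(axis=0), X.max(axis=0))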

Half of the data will be used for training and the remaining 500 examples will be used as the test set. In this tutorial, the test set will also serve as the validation dataset so we can get an idea of how the model performs on the holdout set during training.


In [0]:
# split into train and test
n_train = 500
trainX, testX = X[:n_train, :], X[n_train:, :]
trainy, testy = y[:n_train], y[n_train:]
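
As another quick check (again an added aside), we can confirm that the split produced 500 examples in each set:


In [0]:
# confirm the sizes of the train and test sets
print(trainX.shape, testX.shape)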

The model will have an input layer with two inputs, for the two variables in the dataset, one hidden layer with five nodes, and an output layer with one node used to predict the class probability. The hidden layer will use the hyperbolic tangent (tanh) activation function and the output layer will use the logistic (sigmoid) activation function to predict a probability between 0 (class 0) and 1 (class 1).

Using the hyperbolic tangent activation function in hidden layers was best practice in the 1990s and 2000s, as it generally performs better there than the logistic function. It was also good practice to initialize the network weights to small random values from a uniform distribution. Here, we will draw the initial weights uniformly from the range [0.0, 1.0].


In [0]:
# define model
from keras.models import Sequential
from keras.layers import Dense
from keras.initializers import RandomUniform
model = Sequential()
init = RandomUniform(minval=0, maxval=1)
model.add(Dense(5, input_dim=2, activation='tanh', kernel_initializer=init))
model.add(Dense(1, activation='sigmoid', kernel_initializer=init))
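
To verify the stack of layers and the parameter counts, we can print a summary of the defined model (a standard Keras call, added here as an aside):


In [0]:
# summarize the layers and parameter counts
model.summary()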

The model uses the binary cross-entropy loss function and is optimized using stochastic gradient descent with a learning rate of 0.01 and a large momentum of 0.9.


In [0]:
# compile model
from keras.optimizers import SGD
opt = SGD(lr=0.01, momentum=0.9)
model.compile(loss='binary_crossentropy', optimizer=opt, metrics=['accuracy'])

The model is trained for 500 training epochs and the test dataset is evaluated at the end of each epoch along with the training dataset.


In [11]:
# fit model
history = model.fit(trainX, trainy, validation_data=(testX, testy), epochs=500, verbose=0)
print(history)


<keras.callbacks.History object at 0x7f9f71c57b38>
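
The History object stores the per-epoch metrics in its history attribute, a dictionary keyed by metric name. With the older Keras version assumed here the keys are 'loss', 'acc', 'val_loss', and 'val_acc' (newer releases use 'accuracy'); we can list them directly:


In [0]:
# list the metrics recorded during training
print(history.history.keys())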

In [14]:
print(history.history)


{'val_loss': [0.43011984848976137, 0.43057198119163514, 0.4280322415828705, ..., 0.38266069555282595, 0.3805992741584778],
 'val_acc': [0.8320000009536743, 0.8280000009536743, 0.8320000009536743, ..., 0.8400000009536743, 0.8420000009536743],
 'loss': [0.44884377694129945, 0.44434822273254393, 0.4480580677986145, ..., 0.3956096420288086, 0.3949788217544556],
 'acc': [0.8100000009536743, 0.8080000004768372, 0.8079999990463257, ..., 0.8299999995231628, 0.826]}

After the model is fit, it is evaluated on both the train and test datasets, and the accuracy scores are displayed.


In [15]:
# evaluate the model
_, train_acc = model.evaluate(trainX, trainy, verbose=0)
_, test_acc = model.evaluate(testX, testy, verbose=0)
print('Train: %.3f, Test: %.3f' % (train_acc, test_acc))


Train: 0.834, Test: 0.842
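
The fit model can also be used to make predictions on new points; a minimal sketch, reusing testX: the sigmoid output is a probability, which we threshold at 0.5 to recover a crisp class label.


In [0]:
# predict class probabilities and convert them to crisp 0/1 labels
probs = model.predict(testX)
labels = (probs > 0.5).astype('int32')
print(probs[:5].flatten())
print(labels[:5].flatten())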

Finally, the accuracy of the model at the end of each training epoch is graphed as a line plot, showing the dynamics of the model as it learned the problem.


In [19]:
# plot training history
pyplot.plot(history.history['acc'], label='train')
pyplot.plot(history.history['val_acc'], label='test')
pyplot.legend()
pyplot.show()
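
In the same way, we could plot the cross-entropy loss on the train and test sets, using the 'loss' and 'val_loss' keys recorded in the same history (a minimal variant of the plot above):


In [0]:
# plot loss learning curves for the train and test sets
pyplot.plot(history.history['loss'], label='train')
pyplot.plot(history.history['val_loss'], label='test')
pyplot.legend()
pyplot.show()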


We can make the model deeper by inserting five more hidden layers between the first hidden layer and the output layer. The complete example of this deeper MLP is listed below.


In [27]:
# deeper mlp for the two circles classification problem
from sklearn.datasets import make_circles
from sklearn.preprocessing import MinMaxScaler
from keras.layers import Dense
from keras.models import Sequential
from keras.optimizers import SGD
from keras.initializers import RandomUniform
from matplotlib import pyplot
# generate 2d classification dataset
X, y = make_circles(n_samples=1000, noise=0.1, random_state=1)
scaler = MinMaxScaler(feature_range=(-1, 1))
X = scaler.fit_transform(X)
# split into train and test
n_train = 500
trainX, testX = X[:n_train, :], X[n_train:, :]
trainy, testy = y[:n_train], y[n_train:]
# define model
init = RandomUniform(minval=0, maxval=1)
model = Sequential()
model.add(Dense(5, input_dim=2, activation='tanh', kernel_initializer=init))
# add five more hidden layers
for i in range(5):
  model.add(Dense(5, activation='tanh', kernel_initializer=init))
model.add(Dense(1, activation='sigmoid', kernel_initializer=init))
# compile model
opt = SGD(lr=0.01, momentum=0.9)
model.compile(loss='binary_crossentropy', optimizer=opt, metrics=['accuracy'])
# fit model
history = model.fit(trainX, trainy, validation_data=(testX, testy), epochs=500, verbose=0)
# evaluate the model
_, train_acc = model.evaluate(trainX, trainy, verbose=0)
_, test_acc = model.evaluate(testX, testy, verbose=0)
print('Train: %.3f, Test: %.3f' % (train_acc, test_acc))
# plot training history
pyplot.plot(history.history['acc'], label='train')
pyplot.plot(history.history['val_acc'], label='test')
pyplot.legend()
pyplot.show()


Train: 0.514, Test: 0.484

Deeper MLP Model with ReLU for Two Circles Problem

The deeper tanh model fails to learn the problem, achieving accuracy little better than chance; the small gradients flowing back through the stacked saturating tanh layers are a classic symptom of vanishing gradients. When using the rectified linear activation function (or ReLU for short), it is good practice to use the He weight initialization scheme. We can define a deeper MLP using ReLU and He initialization, this time with eleven hidden layers (the first hidden layer plus ten more added in a loop), listed below.


In [26]:
# deeper mlp with relu for the two circles classification problem
from sklearn.datasets import make_circles
from sklearn.preprocessing import MinMaxScaler
from keras.layers import Dense
from keras.models import Sequential
from keras.optimizers import SGD
from keras.initializers import RandomUniform
from matplotlib import pyplot
# generate 2d classification dataset
X, y = make_circles(n_samples=1000, noise=0.1, random_state=1)
scaler = MinMaxScaler(feature_range=(-1, 1))
X = scaler.fit_transform(X)
# split into train and test
n_train = 500
trainX, testX = X[:n_train, :], X[n_train:, :]
trainy, testy = y[:n_train], y[n_train:]
# define model
model = Sequential()
model.add(Dense(5, input_dim=2, activation='relu', kernel_initializer='he_uniform'))
# add ten more hidden layers
for i in range(10):
  model.add(Dense(5, activation='relu', kernel_initializer='he_uniform'))
model.add(Dense(1, activation='sigmoid'))
# compile model
opt = SGD(lr=0.01, momentum=0.9)
model.compile(loss='binary_crossentropy', optimizer=opt, metrics=['accuracy'])
# fit model
history = model.fit(trainX, trainy, validation_data=(testX, testy), epochs=500, verbose=0)
# evaluate the model
_, train_acc = model.evaluate(trainX, trainy, verbose=0)
_, test_acc = model.evaluate(testX, testy, verbose=0)
print('Train: %.3f, Test: %.3f' % (train_acc, test_acc))
# plot training history
pyplot.plot(history.history['acc'], label='train')
pyplot.plot(history.history['val_acc'], label='test')
pyplot.legend()
pyplot.show()


Train: 0.828, Test: 0.840