/home/prajwal/anaconda3/lib/python3.5/site-packages/category_encoders/ordinal.py:178: FutureWarning: reshape is deprecated and will raise in a subsequent release. Please use .values.reshape(...) instead
X[col] = X[col].astype(int).reshape(-1, )
/home/prajwal/anaconda3/lib/python3.5/site-packages/category_encoders/ordinal.py:167: FutureWarning: reshape is deprecated and will raise in a subsequent release. Please use .values.reshape(...) instead
X[switch.get('col')] = X[switch.get('col')].astype(int).reshape(-1, )
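The two FutureWarnings above are raised from inside category_encoders' ordinal.py (the ordinal-encoding step), not from user code, so the ".values.reshape(...)" change the message asks for has to happen in the library itself. A minimal sketch of the practical workaround, assuming the deprecation is otherwise harmless: upgrade category_encoders, or silence FutureWarnings coming from that package for the session.

import warnings

# Hypothetical workaround sketch: the warning originates inside
# category_encoders/ordinal.py, so short of upgrading the package we can
# filter FutureWarnings raised from its modules for this session.
warnings.filterwarnings("ignore", category=FutureWarning, module="category_encoders")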
Training Data (41188, 21)
Test Data (12357, 21)
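A shape report like the two lines above is typically printed right after loading and splitting the data. A minimal sketch, where the file name, split ratio and variable names are assumptions rather than details taken from the original pipeline:

import pandas as pd
from sklearn.model_selection import train_test_split

# Hypothetical illustration only: "bank.csv", the 70/30 split and the
# variable names are assumptions, not taken from the original code.
data = pd.read_csv("bank.csv")
train, test = train_test_split(data, test_size=0.3, random_state=42)
print("Training Data", train.shape)
print("Test Data", test.shape)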
TRAINING BASE MODELS
Epoch 1/10
9609/9609 [==============================] - 0s - loss: 1.8149 - acc: 0.8874
Epoch 2/10
9609/9609 [==============================] - 0s - loss: 1.8149 - acc: 0.8874
Epoch 3/10
9609/9609 [==============================] - 0s - loss: 1.8149 - acc: 0.8874
Epoch 4/10
9609/9609 [==============================] - 0s - loss: 1.8149 - acc: 0.8874
Epoch 5/10
9609/9609 [==============================] - 0s - loss: 1.8149 - acc: 0.8874
Epoch 6/10
9609/9609 [==============================] - 0s - loss: 1.8149 - acc: 0.8874
Epoch 7/10
9609/9609 [==============================] - 0s - loss: 1.8149 - acc: 0.8874
Epoch 8/10
9609/9609 [==============================] - 0s - loss: 1.8149 - acc: 0.8874
Epoch 9/10
9609/9609 [==============================] - 0s - loss: 1.8149 - acc: 0.8874
Epoch 10/10
9609/9609 [==============================] - 0s - loss: 1.8149 - acc: 0.8874
Epoch 1/10
9610/9610 [==============================] - 0s - loss: 0.4594 - acc: 0.8292
Epoch 2/10
9610/9610 [==============================] - 0s - loss: 0.2607 - acc: 0.9048
Epoch 3/10
9610/9610 [==============================] - 0s - loss: 0.2545 - acc: 0.9055
Epoch 4/10
9610/9610 [==============================] - 1s - loss: 0.2627 - acc: 0.9062
Epoch 5/10
9610/9610 [==============================] - 0s - loss: 0.2462 - acc: 0.9046
Epoch 6/10
9610/9610 [==============================] - 1s - loss: 0.2495 - acc: 0.9067
Epoch 7/10
9610/9610 [==============================] - 0s - loss: 0.2429 - acc: 0.9088
Epoch 8/10
9610/9610 [==============================] - 0s - loss: 0.2408 - acc: 0.9086
Epoch 9/10
9610/9610 [==============================] - 0s - loss: 0.2443 - acc: 0.9084
Epoch 10/10
9610/9610 [==============================] - 0s - loss: 0.2404 - acc: 0.9101
Epoch 1/10
9611/9611 [==============================] - 0s - loss: 1.8162 - acc: 0.8873
Epoch 2/10
9611/9611 [==============================] - 0s - loss: 1.8162 - acc: 0.8873
Epoch 3/10
9611/9611 [==============================] - 0s - loss: 1.8162 - acc: 0.8873
Epoch 4/10
9611/9611 [==============================] - 0s - loss: 1.8162 - acc: 0.8873
Epoch 5/10
9611/9611 [==============================] - 0s - loss: 1.8162 - acc: 0.8873
Epoch 6/10
9611/9611 [==============================] - 0s - loss: 1.8162 - acc: 0.8873
Epoch 7/10
9611/9611 [==============================] - 0s - loss: 1.8162 - acc: 0.8873
Epoch 8/10
9611/9611 [==============================] - 0s - loss: 1.8162 - acc: 0.8873
Epoch 9/10
9611/9611 [==============================] - 0s - loss: 1.8162 - acc: 0.8873
Epoch 10/10
9611/9611 [==============================] - 0s - loss: 1.8162 - acc: 0.8873
Epoch 1/10
14415/14415 [==============================] - 0s - loss: 0.2591 - acc: 0.9014
Epoch 2/10
14415/14415 [==============================] - 0s - loss: 0.2504 - acc: 0.9075
Epoch 3/10
14415/14415 [==============================] - 0s - loss: 0.2448 - acc: 0.9095
Epoch 4/10
14415/14415 [==============================] - 0s - loss: 0.2367 - acc: 0.9118
Epoch 5/10
14415/14415 [==============================] - 0s - loss: 0.2347 - acc: 0.9073
Epoch 6/10
14415/14415 [==============================] - 0s - loss: 0.2398 - acc: 0.9109
Epoch 7/10
14415/14415 [==============================] - 0s - loss: 0.2352 - acc: 0.9083
Epoch 8/10
14415/14415 [==============================] - 1s - loss: 0.2438 - acc: 0.9100
Epoch 9/10
14415/14415 [==============================] - 1s - loss: 0.2311 - acc: 0.9072
Epoch 10/10
14415/14415 [==============================] - 0s - loss: 0.2394 - acc: 0.9077
/home/prajwal/anaconda3/lib/python3.5/site-packages/keras/models.py:815: UserWarning: Network returning invalid probability values. The last layer might not normalize predictions into probabilities (like softmax or sigmoid would).
warnings.warn('Network returning invalid probability values. '
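The UserWarning above, together with the loss frozen at 1.8149 and accuracy at 0.8874 for two of the base networks, is consistent with the issue the message describes: the final layer is not squashing its outputs into [0, 1]. A minimal sketch of a Keras base model with an explicit sigmoid output, as the warning suggests; the layer widths, optimizer and variable names are assumptions, not the original architecture:

from keras.models import Sequential
from keras.layers import Dense

# Hypothetical sketch only: layer sizes, optimizer and `n_features` are
# assumptions. The point is the sigmoid on the last layer, which keeps
# predictions in [0, 1] and avoids the "invalid probability values" warning.
def build_base_net(n_features):
    model = Sequential()
    model.add(Dense(32, input_dim=n_features, activation='relu'))
    model.add(Dense(1, activation='sigmoid'))
    model.compile(optimizer='adam', loss='binary_crossentropy',
                  metrics=['accuracy'])
    return model

# model = build_base_net(X_train.shape[1])
# model.fit(X_train, y_train, nb_epoch=10)   # use epochs=10 on Keras 2.x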
TESTING/CROSS VALIDATION BASE MODELS
random_forest           0.940931586899
multi_layer_perceptron  0.920753152178
gradient_boosting       0.946383804379
logistic_regression     0.924926235455
linear_regression       0.923427646435
decision_tree           0.921747654592
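A minimal sketch of how a per-model score report like the one above could be produced. The metric shown (ROC AUC, which also accepts the continuous output of linear_regression) and the names base_models, X_test, y_test are assumptions, not taken from the original code:

from sklearn.metrics import roc_auc_score

# Hypothetical illustration: `base_models` is an assumed dict of fitted
# estimators, and ROC AUC is only one plausible choice for the score shown.
for name, model in base_models.items():
    scores = model.predict(X_test)   # continuous scores or class labels
    print(name)
    print(roc_auc_score(y_test, scores))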
CPU times: user 2.56 s, sys: 196 ms, total: 2.76 s
Wall time: 1min 13s