/home/prajwal/anaconda3/lib/python3.5/site-packages/category_encoders/ordinal.py:178: FutureWarning: reshape is deprecated and will raise in a subsequent release. Please use .values.reshape(...) instead
X[col] = X[col].astype(int).reshape(-1, )
/home/prajwal/anaconda3/lib/python3.5/site-packages/category_encoders/ordinal.py:167: FutureWarning: reshape is deprecated and will raise in a subsequent release. Please use .values.reshape(...) instead
X[switch.get('col')] = X[switch.get('col')].astype(int).reshape(-1, )
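The two FutureWarnings above come from category_encoders calling `.reshape` directly on a pandas Series, which pandas deprecated. The warning itself names the fix: go through the underlying NumPy array with `.values.reshape(...)`. A minimal sketch of the corrected pattern (the DataFrame and codes here are toy data, not from the run):

```python
import pandas as pd

# Deprecated pattern used internally by category_encoders:
#     X[col] = X[col].astype(int).reshape(-1, )
# Series.reshape was later removed; the warning's suggested replacement
# reshapes the underlying NumPy array instead of the Series itself.
X = pd.DataFrame({"col": ["a", "b", "a"]})
codes = pd.Series([0, 1, 0])  # toy ordinal codes for the column
X["col"] = codes.astype(int).values.reshape(-1, )
print(X["col"].tolist())  # [0, 1, 0]
```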
Training Data (41188, 21)
Test Data (12357, 21)
TRAINING BASE MODELS
Epoch 1/10
9609/9609 [==============================] - 0s - loss: 0.3407 - acc: 0.8952
Epoch 2/10
9609/9609 [==============================] - 1s - loss: 0.2440 - acc: 0.9034
Epoch 3/10
9609/9609 [==============================] - 1s - loss: 0.2497 - acc: 0.9082
Epoch 4/10
9609/9609 [==============================] - 0s - loss: 0.2515 - acc: 0.9064
Epoch 5/10
9609/9609 [==============================] - 1s - loss: 0.2440 - acc: 0.9084
Epoch 6/10
9609/9609 [==============================] - 1s - loss: 0.2670 - acc: 0.9093
Epoch 7/10
9609/9609 [==============================] - 0s - loss: 0.2622 - acc: 0.9086
Epoch 8/10
9609/9609 [==============================] - 0s - loss: 0.2418 - acc: 0.9076
Epoch 9/10
9609/9609 [==============================] - 0s - loss: 0.2472 - acc: 0.9108
Epoch 10/10
9609/9609 [==============================] - 0s - loss: 0.2579 - acc: 0.9069
Epoch 1/10
9610/9610 [==============================] - 0s - loss: 0.3361 - acc: 0.8874
Epoch 2/10
9610/9610 [==============================] - 0s - loss: 0.2491 - acc: 0.8997
Epoch 3/10
9610/9610 [==============================] - 0s - loss: 0.2414 - acc: 0.9026
Epoch 4/10
9610/9610 [==============================] - 0s - loss: 0.2353 - acc: 0.9032
Epoch 5/10
9610/9610 [==============================] - 0s - loss: 0.2474 - acc: 0.9053
Epoch 6/10
9610/9610 [==============================] - 0s - loss: 0.2289 - acc: 0.9043
Epoch 7/10
9610/9610 [==============================] - 0s - loss: 0.2271 - acc: 0.9056
Epoch 8/10
9610/9610 [==============================] - 0s - loss: 0.2435 - acc: 0.9070
Epoch 9/10
9610/9610 [==============================] - 0s - loss: 0.2419 - acc: 0.9062
Epoch 10/10
9610/9610 [==============================] - 1s - loss: 0.2319 - acc: 0.9074
Epoch 1/10
9611/9611 [==============================] - 1s - loss: 0.3014 - acc: 0.8901
Epoch 2/10
9611/9611 [==============================] - 0s - loss: 0.2435 - acc: 0.9017
Epoch 3/10
9611/9611 [==============================] - 1s - loss: 0.2571 - acc: 0.9027
Epoch 4/10
9611/9611 [==============================] - 0s - loss: 0.2448 - acc: 0.9053
Epoch 5/10
9611/9611 [==============================] - 0s - loss: 0.2539 - acc: 0.9076
Epoch 6/10
9611/9611 [==============================] - 0s - loss: 0.2638 - acc: 0.9072
Epoch 7/10
9611/9611 [==============================] - 0s - loss: 0.2491 - acc: 0.9075
Epoch 8/10
9611/9611 [==============================] - 0s - loss: 0.2560 - acc: 0.9071
Epoch 9/10
9611/9611 [==============================] - 0s - loss: 0.2490 - acc: 0.9058
Epoch 10/10
9611/9611 [==============================] - 0s - loss: 0.2713 - acc: 0.9053
Epoch 1/10
14415/14415 [==============================] - 1s - loss: 2.5823 - acc: 0.7328
Epoch 2/10
14415/14415 [==============================] - 0s - loss: 0.2403 - acc: 0.9031
Epoch 3/10
14415/14415 [==============================] - 1s - loss: 0.2601 - acc: 0.9049
Epoch 4/10
14415/14415 [==============================] - 0s - loss: 0.2591 - acc: 0.9052
Epoch 5/10
14415/14415 [==============================] - 0s - loss: 0.2534 - acc: 0.9068
Epoch 6/10
14415/14415 [==============================] - 0s - loss: 0.2553 - acc: 0.9095
Epoch 7/10
14415/14415 [==============================] - 0s - loss: 0.2601 - acc: 0.9047
Epoch 8/10
14415/14415 [==============================] - 0s - loss: 0.2329 - acc: 0.9069
Epoch 9/10
14415/14415 [==============================] - 0s - loss: 0.2594 - acc: 0.9041
Epoch 10/10
14415/14415 [==============================] - 0s - loss: 0.2301 - acc: 0.9054
/home/prajwal/anaconda3/lib/python3.5/site-packages/keras/models.py:815: UserWarning: Network returning invalid probability values. The last layer might not normalize predictions into probabilities (like softmax or sigmoid would).
warnings.warn('Network returning invalid probability values. '
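The UserWarning above indicates the network's last layer emits raw linear outputs rather than values in [0, 1]; for binary classification the usual remedy in Keras is a final `Dense(1, activation='sigmoid')` layer. A small NumPy sketch (illustrative, not the script's actual model) of why the sigmoid resolves it:

```python
import numpy as np

# Raw outputs of a linear last layer can fall outside [0, 1], which is
# what triggers Keras's "invalid probability values" warning. A sigmoid
# squashes those logits into valid binary-class probabilities.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

logits = np.array([-3.2, 0.0, 4.1])  # raw linear-layer outputs
probs = sigmoid(logits)              # now valid probabilities
print(np.all((probs >= 0) & (probs <= 1)))  # True
```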
TESTING/CROSS VALIDATION BASE MODELS
random_forest           0.94123934833
multi_layer_perceptron  0.93083182762
gradient_boosting       0.947415299779
logistic_regression     0.926925830337
linear_regression       0.927305291253
decision_tree           0.918983144869
TRAINING ENSEMBLE MODELS
Weighted Average
Weights [5, 2, 5, 0, 0, 0]
Metric Score 0.946883847509
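The weighted-average ensemble above combines the six base models' predictions with weights [5, 2, 5, 0, 0, 0], so only random_forest, multi_layer_perceptron, and gradient_boosting contribute. A minimal sketch of the blend (the toy probability values are illustrative, not from the run):

```python
import numpy as np

# Weighted-average ensemble implied by the log: six base-model probability
# vectors combined with weights [5, 2, 5, 0, 0, 0].
weights = np.array([5, 2, 5, 0, 0, 0], dtype=float)

# Toy predicted probabilities from six base models over four samples.
preds = np.array([
    [0.90, 0.10, 0.80, 0.70],  # random_forest
    [0.80, 0.20, 0.70, 0.60],  # multi_layer_perceptron
    [0.95, 0.05, 0.85, 0.75],  # gradient_boosting
    [0.70, 0.30, 0.60, 0.50],  # logistic_regression
    [0.60, 0.40, 0.50, 0.40],  # linear_regression
    [1.00, 0.00, 1.00, 1.00],  # decision_tree
])

# Weighted average: zero-weight models drop out of the blend entirely.
blended = weights @ preds / weights.sum()
print(blended.round(4))  # [0.9042 0.0958 0.8042 0.7042]
```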
TESTING PHASE
TESTING/CROSS VALIDATION BASE MODELS
random_forest           0.940707239859
multi_layer_perceptron  0.929103901652
gradient_boosting       0.946040497193
logistic_regression     0.925873796458
linear_regression       0.923094904896
decision_tree           0.922129483309
TESTING ENSEMBLE MODELS
Stacking gradient_boosting           0.945079203159
Stacking logistic_regression         0.944500592271
Blending gradient_boosting           0.946022152512
Blending decision_tree               0.93713955978
Blending logistic_regression         0.943660274856
Weighted Average [5, 2, 5, 0, 0, 0]  0.945867598576
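The stacking results above feed base-model predictions into a second-level learner. An illustrative stacking sketch with scikit-learn (synthetic data and a reduced model set, not the script's actual code): out-of-fold base-model probabilities become features for a logistic-regression meta-model, which avoids leaking training labels into the level-1 features.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import cross_val_predict, train_test_split

# Synthetic binary-classification data standing in for the bank dataset.
X, y = make_classification(n_samples=600, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

bases = [RandomForestClassifier(n_estimators=50, random_state=0),
         GradientBoostingClassifier(random_state=0)]

# Level-1 training features: out-of-fold predicted probabilities,
# so the meta-model never sees predictions made on seen labels.
Z_tr = np.column_stack([
    cross_val_predict(m, X_tr, y_tr, cv=3, method="predict_proba")[:, 1]
    for m in bases
])
meta = LogisticRegression().fit(Z_tr, y_tr)

# Refit each base model on all training data, then stack on the test set.
Z_te = np.column_stack([m.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
                        for m in bases])
acc = accuracy_score(y_te, meta.predict(Z_te))
print(acc)
```

Blending differs only in using a single held-out split (rather than out-of-fold predictions) to build the meta-model's training features.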
CPU times: user 5.22 s, sys: 528 ms, total: 5.75 s
Wall time: 11min 39s