Introduction

This notebook is a very basic and simple introductory primer to the method of ensembling models, in particular the variant of ensembling known as Stacking. In a nutshell stacking uses as a first-level (base), the predictions of a few basic machine learning models (classifiers) and then uses another model at the second-level to predict the output from the earlier first-level predictions.

The Titanic dataset is a prime candidate for introducing this concept as many newcomers to Kaggle start out here. Furthermore even though stacking has been responsible for many a team winning Kaggle competitions there seems to be a dearth of kernels on this topic so I hope this notebook can fill somewhat of that void.

I myself am quite a newcomer to the Kaggle scene as well and the first proper ensembling/stacking script that I managed to chance upon and study was one written in the AllState Severity Claims competition by the great Faron. The material in this notebook borrows heavily from Faron's script although ported to factor in ensembles of classifiers whilst his was ensembles of regressors. Anyway please check out his script here:

Stacking Starter : by Faron

Now onto the notebook at hand and I hope that it manages to do justice and convey the concept of ensembling in an intuitive and concise manner. My other standalone Kaggle script which implements exactly the same ensembling steps (albeit with different parameters) discussed below gives a Public LB score of 0.808 which is good enough to get to the top 9% and runs just under 4 minutes. Therefore I am pretty sure there is a lot of room to improve and add on to that script. Anyways please feel free to leave me any comments with regards to how I can improve

``````

In [1]:

import pandas as pd
import numpy as np
import re
import sklearn
import xgboost as xgb
import seaborn as sns
import matplotlib.pyplot as plt
%matplotlib inline

import plotly.offline as py
py.init_notebook_mode(connected=True)
import plotly.graph_objs as go
import plotly.tools as tls

import warnings
warnings.filterwarnings('ignore')

# Going to use these 5 base models for the stacking
from sklearn.svm import SVC
from sklearn.cross_validation import KFold
from sklearn.preprocessing import StandardScaler
from sklearn.learning_curve import validation_curve
from sklearn.grid_search import GridSearchCV

``````
``````

requirejs.config({paths: { 'plotly': ['https://cdn.plot.ly/plotly-latest.min']},});if(!window.Plotly) {{require(['plotly'],function(plotly) {window.Plotly=plotly;});}}

``````

Feature Exploration, Engineering and Cleaning

Now we will proceed much like how most kernels in general are structured, and that is to first explore the data on hand, identify possible feature engineering opportunities as well as numerically encode any categorical features.

``````

In [2]:

# Load in the train and test datasets

# Store our passenger ID for easy access
PassengerId = test['PassengerId']

``````
``````

In [3]:

``````
``````

Out[3]:

text-align: right;
}

text-align: left;
}

.dataframe tbody tr th {
vertical-align: top;
}

PassengerId
Survived
Pclass
Name
Sex
Age
SibSp
Parch
Ticket
Fare
Cabin
Embarked

0
1
0
3
Braund, Mr. Owen Harris
male
22.0
1
0
A/5 21171
7.2500
NaN
S

1
2
1
1
Cumings, Mrs. John Bradley (Florence Briggs Th...
female
38.0
1
0
PC 17599
71.2833
C85
C

2
3
1
3
Heikkinen, Miss. Laina
female
26.0
0
0
STON/O2. 3101282
7.9250
NaN
S

3
4
1
1
Futrelle, Mrs. Jacques Heath (Lily May Peel)
female
35.0
1
0
113803
53.1000
C123
S

4
5
0
3
Allen, Mr. William Henry
male
35.0
0
0
373450
8.0500
NaN
S

``````

Well it is no surprise that our task is to somehow extract the information out of the categorical variables

Feature Engineering

Here, credit must be extended to Sina's very comprehensive and well-thought out notebook for the feature engineering ideas so please check out his work

Titanic Best Working Classfier : by Sina

``````

In [4]:

def get_Cabin_Class(name):
if(type(name) == float):
name = 'None'
title_search = re.search('[A-Z]', name)
if title_search:
return title_search.group(0)
return 'None'

train.Cabin.apply(get_Cabin_Class).value_counts().to_dict()
#train[train['Cabin'] == 'F G73']

``````
``````

Out[4]:

{'A': 15,
'B': 47,
'C': 59,
'D': 33,
'E': 32,
'F': 13,
'G': 4,
'N': 687,
'T': 1}

``````
``````

In [5]:

full_data = [train, test]

# Some features of my own that I have added in
# Gives the length of the name
train['Name_length'] = train['Name'].apply(len)
test['Name_length'] = test['Name'].apply(len)
# Feature that tells whether a passenger had a cabin on the Titanic
train['Has_Cabin'] = train["Cabin"].apply(lambda x: 0 if type(x) == float else 1)
test['Has_Cabin'] = test["Cabin"].apply(lambda x: 0 if type(x) == float else 1)

# Feature engineering steps taken from Sina
# Create new feature FamilySize as a combination of SibSp and Parch
for dataset in full_data:
dataset['FamilySize'] = dataset['SibSp'] + dataset['Parch'] + 1
# Create new feature IsAlone from FamilySize
for dataset in full_data:
dataset['IsAlone'] = 0
dataset.loc[dataset['FamilySize'] == 1, 'IsAlone'] = 1
# Remove all NULLS in the Embarked column
for dataset in full_data:
dataset['Embarked'] = dataset['Embarked'].fillna('S')
# Remove all NULLS in the Fare column and create a new feature CategoricalFare
for dataset in full_data:
dataset['Fare'] = dataset['Fare'].fillna(train['Fare'].median())

# Create a New feature CategoricalAge
for dataset in full_data:
age_avg = dataset['Age'].mean()
age_std = dataset['Age'].std()
age_null_count = dataset['Age'].isnull().sum()
age_null_random_list = np.random.randint(age_avg - age_std, age_avg + age_std, size=age_null_count)
dataset['Age'][np.isnan(dataset['Age'])] = age_null_random_list
dataset['Age'] = dataset['Age'].astype(int)

# Define function to extract titles from passenger names
def get_title(name):
title_search = re.search(' ([A-Za-z]+)\.', name)
# If the title exists, extract and return it.
if title_search:
return title_search.group(1)
return ""
# Create a new feature Title, containing the titles of passenger names
for dataset in full_data:
dataset['Title'] = dataset['Name'].apply(get_title)

# def get_Cabin_Class(name):
#     if(type(name) == float):
#         return 'None'
#     title_search = re.search('[A-Z]', name).group(0)
#     if (title_search):
#         if(title_search == 'T'):
#             return 'None'
#         return title_search
#     return 'None'

# for dataset in full_data:
#     dataset['Cabin'] = dataset['Cabin'].apply(get_Cabin_Class)

# Group all non-common titles into one single grouping "Rare"
for dataset in full_data:
dataset['Title'] = dataset['Title'].replace(['Lady', 'Countess','Capt', 'Col','Don', 'Dr', 'Major', 'Rev', 'Sir', 'Jonkheer', 'Dona'], 'Rare')
dataset['Title'] = dataset['Title'].replace('Mlle', 'Miss')
dataset['Title'] = dataset['Title'].replace('Ms', 'Miss')
dataset['Title'] = dataset['Title'].replace('Mme', 'Mrs')

def data_mapping(dataset):
#Mapping Cabin
#cabin = pd.get_dummies(dataset['Cabin'], prefix='Cabin')

# Mapping Sex
#dataset['Sex'] = dataset['Sex'].map( {'female': 0, 'male': 1} ).astype(int)
sex = pd.get_dummies(dataset['Sex'],prefix='Sex')

# Mapping titles
#     title_mapping = {"Mr": 1, "Miss": 2, "Mrs": 3, "Master": 4, "Rare": 5}
#     dataset['Title'] = dataset['Title'].map(title_mapping)
#     dataset['Title'] = dataset['Title'].fillna(0)
title = pd.get_dummies(dataset['Title'],prefix='Title')

# Mapping Embarked
#dataset['Embarked'] = dataset['Embarked'].map( {'S': 0, 'C': 1, 'Q': 2} ).astype(int)
embarked = pd.get_dummies(dataset['Embarked'],prefix='Embarked')

# Mapping Fare
dataset.loc[ dataset['Fare'] <= 7.91, 'Fare'] 						        = 0
dataset.loc[(dataset['Fare'] > 7.91) & (dataset['Fare'] <= 14.454), 'Fare'] = 1
dataset.loc[(dataset['Fare'] > 14.454) & (dataset['Fare'] <= 31), 'Fare']   = 2
dataset.loc[ dataset['Fare'] > 31, 'Fare'] 							        = 3
dataset['Fare'] = dataset['Fare'].astype(int)
#dataset['CategoricalFare'] = pd.qcut(train['Fare'], 4) #Lu's comment:  Mapping base on cut result on train set
fare = pd.get_dummies(dataset['Fare'],prefix='Fare')

# Mapping Age
dataset.loc[ dataset['Age'] <= 16, 'Age']  				           = 0
dataset.loc[(dataset['Age'] > 16) & (dataset['Age'] <= 32), 'Age'] = 1
dataset.loc[(dataset['Age'] > 32) & (dataset['Age'] <= 48), 'Age'] = 2
dataset.loc[(dataset['Age'] > 48) & (dataset['Age'] <= 64), 'Age'] = 3
dataset.loc[ dataset['Age'] > 64, 'Age']                           = 4;
dataset['Age'] = dataset['Age'].astype(int)
#dataset['Age'] = pd.cut(dataset['Age'], 5)  #Lu's comment:  Mapping base on cut result on train set
age = pd.get_dummies(dataset['Age'],prefix='Age')

# Mapping Pclass
pclass = pd.get_dummies(dataset['Pclass'],prefix='Pclass')

#dataset.join([sex,title,embarked,fare,age])
dataset = pd.concat([dataset,sex,title,embarked,fare,age,pclass],axis= 1)
dataset.drop(['Sex','Title','Embarked','Fare','Age','Pclass'], axis=1, inplace=True)
return dataset

train = data_mapping(train)
test = data_mapping(test)
#print(dataset)

``````
``````

In [6]:

# Feature selection
drop_elements = ['PassengerId', 'Name', 'Ticket', 'Cabin', 'SibSp']
train = train.drop(drop_elements, axis = 1)
#train = train.drop(['CategoricalAge', 'CategoricalFare'], axis = 1)
test  = test.drop(drop_elements, axis = 1)

``````
``````

In [7]:

train.columns.size

``````
``````

Out[7]:

28

``````
``````

In [8]:

test.columns.size

``````
``````

Out[8]:

27

``````

All right so now having cleaned the features and extracted relevant information and dropped the categorical columns our features should now all be numeric, a format suitable to feed into our Machine Learning models. However before we proceed let us generate some simple correlation and distribution plots of our transformed dataset to observe ho

Visualisations

Pearson Correlation Heatmap

let us generate some correlation plots of the features to see how related one feature is to the next. To do so, we will utilise the Seaborn plotting package which allows us to plot heatmaps very conveniently as follows

``````

In [9]:

colormap = plt.cm.coolwarm
plt.figure(figsize=(22,22))
plt.title('Pearson Correlation of Features', y=1.05, size=15)
sns.heatmap(train.astype(float).corr(),linewidths=0.1,vmax=1.0, square=True, cmap=colormap, linecolor='white', annot=True)

``````
``````

Out[9]:

<matplotlib.axes._subplots.AxesSubplot at 0xc9bbe10>

``````

Takeaway from the Plots

One thing that that the Pearson Correlation plot can tell us is that there are not too many features strongly correlated with one another. This is good from a point of view of feeding these features into your learning model because this means that there isn't much redundant or superfluous data in our training set and we are happy that each feature carries with it some unique information. Here are two most correlated features are that of Family size and Parch (Parents and Children). I'll still leave both features in for the purposes of this exercise.

Pairplots

Finally let us generate some pairplots to observe the distribution of data from one feature to the other. Once again we use Seaborn to help us.

``````

In [10]:

g = sns.pairplot(train[['Survived','Name_length','Sex_female','Title_Mr','Fare_3']], hue='Survived', palette = 'seismic',size=1.3,diag_kind = 'kde',diag_kws=dict(shade=True),plot_kws=dict(s=10) )
g.set(xticklabels=[])

``````
``````

Out[10]:

<seaborn.axisgrid.PairGrid at 0xc995fd0>

``````

Ensembling & Stacking models

Finally after that brief whirlwind detour with regards to feature engineering and formatting, we finally arrive at the meat and gist of the this notebook.

Creating a Stacking ensemble

Helpers via Python Classes

Here we invoke the use of Python's classes to help make it more convenient for us. For any newcomers to programming, one normally hears Classes being used in conjunction with Object-Oriented Programming (OOP). In short, a class helps to extend some code/program for creating objects (variables for old-school peeps) as well as to implement functions and methods specific to that class.

In the section of code below, we essentially write a class SklearnHelper that allows one to extend the inbuilt methods (such as train, predict and fit) common to all the Sklearn classifiers. Therefore this cuts out redundancy as won't need to write the same methods five times if we wanted to invoke five different classifiers.

``````

In [11]:

# Some useful parameters which will come in handy later on
ntrain = train.shape[0]
ntest = test.shape[0]
SEED = 0 # for reproducibility
NFOLDS = 5 # set folds for out-of-fold prediction
kf = KFold(ntrain, n_folds= NFOLDS, random_state=SEED)

# Class to extend the Sklearn classifier
class SklearnHelper(object):
def __init__(self, clf, seed=0, params=None):
params['random_state'] = seed
self.clf = clf(**params)

def train(self, x_train, y_train):
self.clf.fit(x_train, y_train)

def predict(self, x):
return self.clf.predict(x)

def fit(self,x,y):
return self.clf.fit(x,y)

def feature_importances(self,x,y):
result = self.clf.fit(x,y).feature_importances_
print(result)
return result
# Class to extend XGboost classifer

``````
``````

In [12]:

ntrain

``````
``````

Out[12]:

891

``````

Bear with me for those who already know this but for people who have not created classes or objects in Python before, let me explain what the code given above does. In creating my base classifiers, I will only use the models already present in the Sklearn library and therefore only extend the class for that.

def init : Python standard for invoking the default constructor for the class. This means that when you want to create an object (classifier), you have to give it the parameters of clf (what sklearn classifier you want), seed (random seed) and params (parameters for the classifiers).

The rest of the code are simply methods of the class which simply call the corresponding methods already existing within the sklearn classifiers.

Out-of-Fold Predictions

Now as alluded to above in the introductory section, stacking uses predictions of base classifiers as input for training to a second-level model. However one cannot simply train the base models on the full training data, generate predictions on the full test set and then output these for the second-level training. This runs the risk of your base model predictions already having "seen" the test set and therefore overfitting when feeding these predictions.

``````

In [13]:

def get_oof(clf, x_train, y_train, x_test):
oof_train = np.zeros((ntrain,)) # n * 1
oof_test = np.zeros((ntest,))
oof_test_skf = np.empty((NFOLDS, ntest))

for i, (train_index, test_index) in enumerate(kf):
x_tr = x_train[train_index]
y_tr = y_train[train_index]
x_te = x_train[test_index]

clf.train(x_tr, y_tr)

oof_train[test_index] = clf.predict(x_te)
oof_test_skf[i, :] = clf.predict(x_test)

oof_test[:] = oof_test_skf.mean(axis=0)
return oof_train.reshape(-1, 1), oof_test.reshape(-1, 1) # 1 * n

``````

Generating our Base First-Level Models

So now let us prepare five learning models as our first level classification. These models can all be conveniently invoked via the Sklearn library and are listed as follows:

1. Random Forest classifier
2. Extra Trees classifier
5. Support Vector Machine

Parameters

Just a quick summary of the parameters that we will be listing here for completeness,

n_jobs : Number of cores used for the training process. If set to -1, all cores are used.

n_estimators : Number of classification trees in your learning model ( set to 10 per default)

max_depth : Maximum depth of tree, or how much a node should be expanded. Beware if set to too high a number would run the risk of overfitting as one would be growing the tree too deep

verbose : Controls whether you want to output any text during the learning process. A value of 0 suppresses all text while a value of 3 outputs the tree learning process at every iteration.

Please check out the full description via the official Sklearn website. There you will find that there are a whole host of other useful parameters that you can play around with.

``````

In [14]:

# Put in our parameters for said classifiers
# Random Forest parameters
rf_params = {
'n_jobs': -1,
'n_estimators': 500,
'warm_start': True,
#'max_features': 0.2,
'max_depth': 6,
'min_samples_leaf': 2,
'max_features' : 'sqrt',
'verbose': 0
}

# Extra Trees Parameters
et_params = {
'n_jobs': -1,
'n_estimators':500,
#'max_features': 0.5,
'max_depth': 8,
'min_samples_leaf': 2,
'verbose': 0
}

'n_estimators': 500,
'learning_rate' : 0.75
}

gb_params = {
'n_estimators': 500,
#'max_features': 0.2,
'max_depth': 5,
'min_samples_leaf': 2,
'verbose': 0
}

# Support Vector Classifier parameters
svc_params = {
'kernel' : 'linear',
'C' : 0.025
}

``````

Furthermore, since having mentioned about Objects and classes within the OOP framework, let us now create 5 objects that represent our 5 learning models via our Helper Sklearn Class we defined earlier.

``````

In [15]:

# Create 5 objects that represent our 4 models
rf = SklearnHelper(clf=RandomForestClassifier, seed=SEED, params=rf_params)
et = SklearnHelper(clf=ExtraTreesClassifier, seed=SEED, params=et_params)
svc = SklearnHelper(clf=SVC, seed=SEED, params=svc_params)

``````

Creating NumPy arrays out of our train and test sets

Great. Having prepared our first layer base models as such, we can now ready the training and test test data for input into our classifiers by generating NumPy arrays out of their original dataframes as follows:

``````

In [16]:

# Create Numpy arrays of train, test and target ( Survived) dataframes to feed into our models
y_train = train['Survived'].ravel()
train = train.drop(['Survived'], axis=1)
x_train = train.values # Creates an array of the train data
x_test = test.values # Creats an array of the test data

#standardization
stdsc = StandardScaler()
x_train = stdsc.fit_transform(x_train)
x_test = stdsc.transform(x_test)

``````

Output of the First level Predictions

We now feed the training and test data into our 5 base classifiers and use the Out-of-Fold prediction function we defined earlier to generate our first level predictions. Allow a handful of minutes for the chunk of code below to run.

``````

In [17]:

x_train.shape

``````
``````

Out[17]:

(891, 27)

``````
``````

In [18]:

# Create our OOF train and test predictions. These base results will be used as new features
et_oof_train, et_oof_test = get_oof(et, x_train, y_train, x_test) # Extra Trees
rf_oof_train, rf_oof_test = get_oof(rf,x_train, y_train, x_test) # Random Forest
gb_oof_train, gb_oof_test = get_oof(gb,x_train, y_train, x_test) # Gradient Boost
svc_oof_train, svc_oof_test = get_oof(svc,x_train, y_train, x_test) # Support Vector Classifier

print("Training is complete")

``````
``````

Training is complete

``````

Feature importances generated from the different classifiers

Now having learned our the first-level classifiers, we can utilise a very nifty feature of the Sklearn models and that is to output the importances of the various features in the training and test sets with one very simple line of code.

As per the Sklearn documentation, most of the classifiers are built in with an attribute which returns feature importances by simply typing in .featureimportances. Therefore we will invoke this very useful attribute via our function earliand plot the feature importances as such

``````

In [19]:

rf_features = rf.feature_importances(x_train,y_train)
et_features = et.feature_importances(x_train, y_train)
gb_features = gb.feature_importances(x_train,y_train)

``````
``````

[ 0.01837142  0.07203597  0.05126685  0.05609454  0.01056569  0.13279217
0.12635008  0.01732699  0.02599196  0.17876873  0.03342443  0.0081802
0.01455721  0.00405593  0.01238266  0.01751002  0.00899796  0.01195744
0.02016226  0.01035866  0.008013    0.00820178  0.00460383  0.00033066
0.05139877  0.02018447  0.07611632]
[ 0.0106126   0.02419606  0.05371307  0.03053497  0.01164536  0.15641115
0.15658288  0.02054471  0.03810489  0.16434628  0.04211618  0.00785841
0.01337742  0.00830554  0.01650421  0.01602595  0.00868333  0.01205105
0.0214685   0.0124521   0.00701013  0.01052549  0.00596251  0.00121863
0.04175242  0.02850483  0.07949133]
[ 0.066  0.732  0.012  0.058  0.004  0.012  0.004  0.016  0.     0.002  0.
0.01   0.002  0.006  0.006  0.006  0.01   0.006  0.002  0.004  0.004  0.
0.006  0.004  0.022  0.     0.006]
[ 0.03053437  0.3810562   0.01897071  0.06332489  0.0174776   0.00879666
0.00808426  0.00670961  0.03138891  0.0381847   0.00937706  0.0046495
0.01748061  0.01802765  0.01920671  0.04430624  0.02825531  0.03056373
0.01875443  0.00768525  0.05756515  0.04488932  0.01187372  0.00325413
0.01417199  0.02513518  0.04027612]

``````

So I have not yet figured out how to assign and store the feature importances outright. Therefore I'll print out the values from the code above and then simply copy and paste into Python lists as below (sorry for the lousy hack)

``````

In [20]:

# rf_features = [0.10474135,  0.21837029,  0.04432652,  0.02249159,  0.05432591,  0.02854371
#   ,0.07570305,  0.01088129 , 0.24247496,  0.13685733 , 0.06128402]
# et_features = [ 0.12165657,  0.37098307  ,0.03129623 , 0.01591611 , 0.05525811 , 0.028157
#   ,0.04589793 , 0.02030357 , 0.17289562 , 0.04853517,  0.08910063]
# ada_features = [0.028 ,   0.008  ,      0.012   ,     0.05866667,   0.032 ,       0.008
#   ,0.04666667 ,  0.     ,      0.05733333,   0.73866667,   0.01066667]
# gb_features = [ 0.06796144 , 0.03889349 , 0.07237845 , 0.02628645 , 0.11194395,  0.04778854
#   ,0.05965792 , 0.02774745,  0.07462718,  0.4593142 ,  0.01340093]

``````

Create a dataframe from the lists containing the feature importance data for easy plotting via the Plotly package.

``````

In [21]:

cols = train.columns.values
# Create a dataframe with features
feature_dataframe = pd.DataFrame( {'features': cols,
'Random Forest feature importances': rf_features,
'Extra Trees  feature importances': et_features,
})

``````

Interactive feature importances via Plotly scatterplots

I'll use the interactive Plotly package at this juncture to visualise the feature importances values of the different classifiers

``````

In [22]:

# Scatter plot
trace = go.Scatter(
y = feature_dataframe['Random Forest feature importances'].values,
x = feature_dataframe['features'].values,
mode='markers',
marker=dict(
sizemode = 'diameter',
sizeref = 1,
size = 25,
#color = np.random.randn(500), #set color equal to a variable
color = feature_dataframe['Random Forest feature importances'].values,
colorscale='Portland',
showscale=True
),
text = feature_dataframe['features'].values
)
data = [trace]

layout= go.Layout(
autosize= True,
title= 'Random Forest Feature Importance',
hovermode= 'closest',
#     xaxis= dict(
#         title= 'Pop',
#         ticklen= 5,
#         zeroline= False,
#         gridwidth= 2,
#     ),
yaxis=dict(
title= 'Feature Importance',
ticklen= 5,
gridwidth= 2
),
showlegend= False
)
fig = go.Figure(data=data, layout=layout)
py.iplot(fig,filename='scatter2010')

# Scatter plot
trace = go.Scatter(
y = feature_dataframe['Extra Trees  feature importances'].values,
x = feature_dataframe['features'].values,
mode='markers',
marker=dict(
sizemode = 'diameter',
sizeref = 1,
size = 25,
#color = np.random.randn(500), #set color equal to a variable
color = feature_dataframe['Extra Trees  feature importances'].values,
colorscale='Portland',
showscale=True
),
text = feature_dataframe['features'].values
)
data = [trace]

layout= go.Layout(
autosize= True,
title= 'Extra Trees Feature Importance',
hovermode= 'closest',
#     xaxis= dict(
#         title= 'Pop',
#         ticklen= 5,
#         zeroline= False,
#         gridwidth= 2,
#     ),
yaxis=dict(
title= 'Feature Importance',
ticklen= 5,
gridwidth= 2
),
showlegend= False
)
fig = go.Figure(data=data, layout=layout)
py.iplot(fig,filename='scatter2010')

# Scatter plot
trace = go.Scatter(
x = feature_dataframe['features'].values,
mode='markers',
marker=dict(
sizemode = 'diameter',
sizeref = 1,
size = 25,
#color = np.random.randn(500), #set color equal to a variable
colorscale='Portland',
showscale=True
),
text = feature_dataframe['features'].values
)
data = [trace]

layout= go.Layout(
autosize= True,
hovermode= 'closest',
#     xaxis= dict(
#         title= 'Pop',
#         ticklen= 5,
#         zeroline= False,
#         gridwidth= 2,
#     ),
yaxis=dict(
title= 'Feature Importance',
ticklen= 5,
gridwidth= 2
),
showlegend= False
)
fig = go.Figure(data=data, layout=layout)
py.iplot(fig,filename='scatter2010')

# Scatter plot
trace = go.Scatter(
y = feature_dataframe['Gradient Boost feature importances'].values,
x = feature_dataframe['features'].values,
mode='markers',
marker=dict(
sizemode = 'diameter',
sizeref = 1,
size = 25,
#color = np.random.randn(500), #set color equal to a variable
color = feature_dataframe['Gradient Boost feature importances'].values,
colorscale='Portland',
showscale=True
),
text = feature_dataframe['features'].values
)
data = [trace]

layout= go.Layout(
autosize= True,
hovermode= 'closest',
#     xaxis= dict(
#         title= 'Pop',
#         ticklen= 5,
#         zeroline= False,
#         gridwidth= 2,
#     ),
yaxis=dict(
title= 'Feature Importance',
ticklen= 5,
gridwidth= 2
),
showlegend= False
)
fig = go.Figure(data=data, layout=layout)
py.iplot(fig,filename='scatter2010')

``````
``````

require(["plotly"], function(Plotly) { window.PLOTLYENV=window.PLOTLYENV || {};window.PLOTLYENV.BASE_URL="https://plot.ly";Plotly.newPlot("17ab2a89-0a2c-4595-b421-236785a518d2", [{"type": "scatter", "y": [0.018371417115946257, 0.07203597126331657, 0.05126684735233133, 0.056094543808076436, 0.010565689969283629, 0.13279217347902195, 0.12635007856367167, 0.017326985126990532, 0.025991962949378666, 0.17876872657556617, 0.033424427633365394, 0.008180202069562735, 0.014557205042169435, 0.004055927635142402, 0.012382664492619351, 0.017510021936939565, 0.008997963661416619, 0.01195744453370628, 0.020162257272452796, 0.010358659401156342, 0.008013002440016266, 0.008201782043924636, 0.004603830920327182, 0.00033066203370316023, 0.05139876600608131, 0.02018447026011238, 0.07611631641372098], "x": ["Parch", "Name_length", "Has_Cabin", "FamilySize", "IsAlone", "Sex_female", "Sex_male", "Title_Master", "Title_Miss", "Title_Mr", "Title_Mrs", "Title_Rare", "Embarked_C", "Embarked_Q", "Embarked_S", "Fare_0", "Fare_1", "Fare_2", "Fare_3", "Age_0", "Age_1", "Age_2", "Age_3", "Age_4", "Pclass_1", "Pclass_2", "Pclass_3"], "mode": "markers", "marker": {"sizemode": "diameter", "sizeref": 1, "size": 25, "color": [0.018371417115946257, 0.07203597126331657, 0.05126684735233133, 0.056094543808076436, 0.010565689969283629, 0.13279217347902195, 0.12635007856367167, 0.017326985126990532, 0.025991962949378666, 0.17876872657556617, 0.033424427633365394, 0.008180202069562735, 0.014557205042169435, 0.004055927635142402, 0.012382664492619351, 0.017510021936939565, 0.008997963661416619, 0.01195744453370628, 0.020162257272452796, 0.010358659401156342, 0.008013002440016266, 0.008201782043924636, 0.004603830920327182, 0.00033066203370316023, 0.05139876600608131, 0.02018447026011238, 0.07611631641372098], "colorscale": "Portland", "showscale": true}, "text": ["Parch", "Name_length", "Has_Cabin", "FamilySize", "IsAlone", "Sex_female", "Sex_male", "Title_Master", "Title_Miss", "Title_Mr", "Title_Mrs", "Title_Rare", "Embarked_C", "Embarked_Q", "Embarked_S", "Fare_0", "Fare_1", "Fare_2", "Fare_3", "Age_0", "Age_1", "Age_2", "Age_3", "Age_4", "Pclass_1", "Pclass_2", "Pclass_3"]}], {"autosize": true, "title": "Random Forest Feature Importance", "hovermode": "closest", "yaxis": {"title": "Feature Importance", "ticklen": 5, "gridwidth": 2}, "showlegend": false}, {"showLink": true, "linkText": "Export to plot.ly"})});

require(["plotly"], function(Plotly) { window.PLOTLYENV=window.PLOTLYENV || {};window.PLOTLYENV.BASE_URL="https://plot.ly";Plotly.newPlot("e9fe2122-cc24-443a-9843-125a4b600465", [{"type": "scatter", "y": [0.010612601020661428, 0.024196056295149253, 0.053713067638982905, 0.030534966324359498, 0.01164536456593173, 0.15641114672151624, 0.15658288344216484, 0.020544708732836477, 0.038104890913132454, 0.1643462753617711, 0.04211618292627878, 0.007858411995402886, 0.013377420914958147, 0.008305536285959063, 0.016504210748714675, 0.016025948861330873, 0.008683330942189902, 0.012051050751435402, 0.021468500022494976, 0.012452101532326612, 0.0070101316311112875, 0.010525485209137694, 0.005962514407485825, 0.0012186347900071239, 0.0417524192781612, 0.028504828488384946, 0.07949133019811447], "x": ["Parch", "Name_length", "Has_Cabin", "FamilySize", "IsAlone", "Sex_female", "Sex_male", "Title_Master", "Title_Miss", "Title_Mr", "Title_Mrs", "Title_Rare", "Embarked_C", "Embarked_Q", "Embarked_S", "Fare_0", "Fare_1", "Fare_2", "Fare_3", "Age_0", "Age_1", "Age_2", "Age_3", "Age_4", "Pclass_1", "Pclass_2", "Pclass_3"], "mode": "markers", "marker": {"sizemode": "diameter", "sizeref": 1, "size": 25, "color": [0.010612601020661428, 0.024196056295149253, 0.053713067638982905, 0.030534966324359498, 0.01164536456593173, 0.15641114672151624, 0.15658288344216484, 0.020544708732836477, 0.038104890913132454, 0.1643462753617711, 0.04211618292627878, 0.007858411995402886, 0.013377420914958147, 0.008305536285959063, 0.016504210748714675, 0.016025948861330873, 0.008683330942189902, 0.012051050751435402, 0.021468500022494976, 0.012452101532326612, 0.0070101316311112875, 0.010525485209137694, 0.005962514407485825, 0.0012186347900071239, 0.0417524192781612, 0.028504828488384946, 0.07949133019811447], "colorscale": "Portland", "showscale": true}, "text": ["Parch", "Name_length", "Has_Cabin", "FamilySize", "IsAlone", "Sex_female", "Sex_male", "Title_Master", "Title_Miss", "Title_Mr", "Title_Mrs", "Title_Rare", "Embarked_C", "Embarked_Q", "Embarked_S", "Fare_0", "Fare_1", "Fare_2", "Fare_3", "Age_0", "Age_1", "Age_2", "Age_3", "Age_4", "Pclass_1", "Pclass_2", "Pclass_3"]}], {"autosize": true, "title": "Extra Trees Feature Importance", "hovermode": "closest", "yaxis": {"title": "Feature Importance", "ticklen": 5, "gridwidth": 2}, "showlegend": false}, {"showLink": true, "linkText": "Export to plot.ly"})});

require(["plotly"], function(Plotly) { window.PLOTLYENV=window.PLOTLYENV || {};window.PLOTLYENV.BASE_URL="https://plot.ly";Plotly.newPlot("adf699f3-cbb9-4844-8d42-8e5e480d1d56", [{"type": "scatter", "y": [0.066, 0.732, 0.012, 0.058, 0.004, 0.012, 0.004, 0.016, 0.0, 0.002, 0.0, 0.01, 0.002, 0.006, 0.006, 0.006, 0.01, 0.006, 0.002, 0.004, 0.004, 0.0, 0.006, 0.004, 0.022, 0.0, 0.006], "x": ["Parch", "Name_length", "Has_Cabin", "FamilySize", "IsAlone", "Sex_female", "Sex_male", "Title_Master", "Title_Miss", "Title_Mr", "Title_Mrs", "Title_Rare", "Embarked_C", "Embarked_Q", "Embarked_S", "Fare_0", "Fare_1", "Fare_2", "Fare_3", "Age_0", "Age_1", "Age_2", "Age_3", "Age_4", "Pclass_1", "Pclass_2", "Pclass_3"], "mode": "markers", "marker": {"sizemode": "diameter", "sizeref": 1, "size": 25, "color": [0.066, 0.732, 0.012, 0.058, 0.004, 0.012, 0.004, 0.016, 0.0, 0.002, 0.0, 0.01, 0.002, 0.006, 0.006, 0.006, 0.01, 0.006, 0.002, 0.004, 0.004, 0.0, 0.006, 0.004, 0.022, 0.0, 0.006], "colorscale": "Portland", "showscale": true}, "text": ["Parch", "Name_length", "Has_Cabin", "FamilySize", "IsAlone", "Sex_female", "Sex_male", "Title_Master", "Title_Miss", "Title_Mr", "Title_Mrs", "Title_Rare", "Embarked_C", "Embarked_Q", "Embarked_S", "Fare_0", "Fare_1", "Fare_2", "Fare_3", "Age_0", "Age_1", "Age_2", "Age_3", "Age_4", "Pclass_1", "Pclass_2", "Pclass_3"]}], {"autosize": true, "title": "AdaBoost Feature Importance", "hovermode": "closest", "yaxis": {"title": "Feature Importance", "ticklen": 5, "gridwidth": 2}, "showlegend": false}, {"showLink": true, "linkText": "Export to plot.ly"})});

require(["plotly"], function(Plotly) { window.PLOTLYENV=window.PLOTLYENV || {};window.PLOTLYENV.BASE_URL="https://plot.ly";Plotly.newPlot("a2290c06-d39f-4ca6-99c8-62b07c4d1201", [{"type": "scatter", "y": [0.030534366140923606, 0.3810562007749901, 0.018970707058143837, 0.06332488664488557, 0.017477604577166003, 0.008796659664895345, 0.008084261570922353, 0.006709608918725528, 0.03138891165513799, 0.03818469710878693, 0.009377061402065592, 0.004649496979966315, 0.017480613428317294, 0.01802764972846368, 0.01920670873660337, 0.04430624222695946, 0.028255308828163864, 0.03056373278827622, 0.018754425896561792, 0.007685249097641689, 0.05756514504579375, 0.04488931677636372, 0.011873723371937685, 0.0032541347312314566, 0.014171990174409987, 0.02513517728192745, 0.04027611939073988], "x": ["Parch", "Name_length", "Has_Cabin", "FamilySize", "IsAlone", "Sex_female", "Sex_male", "Title_Master", "Title_Miss", "Title_Mr", "Title_Mrs", "Title_Rare", "Embarked_C", "Embarked_Q", "Embarked_S", "Fare_0", "Fare_1", "Fare_2", "Fare_3", "Age_0", "Age_1", "Age_2", "Age_3", "Age_4", "Pclass_1", "Pclass_2", "Pclass_3"], "mode": "markers", "marker": {"sizemode": "diameter", "sizeref": 1, "size": 25, "color": [0.030534366140923606, 0.3810562007749901, 0.018970707058143837, 0.06332488664488557, 0.017477604577166003, 0.008796659664895345, 0.008084261570922353, 0.006709608918725528, 0.03138891165513799, 0.03818469710878693, 0.009377061402065592, 0.004649496979966315, 0.017480613428317294, 0.01802764972846368, 0.01920670873660337, 0.04430624222695946, 0.028255308828163864, 0.03056373278827622, 0.018754425896561792, 0.007685249097641689, 0.05756514504579375, 0.04488931677636372, 0.011873723371937685, 0.0032541347312314566, 0.014171990174409987, 0.02513517728192745, 0.04027611939073988], "colorscale": "Portland", "showscale": true}, "text": ["Parch", "Name_length", "Has_Cabin", "FamilySize", "IsAlone", "Sex_female", "Sex_male", "Title_Master", "Title_Miss", "Title_Mr", "Title_Mrs", "Title_Rare", "Embarked_C", "Embarked_Q", "Embarked_S", "Fare_0", "Fare_1", "Fare_2", "Fare_3", "Age_0", "Age_1", "Age_2", "Age_3", "Age_4", "Pclass_1", "Pclass_2", "Pclass_3"]}], {"autosize": true, "title": "Gradient Boosting Feature Importance", "hovermode": "closest", "yaxis": {"title": "Feature Importance", "ticklen": 5, "gridwidth": 2}, "showlegend": false}, {"showLink": true, "linkText": "Export to plot.ly"})});

``````

Now let us calculate the mean of all the feature importances and store it as a new column in the feature importance dataframe

``````

In [23]:

# Create the new column containing the average of values

feature_dataframe['mean'] = feature_dataframe.mean(axis= 1) # axis = 1 computes the mean row-wise

``````
``````

Out[23]:

text-align: right;
}

text-align: left;
}

.dataframe tbody tr th {
vertical-align: top;
}

Extra Trees  feature importances
Random Forest feature importances
features
mean

0
0.066
0.010613
0.030534
0.018371
Parch
0.031380

1
0.732
0.024196
0.381056
0.072036
Name_length
0.302322

2
0.012
0.053713
0.018971
0.051267
Has_Cabin
0.033988

``````

Plotly Barplot of Average Feature Importances

Having obtained the mean feature importance across all our classifiers, we can plot them into a Plotly bar plot as follows:

``````

In [34]:

y = feature_dataframe['mean'].values
x = feature_dataframe['features'].values
data = [go.Bar(
x= x,
y= y,
width = 0.5,
marker=dict(
color = feature_dataframe['mean'].values,
colorscale='Portland',
showscale=True,
reversescale = False
),
opacity=0.6
)]

layout= go.Layout(
autosize= True,
title= 'Barplots of Mean Feature Importance',
hovermode= 'closest',
#     xaxis= dict(
#         title= 'Pop',
#         ticklen= 5,
#         zeroline= False,
#         gridwidth= 2,
#     ),
yaxis=dict(
title= 'Feature Importance',
ticklen= 5,
gridwidth= 2
),
showlegend= False
)
fig = go.Figure(data=data, layout=layout)
py.iplot(fig, filename='bar-direct-labels')

``````
``````

require(["plotly"], function(Plotly) { window.PLOTLYENV=window.PLOTLYENV || {};window.PLOTLYENV.BASE_URL="https://plot.ly";Plotly.newPlot("81c16a34-5324-4d6f-97af-a466dbf76a22", [{"type": "bar", "x": ["Parch", "Name_length", "Has_Cabin", "FamilySize", "IsAlone", "Sex_female", "Sex_male", "Title_Master", "Title_Miss", "Title_Mr", "Title_Mrs", "Title_Rare", "Embarked_C", "Embarked_Q", "Embarked_S", "Fare_0", "Fare_1", "Fare_2", "Fare_3", "Age_0", "Age_1", "Age_2", "Age_3", "Age_4", "Pclass_1", "Pclass_2", "Pclass_3"], "y": [0.03137959606938282, 0.30232205708336396, 0.03398765551236452, 0.05198859919433038, 0.01092216477809534, 0.07749999496635838, 0.07375430589418971, 0.015145325694638135, 0.02387144137941228, 0.09582492476153104, 0.021229417990427442, 0.007672027761232984, 0.01185380984636122, 0.009097278412391286, 0.01352339599448435, 0.020960553256307475, 0.013984150857942596, 0.015143057018354474, 0.015596295797877392, 0.00862400250778116, 0.019147069779230325, 0.015904146007356512, 0.007110017174937673, 0.002200857888735435, 0.03233079386466312, 0.018456119007606193, 0.05047094150064384], "width": 0.5, "marker": {"color": [0.03137959606938282, 0.30232205708336396, 0.03398765551236452, 0.05198859919433038, 0.01092216477809534, 0.07749999496635838, 0.07375430589418971, 0.015145325694638135, 0.02387144137941228, 0.09582492476153104, 0.021229417990427442, 0.007672027761232984, 0.01185380984636122, 0.009097278412391286, 0.01352339599448435, 0.020960553256307475, 0.013984150857942596, 0.015143057018354474, 0.015596295797877392, 0.00862400250778116, 0.019147069779230325, 0.015904146007356512, 0.007110017174937673, 0.002200857888735435, 0.03233079386466312, 0.018456119007606193, 0.05047094150064384], "colorscale": "Portland", "showscale": true, "reversescale": false}, "opacity": 0.6}], {"autosize": true, "title": "Barplots of Mean Feature Importance", "hovermode": "closest", "yaxis": {"title": "Feature Importance", "ticklen": 5, "gridwidth": 2}, "showlegend": false}, {"showLink": true, "linkText": "Export to plot.ly"})});

``````

Second-Level Predictions from the First-level Output

First-level output as new features

Having now obtained our first-level predictions, one can think of it as essentially building a new set of features to be used as training data for the next classifier. As per the code below, we are therefore having as our new columns the first-level predictions from our earlier classifiers and we train the next classifier on this.

``````

In [25]:

base_predictions_train = pd.DataFrame( {'RandomForest': rf_oof_train.ravel(),
'ExtraTrees': et_oof_train.ravel(),
})

``````
``````

Out[25]:

text-align: right;
}

text-align: left;
}

.dataframe tbody tr th {
vertical-align: top;
}

ExtraTrees
RandomForest

0
0.0
0.0
0.0
0.0

1
1.0
1.0
1.0
1.0

2
1.0
0.0
1.0
0.0

3
1.0
1.0
1.0
1.0

4
0.0
0.0
0.0
0.0

``````

Correlation Heatmap of the Second Level Training set

``````

In [26]:

data = [
go.Heatmap(
z= base_predictions_train.astype(float).corr().values ,
x=base_predictions_train.columns.values,
y= base_predictions_train.columns.values,
colorscale='Portland',
showscale=True,
reversescale = True
)
]
py.iplot(data, filename='labelled-heatmap')

``````
``````

require(["plotly"], function(Plotly) { window.PLOTLYENV=window.PLOTLYENV || {};window.PLOTLYENV.BASE_URL="https://plot.ly";Plotly.newPlot("2f4699a6-7914-49c3-b334-db0537c58da4", [{"type": "heatmap", "z": [[1.0, 0.7593270580663908, 0.6651549037028193, 0.7614335172451449], [0.7593270580663908, 1.0, 0.7288126782600043, 0.8586587946494644], [0.6651549037028193, 0.7288126782600043, 1.0, 0.7268961769911487], [0.7614335172451449, 0.8586587946494644, 0.7268961769911487, 1.0]], "x": ["AdaBoost", "ExtraTrees", "GradientBoost", "RandomForest"], "y": ["AdaBoost", "ExtraTrees", "GradientBoost", "RandomForest"], "colorscale": "Portland", "showscale": true, "reversescale": true}], {}, {"showLink": true, "linkText": "Export to plot.ly"})});

``````

There have been quite a few articles and Kaggle competition winner stories about the merits of having trained models that are more uncorrelated with one another producing better scores.

``````

In [27]:

x_train

``````
``````

Out[27]:

array([[-0.47367361, -0.42745127, -0.54492498, ..., -0.56568542,
-0.51015154,  0.90258736],
[-0.47367361,  2.59096206,  1.835115  , ...,  1.76776695,
-0.51015154, -1.10792599],
[-0.47367361, -0.53525175, -0.54492498, ..., -0.56568542,
-0.51015154,  0.90258736],
...,
[ 2.00893337,  1.40515682, -0.54492498, ..., -0.56568542,
-0.51015154,  0.90258736],
[-0.47367361, -0.64305222,  1.835115  , ...,  1.76776695,
-0.51015154, -1.10792599],
[-0.47367361, -0.85865317, -0.54492498, ..., -0.56568542,
-0.51015154,  0.90258736]])

``````
``````

In [28]:

x_train = np.concatenate(( et_oof_train, rf_oof_train, ada_oof_train, gb_oof_train, svc_oof_train), axis=1)
x_test = np.concatenate(( et_oof_test, rf_oof_test, ada_oof_test, gb_oof_test, svc_oof_test), axis=1)

``````
``````

In [29]:

x_train.shape

``````
``````

Out[29]:

(891, 5)

``````

Having now concatenated and joined both the first-level train and test predictions as x_train and x_test, we can now fit a second-level learning model.

Second level learning model via XGBoost

Here we choose the eXtremely famous library for boosted tree learning model, XGBoost. It was built to optimize large-scale boosted tree algorithms. For further information about the algorithm, check out the official documentation.

Anyways, we call an XGBClassifier and fit it to the first-level train and target data and use the learned model to predict the test data as follows:

``````

In [35]:

gbm = xgb.XGBClassifier(
#learning_rate = 0.02,
n_estimators= 2000,
max_depth= 4,
min_child_weight= 2,
#gamma=1,
gamma=0.9,
subsample=0.8,
colsample_bytree=0.8,
objective= 'binary:logistic',
scale_pos_weight=1).fit(
x_train, y_train,
eval_set=[(x_train, y_train)],
eval_metric='logloss',
verbose=True
)
predictions = gbm.predict(x_test)

``````
``````

[0]	validation_0-logloss:0.647256
[1]	validation_0-logloss:0.610148
[2]	validation_0-logloss:0.580113
[3]	validation_0-logloss:0.549875
[4]	validation_0-logloss:0.52374
[5]	validation_0-logloss:0.501913
[6]	validation_0-logloss:0.486422
[7]	validation_0-logloss:0.471168
[8]	validation_0-logloss:0.45794
[9]	validation_0-logloss:0.446693
[10]	validation_0-logloss:0.43662
[11]	validation_0-logloss:0.428077
[12]	validation_0-logloss:0.422295
[13]	validation_0-logloss:0.417579
[14]	validation_0-logloss:0.412294
[15]	validation_0-logloss:0.407352
[16]	validation_0-logloss:0.404565
[17]	validation_0-logloss:0.402231
[18]	validation_0-logloss:0.39912
[19]	validation_0-logloss:0.396872
[20]	validation_0-logloss:0.395439
[21]	validation_0-logloss:0.394125
[22]	validation_0-logloss:0.393267
[23]	validation_0-logloss:0.3917
[24]	validation_0-logloss:0.390037
[25]	validation_0-logloss:0.389189
[26]	validation_0-logloss:0.38815
[27]	validation_0-logloss:0.387214
[28]	validation_0-logloss:0.386644
[29]	validation_0-logloss:0.386116
[30]	validation_0-logloss:0.385576
[31]	validation_0-logloss:0.385346
[32]	validation_0-logloss:0.384878
[33]	validation_0-logloss:0.384606
[34]	validation_0-logloss:0.384514
[35]	validation_0-logloss:0.384447
[36]	validation_0-logloss:0.384029
[37]	validation_0-logloss:0.38393
[38]	validation_0-logloss:0.383773
[39]	validation_0-logloss:0.383651
[40]	validation_0-logloss:0.383385
[41]	validation_0-logloss:0.383148
[42]	validation_0-logloss:0.383021
[43]	validation_0-logloss:0.383006
[44]	validation_0-logloss:0.382999
[45]	validation_0-logloss:0.382719
[46]	validation_0-logloss:0.382595
[47]	validation_0-logloss:0.38251
[48]	validation_0-logloss:0.382342
[49]	validation_0-logloss:0.382339
[50]	validation_0-logloss:0.38234
[51]	validation_0-logloss:0.382285
[52]	validation_0-logloss:0.38228
[53]	validation_0-logloss:0.382281
[54]	validation_0-logloss:0.382282
[55]	validation_0-logloss:0.38228
[56]	validation_0-logloss:0.38211
[57]	validation_0-logloss:0.382107
[58]	validation_0-logloss:0.382105
[59]	validation_0-logloss:0.381983
[60]	validation_0-logloss:0.38193
[61]	validation_0-logloss:0.381929
[62]	validation_0-logloss:0.381857
[63]	validation_0-logloss:0.38183
[64]	validation_0-logloss:0.381778
[65]	validation_0-logloss:0.381776
[66]	validation_0-logloss:0.381689
[67]	validation_0-logloss:0.381687
[68]	validation_0-logloss:0.381691
[69]	validation_0-logloss:0.381691
[70]	validation_0-logloss:0.381679
[71]	validation_0-logloss:0.381684
[72]	validation_0-logloss:0.381513
[73]	validation_0-logloss:0.381521
[74]	validation_0-logloss:0.381518
[75]	validation_0-logloss:0.381523
[76]	validation_0-logloss:0.381496
[77]	validation_0-logloss:0.381503
[78]	validation_0-logloss:0.381388
[79]	validation_0-logloss:0.38136
[80]	validation_0-logloss:0.381062
[81]	validation_0-logloss:0.380998
[82]	validation_0-logloss:0.380992
[83]	validation_0-logloss:0.380964
[84]	validation_0-logloss:0.380958
[85]	validation_0-logloss:0.380961
[86]	validation_0-logloss:0.380962
[87]	validation_0-logloss:0.38096
[88]	validation_0-logloss:0.380968
[89]	validation_0-logloss:0.380854
[90]	validation_0-logloss:0.380853
[91]	validation_0-logloss:0.380854
[92]	validation_0-logloss:0.380856
[93]	validation_0-logloss:0.380852
[94]	validation_0-logloss:0.380858
[95]	validation_0-logloss:0.38086
[96]	validation_0-logloss:0.380857
[97]	validation_0-logloss:0.380828
[98]	validation_0-logloss:0.380707
[99]	validation_0-logloss:0.380691
[100]	validation_0-logloss:0.380722
[101]	validation_0-logloss:0.380654
[102]	validation_0-logloss:0.380609
[103]	validation_0-logloss:0.380604
[104]	validation_0-logloss:0.380563
[105]	validation_0-logloss:0.380555
[106]	validation_0-logloss:0.380557
[107]	validation_0-logloss:0.380551
[108]	validation_0-logloss:0.380446
[109]	validation_0-logloss:0.380448
[110]	validation_0-logloss:0.380421
[111]	validation_0-logloss:0.380427
[112]	validation_0-logloss:0.380432
[113]	validation_0-logloss:0.380429
[114]	validation_0-logloss:0.380433
[115]	validation_0-logloss:0.380436
[116]	validation_0-logloss:0.380379
[117]	validation_0-logloss:0.380366
[118]	validation_0-logloss:0.380266
[119]	validation_0-logloss:0.380256
[120]	validation_0-logloss:0.380209
[121]	validation_0-logloss:0.380206
[122]	validation_0-logloss:0.380212
[123]	validation_0-logloss:0.380206
[124]	validation_0-logloss:0.380211
[125]	validation_0-logloss:0.380204
[126]	validation_0-logloss:0.380206
[127]	validation_0-logloss:0.380206
[128]	validation_0-logloss:0.380165
[129]	validation_0-logloss:0.380164
[130]	validation_0-logloss:0.380119
[131]	validation_0-logloss:0.380122
[132]	validation_0-logloss:0.380126
[133]	validation_0-logloss:0.380136
[134]	validation_0-logloss:0.380104
[135]	validation_0-logloss:0.380016
[136]	validation_0-logloss:0.380002
[137]	validation_0-logloss:0.379904
[138]	validation_0-logloss:0.379903
[139]	validation_0-logloss:0.379903
[140]	validation_0-logloss:0.379904
[141]	validation_0-logloss:0.379905
[142]	validation_0-logloss:0.379906
[143]	validation_0-logloss:0.379906
[144]	validation_0-logloss:0.379905
[145]	validation_0-logloss:0.379909
[146]	validation_0-logloss:0.379854
[147]	validation_0-logloss:0.379863
[148]	validation_0-logloss:0.379806
[149]	validation_0-logloss:0.379823
[150]	validation_0-logloss:0.379664
[151]	validation_0-logloss:0.379629
[152]	validation_0-logloss:0.379615
[153]	validation_0-logloss:0.379627
[154]	validation_0-logloss:0.379628
[155]	validation_0-logloss:0.379616
[156]	validation_0-logloss:0.379614
[157]	validation_0-logloss:0.379611
[158]	validation_0-logloss:0.379615
[159]	validation_0-logloss:0.379606
[160]	validation_0-logloss:0.379606
[161]	validation_0-logloss:0.379606
[162]	validation_0-logloss:0.37961
[163]	validation_0-logloss:0.379607
[164]	validation_0-logloss:0.37961
[165]	validation_0-logloss:0.379582
[166]	validation_0-logloss:0.37958
[167]	validation_0-logloss:0.379581
[168]	validation_0-logloss:0.379596
[169]	validation_0-logloss:0.379595
[170]	validation_0-logloss:0.379537
[171]	validation_0-logloss:0.379532
[172]	validation_0-logloss:0.379521
[173]	validation_0-logloss:0.379522
[174]	validation_0-logloss:0.37953
[175]	validation_0-logloss:0.379523
[176]	validation_0-logloss:0.379524
[177]	validation_0-logloss:0.379485
[178]	validation_0-logloss:0.379482
[179]	validation_0-logloss:0.379476
[180]	validation_0-logloss:0.379476
[181]	validation_0-logloss:0.379474
[182]	validation_0-logloss:0.379472
[183]	validation_0-logloss:0.379472
[184]	validation_0-logloss:0.37946
[185]	validation_0-logloss:0.379463
[186]	validation_0-logloss:0.379462
[187]	validation_0-logloss:0.379462
[188]	validation_0-logloss:0.37946
[189]	validation_0-logloss:0.37946
[190]	validation_0-logloss:0.37946
[191]	validation_0-logloss:0.379466
[192]	validation_0-logloss:0.379424
[193]	validation_0-logloss:0.379423
[194]	validation_0-logloss:0.379422
[195]	validation_0-logloss:0.379423
[196]	validation_0-logloss:0.379422
[197]	validation_0-logloss:0.379363
[198]	validation_0-logloss:0.379363
[199]	validation_0-logloss:0.379363
[200]	validation_0-logloss:0.379362
[201]	validation_0-logloss:0.379349
[202]	validation_0-logloss:0.379349
[203]	validation_0-logloss:0.379349
[204]	validation_0-logloss:0.379276
[205]	validation_0-logloss:0.37926
[206]	validation_0-logloss:0.379263
[207]	validation_0-logloss:0.379195
[208]	validation_0-logloss:0.379192
[209]	validation_0-logloss:0.379187
[210]	validation_0-logloss:0.379186
[211]	validation_0-logloss:0.379189
[212]	validation_0-logloss:0.379189
[213]	validation_0-logloss:0.379185
[214]	validation_0-logloss:0.379186
[215]	validation_0-logloss:0.379186
[216]	validation_0-logloss:0.379185
[217]	validation_0-logloss:0.379185
[218]	validation_0-logloss:0.379186
[219]	validation_0-logloss:0.379185
[220]	validation_0-logloss:0.379147
[221]	validation_0-logloss:0.379152
[222]	validation_0-logloss:0.379149
[223]	validation_0-logloss:0.379156
[224]	validation_0-logloss:0.37916
[225]	validation_0-logloss:0.379187
[226]	validation_0-logloss:0.379198
[227]	validation_0-logloss:0.379198
[228]	validation_0-logloss:0.379189
[229]	validation_0-logloss:0.37921
[230]	validation_0-logloss:0.379192
[231]	validation_0-logloss:0.379182
[232]	validation_0-logloss:0.379175
[233]	validation_0-logloss:0.379192
[234]	validation_0-logloss:0.379173
[235]	validation_0-logloss:0.379163
[236]	validation_0-logloss:0.379159
[237]	validation_0-logloss:0.37912
[238]	validation_0-logloss:0.379118
[239]	validation_0-logloss:0.379118
[240]	validation_0-logloss:0.379119
[241]	validation_0-logloss:0.379107
[242]	validation_0-logloss:0.379107
[243]	validation_0-logloss:0.379108
[244]	validation_0-logloss:0.379108
[245]	validation_0-logloss:0.37911
[246]	validation_0-logloss:0.379115
[247]	validation_0-logloss:0.379108
[248]	validation_0-logloss:0.37911
[249]	validation_0-logloss:0.37911
[250]	validation_0-logloss:0.37911
[251]	validation_0-logloss:0.37912
[252]	validation_0-logloss:0.37911
[253]	validation_0-logloss:0.379109
[254]	validation_0-logloss:0.37911
[255]	validation_0-logloss:0.379121
[256]	validation_0-logloss:0.379121
[257]	validation_0-logloss:0.379115
[258]	validation_0-logloss:0.379112
[259]	validation_0-logloss:0.379112
[260]	validation_0-logloss:0.379115
[261]	validation_0-logloss:0.37906
[262]	validation_0-logloss:0.37906
[263]	validation_0-logloss:0.379067
[264]	validation_0-logloss:0.379092
[265]	validation_0-logloss:0.37909
[266]	validation_0-logloss:0.37909
[267]	validation_0-logloss:0.378978
[268]	validation_0-logloss:0.378958
[269]	validation_0-logloss:0.378943
[270]	validation_0-logloss:0.378942
[271]	validation_0-logloss:0.378948
[272]	validation_0-logloss:0.378935
[273]	validation_0-logloss:0.378932
[274]	validation_0-logloss:0.378934
[275]	validation_0-logloss:0.378931
[276]	validation_0-logloss:0.37888
[277]	validation_0-logloss:0.378872
[278]	validation_0-logloss:0.378875
[279]	validation_0-logloss:0.378855
[280]	validation_0-logloss:0.378862
[281]	validation_0-logloss:0.378871
[282]	validation_0-logloss:0.378867
[283]	validation_0-logloss:0.378854
[284]	validation_0-logloss:0.378859
[285]	validation_0-logloss:0.378855
[286]	validation_0-logloss:0.37883
[287]	validation_0-logloss:0.378834
[288]	validation_0-logloss:0.378842
[289]	validation_0-logloss:0.378841
[290]	validation_0-logloss:0.378847
[291]	validation_0-logloss:0.378723
[292]	validation_0-logloss:0.378719
[293]	validation_0-logloss:0.378667
[294]	validation_0-logloss:0.378659
[295]	validation_0-logloss:0.378658
[296]	validation_0-logloss:0.378658
[297]	validation_0-logloss:0.378658
[298]	validation_0-logloss:0.378661
[299]	validation_0-logloss:0.378661
[300]	validation_0-logloss:0.378659
[301]	validation_0-logloss:0.378662
[302]	validation_0-logloss:0.378666
[303]	validation_0-logloss:0.37866
[304]	validation_0-logloss:0.378599
[305]	validation_0-logloss:0.378609
[306]	validation_0-logloss:0.378617
[307]	validation_0-logloss:0.378626
[308]	validation_0-logloss:0.378634
[309]	validation_0-logloss:0.378653
[310]	validation_0-logloss:0.378651
[311]	validation_0-logloss:0.37864
[312]	validation_0-logloss:0.378631
[313]	validation_0-logloss:0.378612
[314]	validation_0-logloss:0.378607
[315]	validation_0-logloss:0.378567
[316]	validation_0-logloss:0.378565
[317]	validation_0-logloss:0.378565
[318]	validation_0-logloss:0.378569
[319]	validation_0-logloss:0.378567
[320]	validation_0-logloss:0.378565
[321]	validation_0-logloss:0.378565
[322]	validation_0-logloss:0.378565
[323]	validation_0-logloss:0.378572
[324]	validation_0-logloss:0.378571
[325]	validation_0-logloss:0.378457
[326]	validation_0-logloss:0.378458
[327]	validation_0-logloss:0.378456
[328]	validation_0-logloss:0.378458
[329]	validation_0-logloss:0.378462
[330]	validation_0-logloss:0.37847
[331]	validation_0-logloss:0.378457
[332]	validation_0-logloss:0.378459
[333]	validation_0-logloss:0.378462
[334]	validation_0-logloss:0.378457
[335]	validation_0-logloss:0.378456
[336]	validation_0-logloss:0.378456
[337]	validation_0-logloss:0.378459
[338]	validation_0-logloss:0.378461
[339]	validation_0-logloss:0.378458
[340]	validation_0-logloss:0.378456
[341]	validation_0-logloss:0.378457
[342]	validation_0-logloss:0.378383
[343]	validation_0-logloss:0.378353
[344]	validation_0-logloss:0.378353
[345]	validation_0-logloss:0.378352
[346]	validation_0-logloss:0.378287
[347]	validation_0-logloss:0.378288
[348]	validation_0-logloss:0.378289
[349]	validation_0-logloss:0.378289
[350]	validation_0-logloss:0.37829
[351]	validation_0-logloss:0.378289
[352]	validation_0-logloss:0.37829
[353]	validation_0-logloss:0.37829
[354]	validation_0-logloss:0.37829
[355]	validation_0-logloss:0.378291
[356]	validation_0-logloss:0.378288
[357]	validation_0-logloss:0.378289
[358]	validation_0-logloss:0.378294
[359]	validation_0-logloss:0.37829
[360]	validation_0-logloss:0.378297
[361]	validation_0-logloss:0.378288
[362]	validation_0-logloss:0.378292
[363]	validation_0-logloss:0.378309
[364]	validation_0-logloss:0.378308
[365]	validation_0-logloss:0.378222
[366]	validation_0-logloss:0.378217
[367]	validation_0-logloss:0.378216
[368]	validation_0-logloss:0.378208
[369]	validation_0-logloss:0.378209
[370]	validation_0-logloss:0.378168
[371]	validation_0-logloss:0.378163
[372]	validation_0-logloss:0.378162
[373]	validation_0-logloss:0.378166
[374]	validation_0-logloss:0.378161
[375]	validation_0-logloss:0.378168
[376]	validation_0-logloss:0.378165
[377]	validation_0-logloss:0.378166
[378]	validation_0-logloss:0.378176
[379]	validation_0-logloss:0.378173
[380]	validation_0-logloss:0.378165
[381]	validation_0-logloss:0.378146
[382]	validation_0-logloss:0.378145
[383]	validation_0-logloss:0.378152
[384]	validation_0-logloss:0.378156
[385]	validation_0-logloss:0.378148
[386]	validation_0-logloss:0.378147
[387]	validation_0-logloss:0.37812
[388]	validation_0-logloss:0.378122
[389]	validation_0-logloss:0.37812
[390]	validation_0-logloss:0.378118
[391]	validation_0-logloss:0.378115
[392]	validation_0-logloss:0.378113
[393]	validation_0-logloss:0.378115
[394]	validation_0-logloss:0.378115
[395]	validation_0-logloss:0.378114
[396]	validation_0-logloss:0.378113
[397]	validation_0-logloss:0.378113
[398]	validation_0-logloss:0.378115
[399]	validation_0-logloss:0.378066
[400]	validation_0-logloss:0.378068
[401]	validation_0-logloss:0.378074
[402]	validation_0-logloss:0.37806
[403]	validation_0-logloss:0.37806
[404]	validation_0-logloss:0.378064
[405]	validation_0-logloss:0.378056
[406]	validation_0-logloss:0.37806
[407]	validation_0-logloss:0.378062
[408]	validation_0-logloss:0.378058
[409]	validation_0-logloss:0.378069
[410]	validation_0-logloss:0.378062
[411]	validation_0-logloss:0.378062
[412]	validation_0-logloss:0.378057
[413]	validation_0-logloss:0.378063
[414]	validation_0-logloss:0.378061
[415]	validation_0-logloss:0.378062
[416]	validation_0-logloss:0.378061
[417]	validation_0-logloss:0.378063
[418]	validation_0-logloss:0.378062
[419]	validation_0-logloss:0.378068
[420]	validation_0-logloss:0.378064
[421]	validation_0-logloss:0.378059
[422]	validation_0-logloss:0.37806
[423]	validation_0-logloss:0.378056
[424]	validation_0-logloss:0.378062
[425]	validation_0-logloss:0.378061
[426]	validation_0-logloss:0.378041
[427]	validation_0-logloss:0.378046
[428]	validation_0-logloss:0.37804
[429]	validation_0-logloss:0.378039
[430]	validation_0-logloss:0.378048
[431]	validation_0-logloss:0.378048
[432]	validation_0-logloss:0.378059
[433]	validation_0-logloss:0.378048
[434]	validation_0-logloss:0.378048
[435]	validation_0-logloss:0.378048
[436]	validation_0-logloss:0.37805
[437]	validation_0-logloss:0.378031
[438]	validation_0-logloss:0.378031
[439]	validation_0-logloss:0.378017
[440]	validation_0-logloss:0.37801
[441]	validation_0-logloss:0.378009
[442]	validation_0-logloss:0.378017
[443]	validation_0-logloss:0.378028
[444]	validation_0-logloss:0.378017
[445]	validation_0-logloss:0.378007
[446]	validation_0-logloss:0.377963
[447]	validation_0-logloss:0.377964
[448]	validation_0-logloss:0.377964
[449]	validation_0-logloss:0.377964
[450]	validation_0-logloss:0.377963
[451]	validation_0-logloss:0.377963
[452]	validation_0-logloss:0.377964
[453]	validation_0-logloss:0.377963
[454]	validation_0-logloss:0.377964
[455]	validation_0-logloss:0.377963
[456]	validation_0-logloss:0.377963
[457]	validation_0-logloss:0.377964
[458]	validation_0-logloss:0.377969
[459]	validation_0-logloss:0.377968
[460]	validation_0-logloss:0.377972
[461]	validation_0-logloss:0.377978
[462]	validation_0-logloss:0.37798
[463]	validation_0-logloss:0.377998
[464]	validation_0-logloss:0.377996
[465]	validation_0-logloss:0.378015
[466]	validation_0-logloss:0.377997
[467]	validation_0-logloss:0.37797
[468]	validation_0-logloss:0.377926
[469]	validation_0-logloss:0.377914
[470]	validation_0-logloss:0.377915
[471]	validation_0-logloss:0.3779
[472]	validation_0-logloss:0.377893
[473]	validation_0-logloss:0.3779
[474]	validation_0-logloss:0.377888
[475]	validation_0-logloss:0.377889
[476]	validation_0-logloss:0.377907
[477]	validation_0-logloss:0.377894
[478]	validation_0-logloss:0.377891
[479]	validation_0-logloss:0.37789
[480]	validation_0-logloss:0.377892
[481]	validation_0-logloss:0.37789
[482]	validation_0-logloss:0.377888
[483]	validation_0-logloss:0.377888
[484]	validation_0-logloss:0.377887
[485]	validation_0-logloss:0.377888
[486]	validation_0-logloss:0.37789
[487]	validation_0-logloss:0.377888
[488]	validation_0-logloss:0.377888
[489]	validation_0-logloss:0.377888
[490]	validation_0-logloss:0.377889
[491]	validation_0-logloss:0.377888
[492]	validation_0-logloss:0.37789
[493]	validation_0-logloss:0.377887
[494]	validation_0-logloss:0.377885
[495]	validation_0-logloss:0.377883
[496]	validation_0-logloss:0.377884
[497]	validation_0-logloss:0.377856
[498]	validation_0-logloss:0.37786
[499]	validation_0-logloss:0.37787
[500]	validation_0-logloss:0.377871
[501]	validation_0-logloss:0.377883
[502]	validation_0-logloss:0.377887
[503]	validation_0-logloss:0.377881
[504]	validation_0-logloss:0.377862
[505]	validation_0-logloss:0.377855
[506]	validation_0-logloss:0.377878
[507]	validation_0-logloss:0.377907
[508]	validation_0-logloss:0.377893
[509]	validation_0-logloss:0.377909
[510]	validation_0-logloss:0.377895
[511]	validation_0-logloss:0.377897
[512]	validation_0-logloss:0.377872
[513]	validation_0-logloss:0.377883
[514]	validation_0-logloss:0.377877
[515]	validation_0-logloss:0.377856
[516]	validation_0-logloss:0.377848
[517]	validation_0-logloss:0.377852
[518]	validation_0-logloss:0.377862
[519]	validation_0-logloss:0.377879
[520]	validation_0-logloss:0.37787
[521]	validation_0-logloss:0.377879
[522]	validation_0-logloss:0.377903
[523]	validation_0-logloss:0.377872
[524]	validation_0-logloss:0.377864
[525]	validation_0-logloss:0.377882
[526]	validation_0-logloss:0.377855
[527]	validation_0-logloss:0.377768
[528]	validation_0-logloss:0.377766
[529]	validation_0-logloss:0.377795
[530]	validation_0-logloss:0.377757
[531]	validation_0-logloss:0.377758
[532]	validation_0-logloss:0.377754
[533]	validation_0-logloss:0.377749
[534]	validation_0-logloss:0.377746
[535]	validation_0-logloss:0.377742
[536]	validation_0-logloss:0.377739
[537]	validation_0-logloss:0.377739
[538]	validation_0-logloss:0.377738
[539]	validation_0-logloss:0.377739
[540]	validation_0-logloss:0.377781
[541]	validation_0-logloss:0.377782
[542]	validation_0-logloss:0.377784
[543]	validation_0-logloss:0.377799
[544]	validation_0-logloss:0.377798
[545]	validation_0-logloss:0.377817
[546]	validation_0-logloss:0.377821
[547]	validation_0-logloss:0.377818
[548]	validation_0-logloss:0.377818
[549]	validation_0-logloss:0.377817
[550]	validation_0-logloss:0.377818
[551]	validation_0-logloss:0.377814
[552]	validation_0-logloss:0.377799
[553]	validation_0-logloss:0.377828
[554]	validation_0-logloss:0.377829
[555]	validation_0-logloss:0.37782
[556]	validation_0-logloss:0.377765
[557]	validation_0-logloss:0.377772
[558]	validation_0-logloss:0.377765
[559]	validation_0-logloss:0.377765
[560]	validation_0-logloss:0.377731
[561]	validation_0-logloss:0.37773
[562]	validation_0-logloss:0.377731
[563]	validation_0-logloss:0.377732
[564]	validation_0-logloss:0.377655
[565]	validation_0-logloss:0.377655
[566]	validation_0-logloss:0.377658
[567]	validation_0-logloss:0.377663
[568]	validation_0-logloss:0.377671
[569]	validation_0-logloss:0.377657
[570]	validation_0-logloss:0.377672
[571]	validation_0-logloss:0.377665
[572]	validation_0-logloss:0.377679
[573]	validation_0-logloss:0.377664
[574]	validation_0-logloss:0.377658
[575]	validation_0-logloss:0.377668
[576]	validation_0-logloss:0.37767
[577]	validation_0-logloss:0.377687
[578]	validation_0-logloss:0.377687
[579]	validation_0-logloss:0.37768
[580]	validation_0-logloss:0.37769
[581]	validation_0-logloss:0.377693
[582]	validation_0-logloss:0.377685
[583]	validation_0-logloss:0.377679
[584]	validation_0-logloss:0.377627
[585]	validation_0-logloss:0.37765
[586]	validation_0-logloss:0.377666
[587]	validation_0-logloss:0.377647
[588]	validation_0-logloss:0.377627
[589]	validation_0-logloss:0.377625
[590]	validation_0-logloss:0.377598
[591]	validation_0-logloss:0.377598
[592]	validation_0-logloss:0.377596
[593]	validation_0-logloss:0.377598
[594]	validation_0-logloss:0.377596
[595]	validation_0-logloss:0.377596
[596]	validation_0-logloss:0.377596
[597]	validation_0-logloss:0.377589
[598]	validation_0-logloss:0.377589
[599]	validation_0-logloss:0.37759
[600]	validation_0-logloss:0.37759
[601]	validation_0-logloss:0.377591
[602]	validation_0-logloss:0.377598
[603]	validation_0-logloss:0.377603
[604]	validation_0-logloss:0.377602
[605]	validation_0-logloss:0.377601
[606]	validation_0-logloss:0.377601
[607]	validation_0-logloss:0.377601
[608]	validation_0-logloss:0.377517
[609]	validation_0-logloss:0.377525
[610]	validation_0-logloss:0.37756
[611]	validation_0-logloss:0.377557
[612]	validation_0-logloss:0.377547
[613]	validation_0-logloss:0.377519
[614]	validation_0-logloss:0.377513
[615]	validation_0-logloss:0.377519
[616]	validation_0-logloss:0.377523
[617]	validation_0-logloss:0.377512
[618]	validation_0-logloss:0.377511
[619]	validation_0-logloss:0.377513
[620]	validation_0-logloss:0.377516
[621]	validation_0-logloss:0.377428
[622]	validation_0-logloss:0.377401
[623]	validation_0-logloss:0.377403
[624]	validation_0-logloss:0.377401
[625]	validation_0-logloss:0.377398
[626]	validation_0-logloss:0.377409
[627]	validation_0-logloss:0.377394
[628]	validation_0-logloss:0.377393
[629]	validation_0-logloss:0.377393
[630]	validation_0-logloss:0.377395
[631]	validation_0-logloss:0.377393
[632]	validation_0-logloss:0.377393
[633]	validation_0-logloss:0.377393
[634]	validation_0-logloss:0.377393
[635]	validation_0-logloss:0.377393
[636]	validation_0-logloss:0.377393
[637]	validation_0-logloss:0.377393
[638]	validation_0-logloss:0.377394
[639]	validation_0-logloss:0.377393
[640]	validation_0-logloss:0.377394
[641]	validation_0-logloss:0.377394
[642]	validation_0-logloss:0.377393
[643]	validation_0-logloss:0.377382
[644]	validation_0-logloss:0.377384
[645]	validation_0-logloss:0.377413
[646]	validation_0-logloss:0.377411
[647]	validation_0-logloss:0.377392
[648]	validation_0-logloss:0.377392
[649]	validation_0-logloss:0.377398
[650]	validation_0-logloss:0.377397
[651]	validation_0-logloss:0.377383
[652]	validation_0-logloss:0.377383
[653]	validation_0-logloss:0.377383
[654]	validation_0-logloss:0.377382
[655]	validation_0-logloss:0.377392
[656]	validation_0-logloss:0.377392
[657]	validation_0-logloss:0.377392
[658]	validation_0-logloss:0.377393
[659]	validation_0-logloss:0.377391
[660]	validation_0-logloss:0.377391
[661]	validation_0-logloss:0.377394
[662]	validation_0-logloss:0.377394
[663]	validation_0-logloss:0.377391
[664]	validation_0-logloss:0.377393
[665]	validation_0-logloss:0.377365
[666]	validation_0-logloss:0.377365
[667]	validation_0-logloss:0.377365
[668]	validation_0-logloss:0.377366
[669]	validation_0-logloss:0.377365
[670]	validation_0-logloss:0.377365
[671]	validation_0-logloss:0.377365
[672]	validation_0-logloss:0.377365
[673]	validation_0-logloss:0.377368
[674]	validation_0-logloss:0.377366
[675]	validation_0-logloss:0.377365
[676]	validation_0-logloss:0.377366
[677]	validation_0-logloss:0.377372
[678]	validation_0-logloss:0.37737
[679]	validation_0-logloss:0.377375
[680]	validation_0-logloss:0.377377
[681]	validation_0-logloss:0.37737
[682]	validation_0-logloss:0.377375
[683]	validation_0-logloss:0.37738
[684]	validation_0-logloss:0.377382
[685]	validation_0-logloss:0.377387
[686]	validation_0-logloss:0.377377
[687]	validation_0-logloss:0.37738
[688]	validation_0-logloss:0.377386
[689]	validation_0-logloss:0.377395
[690]	validation_0-logloss:0.377393
[691]	validation_0-logloss:0.377393
[692]	validation_0-logloss:0.377393
[693]	validation_0-logloss:0.377388
[694]	validation_0-logloss:0.377388
[695]	validation_0-logloss:0.377389
[696]	validation_0-logloss:0.377364
[697]	validation_0-logloss:0.377363
[698]	validation_0-logloss:0.377367
[699]	validation_0-logloss:0.377371
[700]	validation_0-logloss:0.377438
[701]	validation_0-logloss:0.377429
[702]	validation_0-logloss:0.377408
[703]	validation_0-logloss:0.377338
[704]	validation_0-logloss:0.37735
[705]	validation_0-logloss:0.377381
[706]	validation_0-logloss:0.377292
[707]	validation_0-logloss:0.377325
[708]	validation_0-logloss:0.377325
[709]	validation_0-logloss:0.377276
[710]	validation_0-logloss:0.377274
[711]	validation_0-logloss:0.377266
[712]	validation_0-logloss:0.37726
[713]	validation_0-logloss:0.377259
[714]	validation_0-logloss:0.377262
[715]	validation_0-logloss:0.377264
[716]	validation_0-logloss:0.377261
[717]	validation_0-logloss:0.377211
[718]	validation_0-logloss:0.377212
[719]	validation_0-logloss:0.377209
[720]	validation_0-logloss:0.377201
[721]	validation_0-logloss:0.37721
[722]	validation_0-logloss:0.377198
[723]	validation_0-logloss:0.377194
[724]	validation_0-logloss:0.37723
[725]	validation_0-logloss:0.377202
[726]	validation_0-logloss:0.377204
[727]	validation_0-logloss:0.377214
[728]	validation_0-logloss:0.37721
[729]	validation_0-logloss:0.37719
[730]	validation_0-logloss:0.377191
[731]	validation_0-logloss:0.377196
[732]	validation_0-logloss:0.377198
[733]	validation_0-logloss:0.377191
[734]	validation_0-logloss:0.37719
[735]	validation_0-logloss:0.37719
[736]	validation_0-logloss:0.37719
[737]	validation_0-logloss:0.37719
[738]	validation_0-logloss:0.37719
[739]	validation_0-logloss:0.377191
[740]	validation_0-logloss:0.37719
[741]	validation_0-logloss:0.377213
[742]	validation_0-logloss:0.377213
[743]	validation_0-logloss:0.377211
[744]	validation_0-logloss:0.377211
[745]	validation_0-logloss:0.377211
[746]	validation_0-logloss:0.377211
[747]	validation_0-logloss:0.377211
[748]	validation_0-logloss:0.377212
[749]	validation_0-logloss:0.377212
[750]	validation_0-logloss:0.377212
[751]	validation_0-logloss:0.377184
[752]	validation_0-logloss:0.377185
[753]	validation_0-logloss:0.377182
[754]	validation_0-logloss:0.377182
[755]	validation_0-logloss:0.377182
[756]	validation_0-logloss:0.377184
[757]	validation_0-logloss:0.377185
[758]	validation_0-logloss:0.377182
[759]	validation_0-logloss:0.377182
[760]	validation_0-logloss:0.377182
[761]	validation_0-logloss:0.377182
[762]	validation_0-logloss:0.377182
[763]	validation_0-logloss:0.377186
[764]	validation_0-logloss:0.377197
[765]	validation_0-logloss:0.377197
[766]	validation_0-logloss:0.377185
[767]	validation_0-logloss:0.377197
[768]	validation_0-logloss:0.37719
[769]	validation_0-logloss:0.377188
[770]	validation_0-logloss:0.377189
[771]	validation_0-logloss:0.377187
[772]	validation_0-logloss:0.377177
[773]	validation_0-logloss:0.377172
[774]	validation_0-logloss:0.377179
[775]	validation_0-logloss:0.377178
[776]	validation_0-logloss:0.377185
[777]	validation_0-logloss:0.377194
[778]	validation_0-logloss:0.377228
[779]	validation_0-logloss:0.377218
[780]	validation_0-logloss:0.377208
[781]	validation_0-logloss:0.37721
[782]	validation_0-logloss:0.377208
[783]	validation_0-logloss:0.377209
[784]	validation_0-logloss:0.377204
[785]	validation_0-logloss:0.377205
[786]	validation_0-logloss:0.377203
[787]	validation_0-logloss:0.377212
[788]	validation_0-logloss:0.377209
[789]	validation_0-logloss:0.377174
[790]	validation_0-logloss:0.377179
[791]	validation_0-logloss:0.377176
[792]	validation_0-logloss:0.377172
[793]	validation_0-logloss:0.377174
[794]	validation_0-logloss:0.377167
[795]	validation_0-logloss:0.377172
[796]	validation_0-logloss:0.377139
[797]	validation_0-logloss:0.377136
[798]	validation_0-logloss:0.377134
[799]	validation_0-logloss:0.377144
[800]	validation_0-logloss:0.377148
[801]	validation_0-logloss:0.377149
[802]	validation_0-logloss:0.377135
[803]	validation_0-logloss:0.377113
[804]	validation_0-logloss:0.377122
[805]	validation_0-logloss:0.377117
[806]	validation_0-logloss:0.377114
[807]	validation_0-logloss:0.377095
[808]	validation_0-logloss:0.377087
[809]	validation_0-logloss:0.377086
[810]	validation_0-logloss:0.377085
[811]	validation_0-logloss:0.377109
[812]	validation_0-logloss:0.377123
[813]	validation_0-logloss:0.377124
[814]	validation_0-logloss:0.377123
[815]	validation_0-logloss:0.377079
[816]	validation_0-logloss:0.377087
[817]	validation_0-logloss:0.377091
[818]	validation_0-logloss:0.377091
[819]	validation_0-logloss:0.377091
[820]	validation_0-logloss:0.377091
[821]	validation_0-logloss:0.377093
[822]	validation_0-logloss:0.377093
[823]	validation_0-logloss:0.377091
[824]	validation_0-logloss:0.377096
[825]	validation_0-logloss:0.377098
[826]	validation_0-logloss:0.377112
[827]	validation_0-logloss:0.377117
[828]	validation_0-logloss:0.37714
[829]	validation_0-logloss:0.377131
[830]	validation_0-logloss:0.37714
[831]	validation_0-logloss:0.377128
[832]	validation_0-logloss:0.377103
[833]	validation_0-logloss:0.377095
[834]	validation_0-logloss:0.377093
[835]	validation_0-logloss:0.377086
[836]	validation_0-logloss:0.377086
[837]	validation_0-logloss:0.377086
[838]	validation_0-logloss:0.37709
[839]	validation_0-logloss:0.377101
[840]	validation_0-logloss:0.377095
[841]	validation_0-logloss:0.377096
[842]	validation_0-logloss:0.377093
[843]	validation_0-logloss:0.377089
[844]	validation_0-logloss:0.377011
[845]	validation_0-logloss:0.37702
[846]	validation_0-logloss:0.377041
[847]	validation_0-logloss:0.377048
[848]	validation_0-logloss:0.377037
[849]	validation_0-logloss:0.37703
[850]	validation_0-logloss:0.377045
[851]	validation_0-logloss:0.377039
[852]	validation_0-logloss:0.377052
[853]	validation_0-logloss:0.377052
[854]	validation_0-logloss:0.377043
[855]	validation_0-logloss:0.377041
[856]	validation_0-logloss:0.377039
[857]	validation_0-logloss:0.377038
[858]	validation_0-logloss:0.377043
[859]	validation_0-logloss:0.377036
[860]	validation_0-logloss:0.377036
[861]	validation_0-logloss:0.377036
[862]	validation_0-logloss:0.377036
[863]	validation_0-logloss:0.377036
[864]	validation_0-logloss:0.377037
[865]	validation_0-logloss:0.377036
[866]	validation_0-logloss:0.377036
[867]	validation_0-logloss:0.377036
[868]	validation_0-logloss:0.377037
[869]	validation_0-logloss:0.377039
[870]	validation_0-logloss:0.377036
[871]	validation_0-logloss:0.377036
[872]	validation_0-logloss:0.377042
[873]	validation_0-logloss:0.377036
[874]	validation_0-logloss:0.377036
[875]	validation_0-logloss:0.377039
[876]	validation_0-logloss:0.377039
[877]	validation_0-logloss:0.377041
[878]	validation_0-logloss:0.377045
[879]	validation_0-logloss:0.377053
[880]	validation_0-logloss:0.377033
[881]	validation_0-logloss:0.377016
[882]	validation_0-logloss:0.376997
[883]	validation_0-logloss:0.377
[884]	validation_0-logloss:0.376998
[885]	validation_0-logloss:0.376997
[886]	validation_0-logloss:0.376999
[887]	validation_0-logloss:0.377014
[888]	validation_0-logloss:0.377017
[889]	validation_0-logloss:0.377022
[890]	validation_0-logloss:0.377008
[891]	validation_0-logloss:0.377019
[892]	validation_0-logloss:0.377004
[893]	validation_0-logloss:0.376999
[894]	validation_0-logloss:0.376998
[895]	validation_0-logloss:0.376999
[896]	validation_0-logloss:0.377
[897]	validation_0-logloss:0.377025
[898]	validation_0-logloss:0.377021
[899]	validation_0-logloss:0.377019
[900]	validation_0-logloss:0.377019
[901]	validation_0-logloss:0.37702
[902]	validation_0-logloss:0.377019
[903]	validation_0-logloss:0.377019
[904]	validation_0-logloss:0.377021
[905]	validation_0-logloss:0.377019
[906]	validation_0-logloss:0.377025
[907]	validation_0-logloss:0.377024
[908]	validation_0-logloss:0.377026
[909]	validation_0-logloss:0.377029
[910]	validation_0-logloss:0.377014
[911]	validation_0-logloss:0.377006
[912]	validation_0-logloss:0.377003
[913]	validation_0-logloss:0.376998
[914]	validation_0-logloss:0.377
[915]	validation_0-logloss:0.377006
[916]	validation_0-logloss:0.376997
[917]	validation_0-logloss:0.377
[918]	validation_0-logloss:0.376988
[919]	validation_0-logloss:0.376989
[920]	validation_0-logloss:0.376989
[921]	validation_0-logloss:0.376987
[922]	validation_0-logloss:0.376985
[923]	validation_0-logloss:0.376942
[924]	validation_0-logloss:0.376943
[925]	validation_0-logloss:0.376942
[926]	validation_0-logloss:0.376941
[927]	validation_0-logloss:0.376941
[928]	validation_0-logloss:0.37694
[929]	validation_0-logloss:0.376941
[930]	validation_0-logloss:0.376939
[931]	validation_0-logloss:0.376941
[932]	validation_0-logloss:0.376938
[933]	validation_0-logloss:0.376938
[934]	validation_0-logloss:0.376938
[935]	validation_0-logloss:0.376941
[936]	validation_0-logloss:0.376938
[937]	validation_0-logloss:0.376945
[938]	validation_0-logloss:0.376957
[939]	validation_0-logloss:0.376956
[940]	validation_0-logloss:0.37695
[941]	validation_0-logloss:0.376938
[942]	validation_0-logloss:0.376939
[943]	validation_0-logloss:0.376942
[944]	validation_0-logloss:0.376966
[945]	validation_0-logloss:0.376981
[946]	validation_0-logloss:0.376969
[947]	validation_0-logloss:0.376963
[948]	validation_0-logloss:0.376962
[949]	validation_0-logloss:0.37698
[950]	validation_0-logloss:0.376977
[951]	validation_0-logloss:0.376981
[952]	validation_0-logloss:0.37698
[953]	validation_0-logloss:0.376977
[954]	validation_0-logloss:0.376987
[955]	validation_0-logloss:0.376987
[956]	validation_0-logloss:0.376973
[957]	validation_0-logloss:0.376979
[958]	validation_0-logloss:0.376973
[959]	validation_0-logloss:0.376967
[960]	validation_0-logloss:0.376966
[961]	validation_0-logloss:0.376971
[962]	validation_0-logloss:0.376961
[963]	validation_0-logloss:0.376954
[964]	validation_0-logloss:0.377003
[965]	validation_0-logloss:0.376998
[966]	validation_0-logloss:0.376994
[967]	validation_0-logloss:0.376995
[968]	validation_0-logloss:0.377008
[969]	validation_0-logloss:0.377012
[970]	validation_0-logloss:0.377003
[971]	validation_0-logloss:0.377009
[972]	validation_0-logloss:0.377018
[973]	validation_0-logloss:0.37701
[974]	validation_0-logloss:0.377011
[975]	validation_0-logloss:0.376997
[976]	validation_0-logloss:0.376996
[977]	validation_0-logloss:0.376996
[978]	validation_0-logloss:0.376995
[979]	validation_0-logloss:0.376993
[980]	validation_0-logloss:0.376996
[981]	validation_0-logloss:0.377007
[982]	validation_0-logloss:0.377011
[983]	validation_0-logloss:0.377005
[984]	validation_0-logloss:0.377005
[985]	validation_0-logloss:0.376994
[986]	validation_0-logloss:0.376992
[987]	validation_0-logloss:0.376988
[988]	validation_0-logloss:0.37699
[989]	validation_0-logloss:0.376988
[990]	validation_0-logloss:0.376988
[991]	validation_0-logloss:0.37699
[992]	validation_0-logloss:0.376996
[993]	validation_0-logloss:0.376946
[994]	validation_0-logloss:0.376945
[995]	validation_0-logloss:0.376945
[996]	validation_0-logloss:0.376924
[997]	validation_0-logloss:0.37693
[998]	validation_0-logloss:0.376935
[999]	validation_0-logloss:0.376933
[1000]	validation_0-logloss:0.376929
[1001]	validation_0-logloss:0.376932
[1002]	validation_0-logloss:0.376921
[1003]	validation_0-logloss:0.376929
[1004]	validation_0-logloss:0.376933
[1005]	validation_0-logloss:0.376929
[1006]	validation_0-logloss:0.376933
[1007]	validation_0-logloss:0.37693
[1008]	validation_0-logloss:0.376943
[1009]	validation_0-logloss:0.376936
[1010]	validation_0-logloss:0.376919
[1011]	validation_0-logloss:0.376912
[1012]	validation_0-logloss:0.376912
[1013]	validation_0-logloss:0.376912
[1014]	validation_0-logloss:0.376912
[1015]	validation_0-logloss:0.376912
[1016]	validation_0-logloss:0.376912
[1017]	validation_0-logloss:0.376915
[1018]	validation_0-logloss:0.37692
[1019]	validation_0-logloss:0.376923
[1020]	validation_0-logloss:0.376928
[1021]	validation_0-logloss:0.376932
[1022]	validation_0-logloss:0.376937
[1023]	validation_0-logloss:0.376929
[1024]	validation_0-logloss:0.376931
[1025]	validation_0-logloss:0.376923
[1026]	validation_0-logloss:0.376929
[1027]	validation_0-logloss:0.376918
[1028]	validation_0-logloss:0.376925
[1029]	validation_0-logloss:0.376917
[1030]	validation_0-logloss:0.376916
[1031]	validation_0-logloss:0.376908
[1032]	validation_0-logloss:0.376909
[1033]	validation_0-logloss:0.376908
[1034]	validation_0-logloss:0.376907
[1035]	validation_0-logloss:0.376908
[1036]	validation_0-logloss:0.376908
[1037]	validation_0-logloss:0.376888
[1038]	validation_0-logloss:0.37689
[1039]	validation_0-logloss:0.376894
[1040]	validation_0-logloss:0.376901
[1041]	validation_0-logloss:0.37691
[1042]	validation_0-logloss:0.376915
[1043]	validation_0-logloss:0.376893
[1044]	validation_0-logloss:0.376906
[1045]	validation_0-logloss:0.376906
[1046]	validation_0-logloss:0.376881
[1047]	validation_0-logloss:0.376895
[1048]	validation_0-logloss:0.376887
[1049]	validation_0-logloss:0.376887
[1050]	validation_0-logloss:0.376888
[1051]	validation_0-logloss:0.376888
[1052]	validation_0-logloss:0.376888
[1053]	validation_0-logloss:0.37689
[1054]	validation_0-logloss:0.376908
[1055]	validation_0-logloss:0.376882
[1056]	validation_0-logloss:0.376882
[1057]	validation_0-logloss:0.376884
[1058]	validation_0-logloss:0.376882
[1059]	validation_0-logloss:0.376881
[1060]	validation_0-logloss:0.376886
[1061]	validation_0-logloss:0.376884
[1062]	validation_0-logloss:0.37689
[1063]	validation_0-logloss:0.376892
[1064]	validation_0-logloss:0.376896
[1065]	validation_0-logloss:0.376885
[1066]	validation_0-logloss:0.376844
[1067]	validation_0-logloss:0.376845
[1068]	validation_0-logloss:0.376845
[1069]	validation_0-logloss:0.376861
[1070]	validation_0-logloss:0.376866
[1071]	validation_0-logloss:0.37687
[1072]	validation_0-logloss:0.376883
[1073]	validation_0-logloss:0.376876
[1074]	validation_0-logloss:0.376864
[1075]	validation_0-logloss:0.376843
[1076]	validation_0-logloss:0.376843
[1077]	validation_0-logloss:0.376841
[1078]	validation_0-logloss:0.376841
[1079]	validation_0-logloss:0.376842
[1080]	validation_0-logloss:0.376847
[1081]	validation_0-logloss:0.376857
[1082]	validation_0-logloss:0.376856
[1083]	validation_0-logloss:0.376844
[1084]	validation_0-logloss:0.376849
[1085]	validation_0-logloss:0.376863
[1086]	validation_0-logloss:0.376845
[1087]	validation_0-logloss:0.376841
[1088]	validation_0-logloss:0.376834
[1089]	validation_0-logloss:0.376835
[1090]	validation_0-logloss:0.376834
[1091]	validation_0-logloss:0.376834
[1092]	validation_0-logloss:0.376836
[1093]	validation_0-logloss:0.376845
[1094]	validation_0-logloss:0.376877
[1095]	validation_0-logloss:0.376876
[1096]	validation_0-logloss:0.376867
[1097]	validation_0-logloss:0.376821
[1098]	validation_0-logloss:0.376825
[1099]	validation_0-logloss:0.376835
[1100]	validation_0-logloss:0.376832
[1101]	validation_0-logloss:0.37684
[1102]	validation_0-logloss:0.376847
[1103]	validation_0-logloss:0.376858
[1104]	validation_0-logloss:0.376858
[1105]	validation_0-logloss:0.376863
[1106]	validation_0-logloss:0.376874
[1107]	validation_0-logloss:0.376906
[1108]	validation_0-logloss:0.37689
[1109]	validation_0-logloss:0.376883
[1110]	validation_0-logloss:0.37688
[1111]	validation_0-logloss:0.376886
[1112]	validation_0-logloss:0.376891
[1113]	validation_0-logloss:0.376881
[1114]	validation_0-logloss:0.376878
[1115]	validation_0-logloss:0.376868
[1116]	validation_0-logloss:0.376866
[1117]	validation_0-logloss:0.376863
[1118]	validation_0-logloss:0.376864
[1119]	validation_0-logloss:0.376902
[1120]	validation_0-logloss:0.376885
[1121]	validation_0-logloss:0.376879
[1122]	validation_0-logloss:0.376879
[1123]	validation_0-logloss:0.376855
[1124]	validation_0-logloss:0.376855
[1125]	validation_0-logloss:0.376855
[1126]	validation_0-logloss:0.376855
[1127]	validation_0-logloss:0.376854
[1128]	validation_0-logloss:0.376854
[1129]	validation_0-logloss:0.376854
[1130]	validation_0-logloss:0.376792
[1131]	validation_0-logloss:0.376793
[1132]	validation_0-logloss:0.376791
[1133]	validation_0-logloss:0.376797
[1134]	validation_0-logloss:0.376797
[1135]	validation_0-logloss:0.376797
[1136]	validation_0-logloss:0.376797
[1137]	validation_0-logloss:0.376772
[1138]	validation_0-logloss:0.376773
[1139]	validation_0-logloss:0.376773
[1140]	validation_0-logloss:0.376773
[1141]	validation_0-logloss:0.376773
[1142]	validation_0-logloss:0.376751
[1143]	validation_0-logloss:0.37674
[1144]	validation_0-logloss:0.37674
[1145]	validation_0-logloss:0.376747
[1146]	validation_0-logloss:0.376747
[1147]	validation_0-logloss:0.376747
[1148]	validation_0-logloss:0.376747
[1149]	validation_0-logloss:0.376775
[1150]	validation_0-logloss:0.376775
[1151]	validation_0-logloss:0.376752
[1152]	validation_0-logloss:0.376766
[1153]	validation_0-logloss:0.376767
[1154]	validation_0-logloss:0.376766
[1155]	validation_0-logloss:0.376769
[1156]	validation_0-logloss:0.376766
[1157]	validation_0-logloss:0.376771
[1158]	validation_0-logloss:0.376771
[1159]	validation_0-logloss:0.376766
[1160]	validation_0-logloss:0.376761
[1161]	validation_0-logloss:0.376763
[1162]	validation_0-logloss:0.376762
[1163]	validation_0-logloss:0.376762
[1164]	validation_0-logloss:0.376756
[1165]	validation_0-logloss:0.376759
[1166]	validation_0-logloss:0.376758
[1167]	validation_0-logloss:0.376762
[1168]	validation_0-logloss:0.376756
[1169]	validation_0-logloss:0.376756
[1170]	validation_0-logloss:0.376756
[1171]	validation_0-logloss:0.376756
[1172]	validation_0-logloss:0.376756
[1173]	validation_0-logloss:0.376757
[1174]	validation_0-logloss:0.376756
[1175]	validation_0-logloss:0.376781
[1176]	validation_0-logloss:0.376782
[1177]	validation_0-logloss:0.376782
[1178]	validation_0-logloss:0.376787
[1179]	validation_0-logloss:0.376787
[1180]	validation_0-logloss:0.37679
[1181]	validation_0-logloss:0.3768
[1182]	validation_0-logloss:0.376802
[1183]	validation_0-logloss:0.37682
[1184]	validation_0-logloss:0.37681
[1185]	validation_0-logloss:0.376795
[1186]	validation_0-logloss:0.376782
[1187]	validation_0-logloss:0.376785
[1188]	validation_0-logloss:0.376779
[1189]	validation_0-logloss:0.376777
[1190]	validation_0-logloss:0.376778
[1191]	validation_0-logloss:0.376778
[1192]	validation_0-logloss:0.376786
[1193]	validation_0-logloss:0.376793
[1194]	validation_0-logloss:0.37679
[1195]	validation_0-logloss:0.376787
[1196]	validation_0-logloss:0.376815
[1197]	validation_0-logloss:0.376804
[1198]	validation_0-logloss:0.376812
[1199]	validation_0-logloss:0.376815
[1200]	validation_0-logloss:0.376791
[1201]	validation_0-logloss:0.376783
[1202]	validation_0-logloss:0.376772
[1203]	validation_0-logloss:0.376756
[1204]	validation_0-logloss:0.376763
[1205]	validation_0-logloss:0.376751
[1206]	validation_0-logloss:0.376754
[1207]	validation_0-logloss:0.376752
[1208]	validation_0-logloss:0.376752
[1209]	validation_0-logloss:0.376768
[1210]	validation_0-logloss:0.376777
[1211]	validation_0-logloss:0.376768
[1212]	validation_0-logloss:0.376758
[1213]	validation_0-logloss:0.376759
[1214]	validation_0-logloss:0.376803
[1215]	validation_0-logloss:0.376798
[1216]	validation_0-logloss:0.376803
[1217]	validation_0-logloss:0.3768
[1218]	validation_0-logloss:0.376802
[1219]	validation_0-logloss:0.376795
[1220]	validation_0-logloss:0.376782
[1221]	validation_0-logloss:0.376779
[1222]	validation_0-logloss:0.37678
[1223]	validation_0-logloss:0.376776
[1224]	validation_0-logloss:0.376782
[1225]	validation_0-logloss:0.376778
[1226]	validation_0-logloss:0.376783
[1227]	validation_0-logloss:0.376782
[1228]	validation_0-logloss:0.376778
[1229]	validation_0-logloss:0.376778
[1230]	validation_0-logloss:0.376744
[1231]	validation_0-logloss:0.376746
[1232]	validation_0-logloss:0.376755
[1233]	validation_0-logloss:0.376745
[1234]	validation_0-logloss:0.376744
[1235]	validation_0-logloss:0.376744
[1236]	validation_0-logloss:0.376743
[1237]	validation_0-logloss:0.376698
[1238]	validation_0-logloss:0.376699
[1239]	validation_0-logloss:0.376709
[1240]	validation_0-logloss:0.376701
[1241]	validation_0-logloss:0.376698
[1242]	validation_0-logloss:0.376698
[1243]	validation_0-logloss:0.376705
[1244]	validation_0-logloss:0.376706
[1245]	validation_0-logloss:0.376716
[1246]	validation_0-logloss:0.376709
[1247]	validation_0-logloss:0.37671
[1248]	validation_0-logloss:0.376702
[1249]	validation_0-logloss:0.376698
[1250]	validation_0-logloss:0.376698
[1251]	validation_0-logloss:0.3767
[1252]	validation_0-logloss:0.376698
[1253]	validation_0-logloss:0.376698
[1254]	validation_0-logloss:0.376702
[1255]	validation_0-logloss:0.376702
[1256]	validation_0-logloss:0.376709
[1257]	validation_0-logloss:0.376717
[1258]	validation_0-logloss:0.376699
[1259]	validation_0-logloss:0.376691
[1260]	validation_0-logloss:0.376692
[1261]	validation_0-logloss:0.376691
[1262]	validation_0-logloss:0.376691
[1263]	validation_0-logloss:0.376697
[1264]	validation_0-logloss:0.376703
[1265]	validation_0-logloss:0.376716
[1266]	validation_0-logloss:0.376719
[1267]	validation_0-logloss:0.376715
[1268]	validation_0-logloss:0.376706
[1269]	validation_0-logloss:0.376706
[1270]	validation_0-logloss:0.376729
[1271]	validation_0-logloss:0.376738
[1272]	validation_0-logloss:0.376726
[1273]	validation_0-logloss:0.376712
[1274]	validation_0-logloss:0.376718
[1275]	validation_0-logloss:0.376719
[1276]	validation_0-logloss:0.376716
[1277]	validation_0-logloss:0.376713
[1278]	validation_0-logloss:0.376712
[1279]	validation_0-logloss:0.376714
[1280]	validation_0-logloss:0.376712
[1281]	validation_0-logloss:0.376715
[1282]	validation_0-logloss:0.376713
[1283]	validation_0-logloss:0.376719
[1284]	validation_0-logloss:0.376722
[1285]	validation_0-logloss:0.376712
[1286]	validation_0-logloss:0.376711
[1287]	validation_0-logloss:0.376711
[1288]	validation_0-logloss:0.376714
[1289]	validation_0-logloss:0.376715
[1290]	validation_0-logloss:0.376728
[1291]	validation_0-logloss:0.376716
[1292]	validation_0-logloss:0.376715
[1293]	validation_0-logloss:0.376718
[1294]	validation_0-logloss:0.376715
[1295]	validation_0-logloss:0.376693
[1296]	validation_0-logloss:0.376696
[1297]	validation_0-logloss:0.376699
[1298]	validation_0-logloss:0.376689
[1299]	validation_0-logloss:0.376692
[1300]	validation_0-logloss:0.37669
[1301]	validation_0-logloss:0.376688
[1302]	validation_0-logloss:0.376695
[1303]	validation_0-logloss:0.37669
[1304]	validation_0-logloss:0.376683
[1305]	validation_0-logloss:0.376679
[1306]	validation_0-logloss:0.376683
[1307]	validation_0-logloss:0.376687
[1308]	validation_0-logloss:0.376675
[1309]	validation_0-logloss:0.376655
[1310]	validation_0-logloss:0.376656
[1311]	validation_0-logloss:0.376657
[1312]	validation_0-logloss:0.376638
[1313]	validation_0-logloss:0.376642
[1314]	validation_0-logloss:0.376646
[1315]	validation_0-logloss:0.376641
[1316]	validation_0-logloss:0.376583
[1317]	validation_0-logloss:0.376582
[1318]	validation_0-logloss:0.376582
[1319]	validation_0-logloss:0.376582
[1320]	validation_0-logloss:0.376582
[1321]	validation_0-logloss:0.376582
[1322]	validation_0-logloss:0.376582
[1323]	validation_0-logloss:0.376597
[1324]	validation_0-logloss:0.3766
[1325]	validation_0-logloss:0.376596
[1326]	validation_0-logloss:0.376598
[1327]	validation_0-logloss:0.376598
[1328]	validation_0-logloss:0.376601
[1329]	validation_0-logloss:0.376611
[1330]	validation_0-logloss:0.376603
[1331]	validation_0-logloss:0.376618
[1332]	validation_0-logloss:0.376628
[1333]	validation_0-logloss:0.376626
[1334]	validation_0-logloss:0.376632
[1335]	validation_0-logloss:0.376638
[1336]	validation_0-logloss:0.376619
[1337]	validation_0-logloss:0.376621
[1338]	validation_0-logloss:0.376612
[1339]	validation_0-logloss:0.376612
[1340]	validation_0-logloss:0.376603
[1341]	validation_0-logloss:0.376602
[1342]	validation_0-logloss:0.376603
[1343]	validation_0-logloss:0.376602
[1344]	validation_0-logloss:0.376603
[1345]	validation_0-logloss:0.376601
[1346]	validation_0-logloss:0.376606
[1347]	validation_0-logloss:0.376608
[1348]	validation_0-logloss:0.376614
[1349]	validation_0-logloss:0.376605
[1350]	validation_0-logloss:0.376605
[1351]	validation_0-logloss:0.376612
[1352]	validation_0-logloss:0.376596
[1353]	validation_0-logloss:0.3766
[1354]	validation_0-logloss:0.376592
[1355]	validation_0-logloss:0.376601
[1356]	validation_0-logloss:0.376602
[1357]	validation_0-logloss:0.376599
[1358]	validation_0-logloss:0.3766
[1359]	validation_0-logloss:0.376583
[1360]	validation_0-logloss:0.376588
[1361]	validation_0-logloss:0.376581
[1362]	validation_0-logloss:0.376589
[1363]	validation_0-logloss:0.376599
[1364]	validation_0-logloss:0.376546
[1365]	validation_0-logloss:0.376547
[1366]	validation_0-logloss:0.376571
[1367]	validation_0-logloss:0.376573
[1368]	validation_0-logloss:0.376576
[1369]	validation_0-logloss:0.376574
[1370]	validation_0-logloss:0.376576
[1371]	validation_0-logloss:0.376573
[1372]	validation_0-logloss:0.376574
[1373]	validation_0-logloss:0.37657
[1374]	validation_0-logloss:0.376571
[1375]	validation_0-logloss:0.376572
[1376]	validation_0-logloss:0.37657
[1377]	validation_0-logloss:0.37657
[1378]	validation_0-logloss:0.37657
[1379]	validation_0-logloss:0.37657
[1380]	validation_0-logloss:0.37657
[1381]	validation_0-logloss:0.37657
[1382]	validation_0-logloss:0.37657
[1383]	validation_0-logloss:0.37657
[1384]	validation_0-logloss:0.376575
[1385]	validation_0-logloss:0.376579
[1386]	validation_0-logloss:0.376575
[1387]	validation_0-logloss:0.376585
[1388]	validation_0-logloss:0.376578
[1389]	validation_0-logloss:0.37658
[1390]	validation_0-logloss:0.376578
[1391]	validation_0-logloss:0.376554
[1392]	validation_0-logloss:0.376549
[1393]	validation_0-logloss:0.376544
[1394]	validation_0-logloss:0.376542
[1395]	validation_0-logloss:0.376543
[1396]	validation_0-logloss:0.376542
[1397]	validation_0-logloss:0.376542
[1398]	validation_0-logloss:0.376542
[1399]	validation_0-logloss:0.376543
[1400]	validation_0-logloss:0.376553
[1401]	validation_0-logloss:0.376558
[1402]	validation_0-logloss:0.37658
[1403]	validation_0-logloss:0.376585
[1404]	validation_0-logloss:0.376585
[1405]	validation_0-logloss:0.376592
[1406]	validation_0-logloss:0.376563
[1407]	validation_0-logloss:0.376561
[1408]	validation_0-logloss:0.376559
[1409]	validation_0-logloss:0.376568
[1410]	validation_0-logloss:0.376585
[1411]	validation_0-logloss:0.376595
[1412]	validation_0-logloss:0.376557
[1413]	validation_0-logloss:0.376527
[1414]	validation_0-logloss:0.376524
[1415]	validation_0-logloss:0.376522
[1416]	validation_0-logloss:0.376522
[1417]	validation_0-logloss:0.37653
[1418]	validation_0-logloss:0.376529
[1419]	validation_0-logloss:0.376528
[1420]	validation_0-logloss:0.376512
[1421]	validation_0-logloss:0.376512
[1422]	validation_0-logloss:0.376512
[1423]	validation_0-logloss:0.376513
[1424]	validation_0-logloss:0.376451
[1425]	validation_0-logloss:0.37645
[1426]	validation_0-logloss:0.376455
[1427]	validation_0-logloss:0.376456
[1428]	validation_0-logloss:0.37645
[1429]	validation_0-logloss:0.376452
[1430]	validation_0-logloss:0.376452
[1431]	validation_0-logloss:0.376451
[1432]	validation_0-logloss:0.376451
[1433]	validation_0-logloss:0.376452
[1434]	validation_0-logloss:0.376454
[1435]	validation_0-logloss:0.376451
[1436]	validation_0-logloss:0.37645
[1437]	validation_0-logloss:0.376495
[1438]	validation_0-logloss:0.376498
[1439]	validation_0-logloss:0.376503
[1440]	validation_0-logloss:0.376495
[1441]	validation_0-logloss:0.376494
[1442]	validation_0-logloss:0.376496
[1443]	validation_0-logloss:0.376494
[1444]	validation_0-logloss:0.376492
[1445]	validation_0-logloss:0.376482
[1446]	validation_0-logloss:0.376481
[1447]	validation_0-logloss:0.376523
[1448]	validation_0-logloss:0.376521
[1449]	validation_0-logloss:0.376469
[1450]	validation_0-logloss:0.37647
[1451]	validation_0-logloss:0.376475
[1452]	validation_0-logloss:0.37648
[1453]	validation_0-logloss:0.376483
[1454]	validation_0-logloss:0.376473
[1455]	validation_0-logloss:0.376471
[1456]	validation_0-logloss:0.376417
[1457]	validation_0-logloss:0.376416
[1458]	validation_0-logloss:0.376415
[1459]	validation_0-logloss:0.376417
[1460]	validation_0-logloss:0.376415
[1461]	validation_0-logloss:0.376391
[1462]	validation_0-logloss:0.37639
[1463]	validation_0-logloss:0.376418
[1464]	validation_0-logloss:0.376418
[1465]	validation_0-logloss:0.376418
[1466]	validation_0-logloss:0.376418
[1467]	validation_0-logloss:0.376394
[1468]	validation_0-logloss:0.376393
[1469]	validation_0-logloss:0.376385
[1470]	validation_0-logloss:0.376384
[1471]	validation_0-logloss:0.376387
[1472]	validation_0-logloss:0.376384
[1473]	validation_0-logloss:0.376386
[1474]	validation_0-logloss:0.376384
[1475]	validation_0-logloss:0.376384
[1476]	validation_0-logloss:0.37639
[1477]	validation_0-logloss:0.37639
[1478]	validation_0-logloss:0.376388
[1479]	validation_0-logloss:0.376417
[1480]	validation_0-logloss:0.376382
[1481]	validation_0-logloss:0.376439
[1482]	validation_0-logloss:0.376425
[1483]	validation_0-logloss:0.376427
[1484]	validation_0-logloss:0.376422
[1485]	validation_0-logloss:0.37642
[1486]	validation_0-logloss:0.376418
[1487]	validation_0-logloss:0.376418
[1488]	validation_0-logloss:0.376409
[1489]	validation_0-logloss:0.376413
[1490]	validation_0-logloss:0.376408
[1491]	validation_0-logloss:0.376382
[1492]	validation_0-logloss:0.376377
[1493]	validation_0-logloss:0.376378
[1494]	validation_0-logloss:0.376386
[1495]	validation_0-logloss:0.376397
[1496]	validation_0-logloss:0.376371
[1497]	validation_0-logloss:0.376369
[1498]	validation_0-logloss:0.376372
[1499]	validation_0-logloss:0.376374
[1500]	validation_0-logloss:0.376369
[1501]	validation_0-logloss:0.376369
[1502]	validation_0-logloss:0.376379
[1503]	validation_0-logloss:0.376382
[1504]	validation_0-logloss:0.376352
[1505]	validation_0-logloss:0.376335
[1506]	validation_0-logloss:0.376319
[1507]	validation_0-logloss:0.376319
[1508]	validation_0-logloss:0.376315
[1509]	validation_0-logloss:0.376318
[1510]	validation_0-logloss:0.376314
[1511]	validation_0-logloss:0.376293
[1512]	validation_0-logloss:0.376291
[1513]	validation_0-logloss:0.37631
[1514]	validation_0-logloss:0.37631
[1515]	validation_0-logloss:0.376319
[1516]	validation_0-logloss:0.376318
[1517]	validation_0-logloss:0.37632
[1518]	validation_0-logloss:0.376327
[1519]	validation_0-logloss:0.376332
[1520]	validation_0-logloss:0.376312
[1521]	validation_0-logloss:0.376309
[1522]	validation_0-logloss:0.376316
[1523]	validation_0-logloss:0.376314
[1524]	validation_0-logloss:0.376312
[1525]	validation_0-logloss:0.376318
[1526]	validation_0-logloss:0.376309
[1527]	validation_0-logloss:0.376292
[1528]	validation_0-logloss:0.376292
[1529]	validation_0-logloss:0.376294
[1530]	validation_0-logloss:0.376282
[1531]	validation_0-logloss:0.376236
[1532]	validation_0-logloss:0.376235
[1533]	validation_0-logloss:0.376202
[1534]	validation_0-logloss:0.376205
[1535]	validation_0-logloss:0.376205
[1536]	validation_0-logloss:0.376218
[1537]	validation_0-logloss:0.376221
[1538]	validation_0-logloss:0.376218
[1539]	validation_0-logloss:0.376211
[1540]	validation_0-logloss:0.376205
[1541]	validation_0-logloss:0.376209
[1542]	validation_0-logloss:0.37621
[1543]	validation_0-logloss:0.376211
[1544]	validation_0-logloss:0.376207
[1545]	validation_0-logloss:0.376211
[1546]	validation_0-logloss:0.376212
[1547]	validation_0-logloss:0.376214
[1548]	validation_0-logloss:0.376208
[1549]	validation_0-logloss:0.376206
[1550]	validation_0-logloss:0.376207
[1551]	validation_0-logloss:0.376223
[1552]	validation_0-logloss:0.376218
[1553]	validation_0-logloss:0.376205
[1554]	validation_0-logloss:0.376209
[1555]	validation_0-logloss:0.376209
[1556]	validation_0-logloss:0.376205
[1557]	validation_0-logloss:0.376205
[1558]	validation_0-logloss:0.376204
[1559]	validation_0-logloss:0.376205
[1560]	validation_0-logloss:0.376205
[1561]	validation_0-logloss:0.376219
[1562]	validation_0-logloss:0.376226
[1563]	validation_0-logloss:0.376227
[1564]	validation_0-logloss:0.37624
[1565]	validation_0-logloss:0.376237
[1566]	validation_0-logloss:0.376243
[1567]	validation_0-logloss:0.376244
[1568]	validation_0-logloss:0.376227
[1569]	validation_0-logloss:0.376228
[1570]	validation_0-logloss:0.376223
[1571]	validation_0-logloss:0.37622
[1572]	validation_0-logloss:0.376223
[1573]	validation_0-logloss:0.376222
[1574]	validation_0-logloss:0.376222
[1575]	validation_0-logloss:0.376216
[1576]	validation_0-logloss:0.376217
[1577]	validation_0-logloss:0.376217
[1578]	validation_0-logloss:0.376219
[1579]	validation_0-logloss:0.376217
[1580]	validation_0-logloss:0.376218
[1581]	validation_0-logloss:0.376217
[1582]	validation_0-logloss:0.376222
[1583]	validation_0-logloss:0.376188
[1584]	validation_0-logloss:0.37619
[1585]	validation_0-logloss:0.376163
[1586]	validation_0-logloss:0.376165
[1587]	validation_0-logloss:0.376161
[1588]	validation_0-logloss:0.376164
[1589]	validation_0-logloss:0.376166
[1590]	validation_0-logloss:0.376171
[1591]	validation_0-logloss:0.376159
[1592]	validation_0-logloss:0.376177
[1593]	validation_0-logloss:0.376179
[1594]	validation_0-logloss:0.376182
[1595]	validation_0-logloss:0.376192
[1596]	validation_0-logloss:0.376177
[1597]	validation_0-logloss:0.376173
[1598]	validation_0-logloss:0.376173
[1599]	validation_0-logloss:0.376173
[1600]	validation_0-logloss:0.376173
[1601]	validation_0-logloss:0.376175
[1602]	validation_0-logloss:0.376174
[1603]	validation_0-logloss:0.376173
[1604]	validation_0-logloss:0.376176
[1605]	validation_0-logloss:0.37618
[1606]	validation_0-logloss:0.376173
[1607]	validation_0-logloss:0.376173
[1608]	validation_0-logloss:0.376177
[1609]	validation_0-logloss:0.376186
[1610]	validation_0-logloss:0.376185
[1611]	validation_0-logloss:0.376198
[1612]	validation_0-logloss:0.376168
[1613]	validation_0-logloss:0.376162
[1614]	validation_0-logloss:0.376161
[1615]	validation_0-logloss:0.376159
[1616]	validation_0-logloss:0.376159
[1617]	validation_0-logloss:0.376163
[1618]	validation_0-logloss:0.376159
[1619]	validation_0-logloss:0.376162
[1620]	validation_0-logloss:0.376161
[1621]	validation_0-logloss:0.376162
[1622]	validation_0-logloss:0.376155
[1623]	validation_0-logloss:0.376152
[1624]	validation_0-logloss:0.376165
[1625]	validation_0-logloss:0.37616
[1626]	validation_0-logloss:0.376165
[1627]	validation_0-logloss:0.376163
[1628]	validation_0-logloss:0.37616
[1629]	validation_0-logloss:0.37616
[1630]	validation_0-logloss:0.376156
[1631]	validation_0-logloss:0.376156
[1632]	validation_0-logloss:0.376156
[1633]	validation_0-logloss:0.376159
[1634]	validation_0-logloss:0.376159
[1635]	validation_0-logloss:0.376157
[1636]	validation_0-logloss:0.376159
[1637]	validation_0-logloss:0.376156
[1638]	validation_0-logloss:0.376157
[1639]	validation_0-logloss:0.376097
[1640]	validation_0-logloss:0.376095
[1641]	validation_0-logloss:0.376095
[1642]	validation_0-logloss:0.376104
[1643]	validation_0-logloss:0.376109
[1644]	validation_0-logloss:0.37611
[1645]	validation_0-logloss:0.376095
[1646]	validation_0-logloss:0.376096
[1647]	validation_0-logloss:0.376099
[1648]	validation_0-logloss:0.376096
[1649]	validation_0-logloss:0.37606
[1650]	validation_0-logloss:0.376061
[1651]	validation_0-logloss:0.376061
[1652]	validation_0-logloss:0.376057
[1653]	validation_0-logloss:0.37606
[1654]	validation_0-logloss:0.376059
[1655]	validation_0-logloss:0.37606
[1656]	validation_0-logloss:0.376059
[1657]	validation_0-logloss:0.37606
[1658]	validation_0-logloss:0.37606
[1659]	validation_0-logloss:0.376062
[1660]	validation_0-logloss:0.376067
[1661]	validation_0-logloss:0.376076
[1662]	validation_0-logloss:0.376086
[1663]	validation_0-logloss:0.376086
[1664]	validation_0-logloss:0.376075
[1665]	validation_0-logloss:0.376113
[1666]	validation_0-logloss:0.376162
[1667]	validation_0-logloss:0.376162
[1668]	validation_0-logloss:0.376163
[1669]	validation_0-logloss:0.376144
[1670]	validation_0-logloss:0.37613
[1671]	validation_0-logloss:0.37613
[1672]	validation_0-logloss:0.376166
[1673]	validation_0-logloss:0.376149
[1674]	validation_0-logloss:0.376143
[1675]	validation_0-logloss:0.376148
[1676]	validation_0-logloss:0.37616
[1677]	validation_0-logloss:0.376164
[1678]	validation_0-logloss:0.376138
[1679]	validation_0-logloss:0.376142
[1680]	validation_0-logloss:0.376125
[1681]	validation_0-logloss:0.376119
[1682]	validation_0-logloss:0.376118
[1683]	validation_0-logloss:0.376122
[1684]	validation_0-logloss:0.37612
[1685]	validation_0-logloss:0.376121
[1686]	validation_0-logloss:0.376115
[1687]	validation_0-logloss:0.376116
[1688]	validation_0-logloss:0.376118
[1689]	validation_0-logloss:0.376133
[1690]	validation_0-logloss:0.376143
[1691]	validation_0-logloss:0.376122
[1692]	validation_0-logloss:0.376116
[1693]	validation_0-logloss:0.376127
[1694]	validation_0-logloss:0.376129
[1695]	validation_0-logloss:0.376127
[1696]	validation_0-logloss:0.376128
[1697]	validation_0-logloss:0.376082
[1698]	validation_0-logloss:0.376082
[1699]	validation_0-logloss:0.376081
[1700]	validation_0-logloss:0.376081
[1701]	validation_0-logloss:0.376086
[1702]	validation_0-logloss:0.376083
[1703]	validation_0-logloss:0.376082
[1704]	validation_0-logloss:0.376082
[1705]	validation_0-logloss:0.376086
[1706]	validation_0-logloss:0.376099
[1707]	validation_0-logloss:0.376104
[1708]	validation_0-logloss:0.376118
[1709]	validation_0-logloss:0.376101
[1710]	validation_0-logloss:0.376097
[1711]	validation_0-logloss:0.376094
[1712]	validation_0-logloss:0.376086
[1713]	validation_0-logloss:0.376089
[1714]	validation_0-logloss:0.376086
[1715]	validation_0-logloss:0.376103
[1716]	validation_0-logloss:0.376079
[1717]	validation_0-logloss:0.37608
[1718]	validation_0-logloss:0.376077
[1719]	validation_0-logloss:0.376076
[1720]	validation_0-logloss:0.376072
[1721]	validation_0-logloss:0.376085
[1722]	validation_0-logloss:0.37608
[1723]	validation_0-logloss:0.376074
[1724]	validation_0-logloss:0.376073
[1725]	validation_0-logloss:0.376074
[1726]	validation_0-logloss:0.37607
[1727]	validation_0-logloss:0.376049
[1728]	validation_0-logloss:0.37608
[1729]	validation_0-logloss:0.37608
[1730]	validation_0-logloss:0.376083
[1731]	validation_0-logloss:0.376056
[1732]	validation_0-logloss:0.37606
[1733]	validation_0-logloss:0.376053
[1734]	validation_0-logloss:0.376048
[1735]	validation_0-logloss:0.37605
[1736]	validation_0-logloss:0.376048
[1737]	validation_0-logloss:0.376049
[1738]	validation_0-logloss:0.376049
[1739]	validation_0-logloss:0.376052
[1740]	validation_0-logloss:0.376056
[1741]	validation_0-logloss:0.376058
[1742]	validation_0-logloss:0.376055
[1743]	validation_0-logloss:0.376058
[1744]	validation_0-logloss:0.376058
[1745]	validation_0-logloss:0.376055
[1746]	validation_0-logloss:0.376056
[1747]	validation_0-logloss:0.376057
[1748]	validation_0-logloss:0.376056
[1749]	validation_0-logloss:0.376058
[1750]	validation_0-logloss:0.376058
[1751]	validation_0-logloss:0.376061
[1752]	validation_0-logloss:0.376059
[1753]	validation_0-logloss:0.376075
[1754]	validation_0-logloss:0.376073
[1755]	validation_0-logloss:0.376083
[1756]	validation_0-logloss:0.376083
[1757]	validation_0-logloss:0.376051
[1758]	validation_0-logloss:0.37605
[1759]	validation_0-logloss:0.37605
[1760]	validation_0-logloss:0.376072
[1761]	validation_0-logloss:0.37608
[1762]	validation_0-logloss:0.376074
[1763]	validation_0-logloss:0.376086
[1764]	validation_0-logloss:0.376083
[1765]	validation_0-logloss:0.37608
[1766]	validation_0-logloss:0.376059
[1767]	validation_0-logloss:0.37603
[1768]	validation_0-logloss:0.37603
[1769]	validation_0-logloss:0.376029
[1770]	validation_0-logloss:0.37603
[1771]	validation_0-logloss:0.37603
[1772]	validation_0-logloss:0.37603
[1773]	validation_0-logloss:0.37603
[1774]	validation_0-logloss:0.376034
[1775]	validation_0-logloss:0.376046
[1776]	validation_0-logloss:0.376039
[1777]	validation_0-logloss:0.376037
[1778]	validation_0-logloss:0.376063
[1779]	validation_0-logloss:0.376076
[1780]	validation_0-logloss:0.376073
[1781]	validation_0-logloss:0.376066
[1782]	validation_0-logloss:0.376081
[1783]	validation_0-logloss:0.37608
[1784]	validation_0-logloss:0.376079
[1785]	validation_0-logloss:0.376082
[1786]	validation_0-logloss:0.376083
[1787]	validation_0-logloss:0.376115
[1788]	validation_0-logloss:0.376118
[1789]	validation_0-logloss:0.376109
[1790]	validation_0-logloss:0.376111
[1791]	validation_0-logloss:0.376114
[1792]	validation_0-logloss:0.376115
[1793]	validation_0-logloss:0.376117
[1794]	validation_0-logloss:0.37609
[1795]	validation_0-logloss:0.376084
[1796]	validation_0-logloss:0.376081
[1797]	validation_0-logloss:0.376079
[1798]	validation_0-logloss:0.376079
[1799]	validation_0-logloss:0.376084
[1800]	validation_0-logloss:0.376081
[1801]	validation_0-logloss:0.376079
[1802]	validation_0-logloss:0.376096
[1803]	validation_0-logloss:0.376102
[1804]	validation_0-logloss:0.3761
[1805]	validation_0-logloss:0.376107
[1806]	validation_0-logloss:0.376094
[1807]	validation_0-logloss:0.37609
[1808]	validation_0-logloss:0.376091
[1809]	validation_0-logloss:0.376092
[1810]	validation_0-logloss:0.376095
[1811]	validation_0-logloss:0.376098
[1812]	validation_0-logloss:0.376094
[1813]	validation_0-logloss:0.37609
[1814]	validation_0-logloss:0.37609
[1815]	validation_0-logloss:0.376079
[1816]	validation_0-logloss:0.376072
[1817]	validation_0-logloss:0.376102
[1818]	validation_0-logloss:0.376122
[1819]	validation_0-logloss:0.376122
[1820]	validation_0-logloss:0.376095
[1821]	validation_0-logloss:0.376078
[1822]	validation_0-logloss:0.376092
[1823]	validation_0-logloss:0.376104
[1824]	validation_0-logloss:0.376087
[1825]	validation_0-logloss:0.37609
[1826]	validation_0-logloss:0.376095
[1827]	validation_0-logloss:0.376044
[1828]	validation_0-logloss:0.376036
[1829]	validation_0-logloss:0.376025
[1830]	validation_0-logloss:0.376027
[1831]	validation_0-logloss:0.376039
[1832]	validation_0-logloss:0.376037
[1833]	validation_0-logloss:0.376038
[1834]	validation_0-logloss:0.376037
[1835]	validation_0-logloss:0.376037
[1836]	validation_0-logloss:0.376037
[1837]	validation_0-logloss:0.376038
[1838]	validation_0-logloss:0.37604
[1839]	validation_0-logloss:0.376039
[1840]	validation_0-logloss:0.376048
[1841]	validation_0-logloss:0.3761
[1842]	validation_0-logloss:0.3761
[1843]	validation_0-logloss:0.3761
[1844]	validation_0-logloss:0.376107
[1845]	validation_0-logloss:0.376065
[1846]	validation_0-logloss:0.376058
[1847]	validation_0-logloss:0.376056
[1848]	validation_0-logloss:0.376048
[1849]	validation_0-logloss:0.376054
[1850]	validation_0-logloss:0.376056
[1851]	validation_0-logloss:0.376048
[1852]	validation_0-logloss:0.376049
[1853]	validation_0-logloss:0.376049
[1854]	validation_0-logloss:0.376057
[1855]	validation_0-logloss:0.376063
[1856]	validation_0-logloss:0.376036
[1857]	validation_0-logloss:0.376033
[1858]	validation_0-logloss:0.376034
[1859]	validation_0-logloss:0.376032
[1860]	validation_0-logloss:0.376016
[1861]	validation_0-logloss:0.376016
[1862]	validation_0-logloss:0.376022
[1863]	validation_0-logloss:0.376023
[1864]	validation_0-logloss:0.376024
[1865]	validation_0-logloss:0.376043
[1866]	validation_0-logloss:0.376053
[1867]	validation_0-logloss:0.376049
[1868]	validation_0-logloss:0.376031
[1869]	validation_0-logloss:0.37604
[1870]	validation_0-logloss:0.376036
[1871]	validation_0-logloss:0.376035
[1872]	validation_0-logloss:0.376036
[1873]	validation_0-logloss:0.376039
[1874]	validation_0-logloss:0.376038
[1875]	validation_0-logloss:0.376037
[1876]	validation_0-logloss:0.376035
[1877]	validation_0-logloss:0.376037
[1878]	validation_0-logloss:0.376038
[1879]	validation_0-logloss:0.376036
[1880]	validation_0-logloss:0.376039
[1881]	validation_0-logloss:0.37604
[1882]	validation_0-logloss:0.376041
[1883]	validation_0-logloss:0.376035
[1884]	validation_0-logloss:0.376037
[1885]	validation_0-logloss:0.376036
[1886]	validation_0-logloss:0.376043
[1887]	validation_0-logloss:0.376042
[1888]	validation_0-logloss:0.376045
[1889]	validation_0-logloss:0.376072
[1890]	validation_0-logloss:0.376056
[1891]	validation_0-logloss:0.376047
[1892]	validation_0-logloss:0.376053
[1893]	validation_0-logloss:0.376043
[1894]	validation_0-logloss:0.376052
[1895]	validation_0-logloss:0.376046
[1896]	validation_0-logloss:0.376046
[1897]	validation_0-logloss:0.376045
[1898]	validation_0-logloss:0.376043
[1899]	validation_0-logloss:0.376061
[1900]	validation_0-logloss:0.376052
[1901]	validation_0-logloss:0.376055
[1902]	validation_0-logloss:0.376052
[1903]	validation_0-logloss:0.376027
[1904]	validation_0-logloss:0.376046
[1905]	validation_0-logloss:0.376042
[1906]	validation_0-logloss:0.376053
[1907]	validation_0-logloss:0.376037
[1908]	validation_0-logloss:0.376035
[1909]	validation_0-logloss:0.37606
[1910]	validation_0-logloss:0.376056
[1911]	validation_0-logloss:0.376052
[1912]	validation_0-logloss:0.376052
[1913]	validation_0-logloss:0.376052
[1914]	validation_0-logloss:0.376054
[1915]	validation_0-logloss:0.376052
[1916]	validation_0-logloss:0.376052
[1917]	validation_0-logloss:0.376052
[1918]	validation_0-logloss:0.376053
[1919]	validation_0-logloss:0.376052
[1920]	validation_0-logloss:0.376052
[1921]	validation_0-logloss:0.376052
[1922]	validation_0-logloss:0.376052
[1923]	validation_0-logloss:0.376057
[1924]	validation_0-logloss:0.37606
[1925]	validation_0-logloss:0.376049
[1926]	validation_0-logloss:0.37605
[1927]	validation_0-logloss:0.37605
[1928]	validation_0-logloss:0.37605
[1929]	validation_0-logloss:0.376049
[1930]	validation_0-logloss:0.37605
[1931]	validation_0-logloss:0.37607
[1932]	validation_0-logloss:0.376035
[1933]	validation_0-logloss:0.376019
[1934]	validation_0-logloss:0.376024
[1935]	validation_0-logloss:0.376017
[1936]	validation_0-logloss:0.376011
[1937]	validation_0-logloss:0.376018
[1938]	validation_0-logloss:0.376024
[1939]	validation_0-logloss:0.376024
[1940]	validation_0-logloss:0.376018
[1941]	validation_0-logloss:0.376017
[1942]	validation_0-logloss:0.37602
[1943]	validation_0-logloss:0.376017
[1944]	validation_0-logloss:0.376016
[1945]	validation_0-logloss:0.376018
[1946]	validation_0-logloss:0.376023
[1947]	validation_0-logloss:0.376019
[1948]	validation_0-logloss:0.376021
[1949]	validation_0-logloss:0.376016
[1950]	validation_0-logloss:0.376019
[1951]	validation_0-logloss:0.376018
[1952]	validation_0-logloss:0.376012
[1953]	validation_0-logloss:0.376017
[1954]	validation_0-logloss:0.376028
[1955]	validation_0-logloss:0.37602
[1956]	validation_0-logloss:0.376022
[1957]	validation_0-logloss:0.37602
[1958]	validation_0-logloss:0.37602
[1959]	validation_0-logloss:0.376019
[1960]	validation_0-logloss:0.376
[1961]	validation_0-logloss:0.375999
[1962]	validation_0-logloss:0.376056
[1963]	validation_0-logloss:0.37607
[1964]	validation_0-logloss:0.376095
[1965]	validation_0-logloss:0.376088
[1966]	validation_0-logloss:0.376069
[1967]	validation_0-logloss:0.376069
[1968]	validation_0-logloss:0.376069
[1969]	validation_0-logloss:0.376069
[1970]	validation_0-logloss:0.37607
[1971]	validation_0-logloss:0.376083
[1972]	validation_0-logloss:0.376074
[1973]	validation_0-logloss:0.376101
[1974]	validation_0-logloss:0.3761
[1975]	validation_0-logloss:0.376101
[1976]	validation_0-logloss:0.376101
[1977]	validation_0-logloss:0.3761
[1978]	validation_0-logloss:0.376033
[1979]	validation_0-logloss:0.376038
[1980]	validation_0-logloss:0.376038
[1981]	validation_0-logloss:0.376037
[1982]	validation_0-logloss:0.376039
[1983]	validation_0-logloss:0.376034
[1984]	validation_0-logloss:0.376033
[1985]	validation_0-logloss:0.376036
[1986]	validation_0-logloss:0.376036
[1987]	validation_0-logloss:0.376038
[1988]	validation_0-logloss:0.376045
[1989]	validation_0-logloss:0.376036
[1990]	validation_0-logloss:0.376036
[1991]	validation_0-logloss:0.376056
[1992]	validation_0-logloss:0.376064
[1993]	validation_0-logloss:0.376055
[1994]	validation_0-logloss:0.376057
[1995]	validation_0-logloss:0.376043
[1996]	validation_0-logloss:0.376042
[1997]	validation_0-logloss:0.37604
[1998]	validation_0-logloss:0.375992
[1999]	validation_0-logloss:0.375993

``````
``````

In [39]:

import matplotlib.pyplot as plt
from sklearn.learning_curve import learning_curve

param_dist = {
"n_estimators": 2000,
"max_depth": 4,
"min_child_weight": 2,
#gamma=1,
"gamma":0.9,
"subsample":0.8,
"colsample_bytree":0.8,
"objective": 'binary:logistic',
"scale_pos_weight":1
}

clf = xgb.XGBClassifier(**param_dist)

train_sizes, train_scores, test_scores =\
learning_curve(estimator=clf,
X=x_train,
y=y_train,
train_sizes=np.linspace(0.1, 1.0, 10),
cv=10,
n_jobs=1)

train_mean = np.mean(train_scores, axis=1)
train_std = np.std(train_scores, axis=1)
test_mean = np.mean(test_scores, axis=1)
test_std = np.std(test_scores, axis=1)

plt.plot(train_sizes, train_mean,
color='blue', marker='o',
markersize=8, label='training accuracy')

plt.fill_between(train_sizes,
train_mean + train_std,
train_mean - train_std,
alpha=0.15, color='blue')

plt.plot(train_sizes, test_mean,
color='green', linestyle='--',
marker='s', markersize=8,
label='validation accuracy')

plt.fill_between(train_sizes,
test_mean + test_std,
test_mean - test_std,
alpha=0.15, color='green')

plt.grid()
plt.xlabel('Number of training samples')
plt.ylabel('Accuracy')
plt.legend(loc='lower right')
plt.ylim([0.8, 1.0])
plt.tight_layout()
# plt.savefig('./figures/learning_curve.png', dpi=300)
plt.show()

#XGBoost eval example

# clf.fit(x_train, y_train,
#         eval_set=[(x_train, y_train)],
#         eval_metric='logloss',
#         verbose=False)

# # Load evals result by calling the evals_result() function
# evals_result = clf.evals_result()

# print('Access logloss metric directly from validation_0:')
# print(evals_result['validation_0']['logloss'])

# print('')
# print('Access metrics through a loop:')
# for e_name, e_mtrs in evals_result.items():
#     print('- {}'.format(e_name))
#     for e_mtr_name, e_mtr_vals in e_mtrs.items():
#         print('   - {}'.format(e_mtr_name))
#         print('      - {}'.format(e_mtr_vals))

# print('')
# print('Access complete dict:')
#print(evals_result['validation_0']['logloss'][-1])

``````
``````

``````
``````

In [31]:

xgb_model = xgb.XGBClassifier()
clf = GridSearchCV(xgb_model,
{'max_depth': [3,4,5],
'n_estimators': [2000],
'gamma':[0.8,0.9,1],
"min_child_weight": [2,3],
"subsample":[0.8,0.9],
'colsample_bytree':[0.8],
"scale_pos_weight":[1]}, verbose=1)
clf.fit(x_train,y_train)

print('*' * 30)
print(clf.best_score_)
print('*' * 30)
print(clf.best_params_)

``````
``````

Fitting 3 folds for each of 36 candidates, totalling 108 fits

[Parallel(n_jobs=1)]: Done 108 out of 108 | elapsed:  1.4min finished

******************************
0.8653198653198653
******************************
{'colsample_bytree': 0.8, 'gamma': 1, 'max_depth': 3, 'min_child_weight': 2, 'n_estimators': 2000, 'scale_pos_weight': 1, 'subsample': 0.9}

``````

Just a quick run down of the XGBoost parameters used in the model:

max_depth : How deep you want to grow your tree. Beware if set to too high a number might run the risk of overfitting.

gamma : minimum loss reduction required to make a further partition on a leaf node of the tree. The larger, the more conservative the algorithm will be.

eta : step size shrinkage used in each boosting step to prevent overfitting

Producing the Submission file

Finally having trained and fit all our first-level and second-level models, we can now output the predictions into the proper format for submission to the Titanic competition as follows:

``````

In [33]:

# Generate Submission File
StackingSubmission = pd.DataFrame({ 'PassengerId': PassengerId,
'Survived': predictions })
StackingSubmission.to_csv("StackingSubmission.csv", index=False)

``````

Steps for Further Improvement

As a closing remark it must be noted that the steps taken above just show a very simple way of producing an ensemble stacker. You hear of ensembles created at the highest level of Kaggle competitions which involves monstrous combinations of stacked classifiers as well as levels of stacking which go to more than 2 levels.

Some additional steps that may be taken to improve one's score could be:

1. Implementing a good cross-validation strategy in training the models to find optimal parameter values
2. Introduce a greater variety of base models for learning. The more uncorrelated the results, the better the final score.

Conclusion

I have this notebook has been helpful somewhat in introducing a working script for stacking learning models. Again credit must be extended to Faron and Sina.

For other excellent material on stacking or ensembling in general, refer to the de-facto Must read article on the website MLWave: Kaggle Ensembling Guide.

Till next time, Peace Out