#MachineLearning #SupervisedLearning #Classification

By Billy Gustave

Glasses Classification

Goal :

  • Create a ML model to classify glasses, classify by 'Type'
  • Data: glass.csv
  • Comparing different models and including boosting ones
  • Gridsearch and Confusion Matrix

Data Cleaning and Exploration

In [1]:
import numpy as np, pandas as pd, matplotlib.pyplot as plt, seaborn as sns
%matplotlib inline
from matplotlib.pylab import rcParams
rcParams['figure.figsize'] = 16, 14
In [2]:
df = pd.read_csv('glass.csv')
df.shape
Out[2]:
(214, 10)
In [3]:
df.head()
Out[3]:
RI Na Mg Al Si K Ca Ba Fe Type
0 1.52101 13.64 4.49 1.10 71.78 0.06 8.75 0.0 0.0 1
1 1.51761 13.89 3.60 1.36 72.73 0.48 7.83 0.0 0.0 1
2 1.51618 13.53 3.55 1.54 72.99 0.39 7.78 0.0 0.0 1
3 1.51766 13.21 3.69 1.29 72.61 0.57 8.22 0.0 0.0 1
4 1.51742 13.27 3.62 1.24 73.08 0.55 8.07 0.0 0.0 1
In [4]:
df.describe()
Out[4]:
RI Na Mg Al Si K Ca Ba Fe Type
count 214.000000 214.000000 214.000000 214.000000 214.000000 214.000000 214.000000 214.000000 214.000000 214.000000
mean 1.518365 13.407850 2.684533 1.444907 72.650935 0.497056 8.956963 0.175047 0.057009 2.780374
std 0.003037 0.816604 1.442408 0.499270 0.774546 0.652192 1.423153 0.497219 0.097439 2.103739
min 1.511150 10.730000 0.000000 0.290000 69.810000 0.000000 5.430000 0.000000 0.000000 1.000000
25% 1.516523 12.907500 2.115000 1.190000 72.280000 0.122500 8.240000 0.000000 0.000000 1.000000
50% 1.517680 13.300000 3.480000 1.360000 72.790000 0.555000 8.600000 0.000000 0.000000 2.000000
75% 1.519157 13.825000 3.600000 1.630000 73.087500 0.610000 9.172500 0.000000 0.100000 3.000000
max 1.533930 17.380000 4.490000 3.500000 75.410000 6.210000 16.190000 3.150000 0.510000 7.000000
In [5]:
df.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 214 entries, 0 to 213
Data columns (total 10 columns):
 #   Column  Non-Null Count  Dtype  
---  ------  --------------  -----  
 0   RI      214 non-null    float64
 1   Na      214 non-null    float64
 2   Mg      214 non-null    float64
 3   Al      214 non-null    float64
 4   Si      214 non-null    float64
 5   K       214 non-null    float64
 6   Ca      214 non-null    float64
 7   Ba      214 non-null    float64
 8   Fe      214 non-null    float64
 9   Type    214 non-null    int64  
dtypes: float64(9), int64(1)
memory usage: 16.8 KB
In [6]:
fig, ax = plt.subplots(figsize=(8,7))
df.groupby('Type').Type.count().plot(kind='bar', title='Glasse types distribution')
Out[6]:
<matplotlib.axes._subplots.AxesSubplot at 0x22c71250bc8>
In [7]:
#number of classes
num_class = len(df.Type.unique())

Missing values

In [8]:
# checking the percentage of missing values in each variable
df.isnull().sum()/len(df)*100
Out[8]:
RI      0.0
Na      0.0
Mg      0.0
Al      0.0
Si      0.0
K       0.0
Ca      0.0
Ba      0.0
Fe      0.0
Type    0.0
dtype: float64
In [9]:
# Features and Target
X = df.drop(['Type'], axis=1)
# Some models only use target starting from 0 to classes-1, must map even if target is already numbers.
y = df.Type.map({1: 0, 2: 1, 3: 2, 5: 3, 6: 4, 7: 5})
In [10]:
from sklearn.model_selection import train_test_split
# random_state guarantees the same output everytime the program is run
x_train, x_test, y_train, y_test = train_test_split(X,y, test_size = .2, random_state=17)

No missing values

Unique values

In [11]:
from sklearn.feature_selection import VarianceThreshold
# zero variance (unique values)
constant_filter = VarianceThreshold(threshold=0)
constant_filter.fit(x_train)
columns_to_remove = [name for name in x_train.columns if name not in x_train.columns[constant_filter.get_support()]]
print('Unique features: ', columns_to_remove)
Unique features:  []

No unique values

Highly correlated values

Threshold: 75%

In [12]:
# Correlation matrix for all independent vars
corrMatrix = x_train.corr()
allVars = corrMatrix.keys()

absCorrWithDep = []
for var in allVars:
    absCorrWithDep.append(abs(y.corr(x_train[var])))
# threshold seeting
corrTol = 0.75

# for each column in the corr matrix
for col in corrMatrix:
    
    if col in corrMatrix.keys():
        thisCol = []
        thisVars = []
        temp = corrMatrix[col]
        
        # Store the corr with the dep var for fields that are highly correlated with each other
        for i in range(len(corrMatrix)):
            
            if abs(corrMatrix[col][i]) == 1.0 and col != corrMatrix.keys()[i]:
                thisCorr = 0
            else:
                thisCorr = (1 if abs(corrMatrix[col][i]) > corrTol else -1) * abs(temp[corrMatrix.keys()[i]])
            thisCol.append(thisCorr)
            thisVars.append(corrMatrix.keys()[i])
        
        mask = np.ones(len(thisCol), dtype = bool) # Initialize the mask
        
        ctDelCol = 0 # To keep track of the number of columns deleted
        
        for n, j in enumerate(thisCol):
            # Delete if (a) a var is correlated withh others and do not ave the best corr with dep,
            # or (b) completely corr with the 'col'
            mask[n] = not (j != max(thisCol) and j >= 0)
            
            if j != max(thisCol) and j >= 0:
                # Delete the column from the corr matrix
                corrMatrix.pop('%s' %thisVars[n])
                ctDelCol += 1
                
        # Delete the corresponding row(s) from the corr matrix
        corrMatrix = corrMatrix[mask]
In [13]:
columns_to_keep = corrMatrix.columns
print(columns_to_keep)
len(columns_to_keep)
Index(['RI', 'Na', 'Mg', 'Al', 'Si', 'K', 'Ba', 'Fe'], dtype='object')
Out[13]:
8
In [14]:
x_train_clean = x_train[columns_to_keep]
fig, ax = plt.subplots(figsize=(16,14))
sns.heatmap(x_train_clean.corr(), cmap='Reds',annot=True, linewidths=.5, ax=ax)
Out[14]:
<matplotlib.axes._subplots.AxesSubplot at 0x22c7196a908>

Keeping 8 feature below the 75% correlation threshold

Feature importance

In [15]:
from sklearn.ensemble import RandomForestClassifier
rfc = RandomForestClassifier()
rfc.fit(x_train_clean, y_train)
features = x_train_clean.columns
importances = rfc.feature_importances_
indices = np.argsort(importances)
plt.title('Feature Importances')
plt.barh(range(len(indices)), importances[indices], color='b', align='center')
plt.yticks(range(len(indices)), [features[i] for i in indices])
plt.xlabel('Relative Importance')
plt.show()

Keeping all features

Recursive Feature Selection using RFECV with XGBoost model

In [16]:
from sklearn.model_selection import cross_val_score, KFold
kfold = KFold(n_splits=num_class, random_state=17, shuffle=True)
In [17]:
from xgboost import XGBClassifier
from sklearn.feature_selection import RFECV
objective='multi:softmax'
model = XGBClassifier(objective=objective, num_class=num_class)
rfecv = RFECV(estimator=model, cv=kfold, scoring='accuracy')
rfecv.fit(x_train_clean, y_train)
Out[17]:
RFECV(cv=KFold(n_splits=6, random_state=17, shuffle=True),
      estimator=XGBClassifier(base_score=0.5, booster='gbtree',
                              colsample_bylevel=1, colsample_bynode=1,
                              colsample_bytree=1, gamma=0, learning_rate=0.1,
                              max_delta_step=0, max_depth=3, min_child_weight=1,
                              missing=None, n_estimators=100, n_jobs=1,
                              nthread=None, num_class=6,
                              objective='multi:softmax', random_state=0,
                              reg_alpha=0, reg_lambda=1, scale_pos_weight=1,
                              seed=None, silent=None, subsample=1,
                              verbosity=1),
      min_features_to_select=1, n_jobs=None, scoring='accuracy', step=1,
      verbose=0)
In [18]:
plt.figure()
plt.title('XGB CV score vs No of Features')
plt.xlabel("Number of features selected")
plt.ylabel("Cross validation score (nb of correct classifications)")
plt.plot(range(1, len(rfecv.grid_scores_) + 1), rfecv.grid_scores_)
plt.grid()
plt.show()
In [19]:
feature_importance = list(zip(x_train_clean.columns, rfecv.support_))
new_features = []
for key,value in enumerate(feature_importance):
    if(value[1]) == True:
        new_features.append(value[0])
        
print(new_features)
x_train_best = x_train_clean[new_features]
x_test_best = x_test[new_features]
['RI', 'Na', 'Mg', 'Al', 'Si', 'K', 'Ba', 'Fe']

Keeping all features
Note :
**Recursive Feature selection** works for data with low feature count as it is an exhaustive process which using a lot of computing power.

Model Selection

Comapring results from 9 different models

In [20]:
# model libraries
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import GradientBoostingClassifier, AdaBoostClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from xgboost import XGBRFClassifier
In [21]:
models = []
models.append(('LR',LogisticRegression(solver = 'newton-cg')))
models.append(('NB',GaussianNB()))
models.append(('DTC',DecisionTreeClassifier()))
models.append(('RFC',RandomForestClassifier()))
models.append(('SVC',SVC()))
models.append(('KNN',KNeighborsClassifier()))
models.append(('GBC',GradientBoostingClassifier()))
models.append(('ABC',AdaBoostClassifier()))
models.append(('XGB',XGBClassifier(objective='multi:softmax', num_class=num_class)))
models.append(('XGBRF',XGBRFClassifier()))

Using Kfold and Cross Validation

In [22]:
names = []
scores = []
for name, model in models:
    score = cross_val_score(model, x_train_best, y_train, cv=kfold, scoring='accuracy').mean()
    names.append(name)
    scores.append(score)
results  = pd.DataFrame({'Model': names,'Accuracy': scores})
results
Out[22]:
Model Accuracy
0 LR 0.596059
1 NB 0.485222
2 DTC 0.648810
3 RFC 0.766215
4 SVC 0.322455
5 KNN 0.620279
6 GBC 0.696018
7 ABC 0.418309
8 XGB 0.747947
9 XGBRF 0.660099
In [23]:
axis = sns.barplot(x = 'Model', y = 'Accuracy', data = results)
axis.set(xlabel='Classifier', ylabel='Accuracy')
for p in axis.patches:
    height = p.get_height()
    axis.text(p.get_x() + p.get_width()/2, height + 0.005, '{:1.4f}'.format(height), ha="center") 
    
plt.show()

Usng XGBoost
Note :
XGBoost is a very powerful model and will perform better than other models 80% of the time.

Model tuning

Learning rate and estimators
This function allows us to find the best number of estimators for our starting learning rate. We will run it again after we tune the learning for best estimators for that learning rate.

In [24]:
import xgboost as XGB
from sklearn.metrics import accuracy_score, roc_auc_score
def modelfit(alg, useTrainCV=True, cv_folds=num_class, early_stopping_rounds=50):
    
    if useTrainCV:
        xgb_param = alg.get_xgb_params()
        #xgtrain = xgb.DMatrix(dtrain[predictors].values, label=dtrain[target].values)
        xgtrain = XGB.DMatrix(x_train_best.values, label=y_train.values)
        cvresult = XGB.cv(xgb_param, xgtrain, num_boost_round=alg.get_params()['n_estimators'], nfold=cv_folds,
                          metrics='mlogloss', early_stopping_rounds=early_stopping_rounds, verbose_eval=-1)
        n_estimators = cvresult.shape[0]
        alg.set_params(n_estimators=n_estimators)
    
    #Fit the algorithm on the data
    alg.fit(x_train_best, y_train)
    
    #Predict training set:
    dtrain_predictions = alg.predict(x_train_best)
        
    #Print model report:
    print("\nModel Report")
    print("Accuracy (Train): %.4g" % accuracy_score(y_train.values, dtrain_predictions))
    print('n_estimators :',n_estimators )
    
    feat_imp = pd.Series(alg._Booster.get_fscore()).sort_values(ascending=False)
    feat_imp.plot(kind='bar', title='Feature Importances')
    plt.ylabel('Feature Importance Score')
    return n_estimators
In [25]:
# initial parameters value (starting point which will change later on)
xgb = XGBClassifier(learning_rate=0.1, max_depth=5, n_estimators=500, subsample=0.8,
                    colsample_bytree=0.8, objective=objective, num_class=num_class, seed=7)
n_estimators = modelfit(xgb)
[0]	train-mlogloss:1.63168+0.00474254	test-mlogloss:1.6778+0.0167785
[1]	train-mlogloss:1.50293+0.00857344	test-mlogloss:1.58611+0.0279914
[2]	train-mlogloss:1.38793+0.008702	test-mlogloss:1.50528+0.0367546
[3]	train-mlogloss:1.28646+0.011043	test-mlogloss:1.43589+0.0425792
[4]	train-mlogloss:1.19606+0.0130615	test-mlogloss:1.37481+0.0556228
[5]	train-mlogloss:1.1149+0.0115582	test-mlogloss:1.32498+0.0573271
[6]	train-mlogloss:1.04186+0.0128109	test-mlogloss:1.27263+0.0542978
[7]	train-mlogloss:0.977696+0.0121558	test-mlogloss:1.22856+0.0562873
[8]	train-mlogloss:0.919807+0.0135393	test-mlogloss:1.18706+0.0605188
[9]	train-mlogloss:0.866331+0.0138976	test-mlogloss:1.15313+0.0624951
[10]	train-mlogloss:0.815863+0.0152306	test-mlogloss:1.12439+0.058307
[11]	train-mlogloss:0.769364+0.013158	test-mlogloss:1.09722+0.0640436
[12]	train-mlogloss:0.724936+0.0122034	test-mlogloss:1.07288+0.0674558
[13]	train-mlogloss:0.685773+0.0122073	test-mlogloss:1.04999+0.072063
[14]	train-mlogloss:0.649979+0.0116079	test-mlogloss:1.02585+0.0716913
[15]	train-mlogloss:0.617402+0.0111047	test-mlogloss:1.00915+0.0734997
[16]	train-mlogloss:0.587236+0.0111524	test-mlogloss:0.989093+0.0742954
[17]	train-mlogloss:0.558109+0.0097591	test-mlogloss:0.969952+0.0755915
[18]	train-mlogloss:0.53105+0.0107557	test-mlogloss:0.953851+0.0752722
[19]	train-mlogloss:0.506542+0.0108301	test-mlogloss:0.941594+0.0778205
[20]	train-mlogloss:0.482729+0.0110397	test-mlogloss:0.927048+0.0783226
[21]	train-mlogloss:0.461514+0.0104264	test-mlogloss:0.917191+0.0810749
[22]	train-mlogloss:0.439901+0.0108933	test-mlogloss:0.905226+0.0828008
[23]	train-mlogloss:0.421083+0.011158	test-mlogloss:0.89341+0.0845363
[24]	train-mlogloss:0.40216+0.01078	test-mlogloss:0.883943+0.0867599
[25]	train-mlogloss:0.385598+0.0111398	test-mlogloss:0.879202+0.089214
[26]	train-mlogloss:0.369427+0.010204	test-mlogloss:0.870954+0.0930122
[27]	train-mlogloss:0.354378+0.00938382	test-mlogloss:0.865483+0.0938362
[28]	train-mlogloss:0.34066+0.00943032	test-mlogloss:0.862018+0.0939048
[29]	train-mlogloss:0.326949+0.00928878	test-mlogloss:0.855643+0.092755
[30]	train-mlogloss:0.313301+0.00910632	test-mlogloss:0.850324+0.0926937
[31]	train-mlogloss:0.301906+0.00875001	test-mlogloss:0.846646+0.0922236
[32]	train-mlogloss:0.290438+0.00925858	test-mlogloss:0.841407+0.0929369
[33]	train-mlogloss:0.280394+0.00947972	test-mlogloss:0.837979+0.0933908
[34]	train-mlogloss:0.270227+0.0090132	test-mlogloss:0.832618+0.0924288
[35]	train-mlogloss:0.260958+0.00849533	test-mlogloss:0.828886+0.0919236
[36]	train-mlogloss:0.251741+0.00835947	test-mlogloss:0.825015+0.0914437
[37]	train-mlogloss:0.243707+0.00856121	test-mlogloss:0.823961+0.0921951
[38]	train-mlogloss:0.235424+0.00855701	test-mlogloss:0.821132+0.0928005
[39]	train-mlogloss:0.227885+0.008378	test-mlogloss:0.818611+0.0936503
[40]	train-mlogloss:0.220662+0.00808736	test-mlogloss:0.816498+0.0925941
[41]	train-mlogloss:0.213886+0.00810878	test-mlogloss:0.814373+0.0941603
[42]	train-mlogloss:0.207147+0.00805422	test-mlogloss:0.813808+0.0946144
[43]	train-mlogloss:0.200691+0.00782889	test-mlogloss:0.812869+0.0948593
[44]	train-mlogloss:0.194594+0.00732756	test-mlogloss:0.810167+0.0943547
[45]	train-mlogloss:0.189104+0.00733039	test-mlogloss:0.809926+0.0959891
[46]	train-mlogloss:0.183685+0.00740228	test-mlogloss:0.808666+0.0967663
[47]	train-mlogloss:0.178727+0.00677726	test-mlogloss:0.808539+0.0981952
[48]	train-mlogloss:0.173797+0.00632213	test-mlogloss:0.806283+0.098792
[49]	train-mlogloss:0.169292+0.00631988	test-mlogloss:0.806179+0.0993302
[50]	train-mlogloss:0.164606+0.00642448	test-mlogloss:0.805835+0.100703
[51]	train-mlogloss:0.160448+0.00643337	test-mlogloss:0.804387+0.102297
[52]	train-mlogloss:0.156166+0.0061582	test-mlogloss:0.805266+0.102506
[53]	train-mlogloss:0.152139+0.00588225	test-mlogloss:0.803393+0.102718
[54]	train-mlogloss:0.148831+0.00561396	test-mlogloss:0.804498+0.103731
[55]	train-mlogloss:0.145389+0.00560776	test-mlogloss:0.805824+0.102836
[56]	train-mlogloss:0.141964+0.00536992	test-mlogloss:0.804385+0.103416
[57]	train-mlogloss:0.138723+0.00541369	test-mlogloss:0.804027+0.1062
[58]	train-mlogloss:0.13568+0.00537005	test-mlogloss:0.802755+0.107914
[59]	train-mlogloss:0.132867+0.00508792	test-mlogloss:0.803779+0.108247
[60]	train-mlogloss:0.13006+0.00482162	test-mlogloss:0.804448+0.110579
[61]	train-mlogloss:0.127161+0.00448253	test-mlogloss:0.805526+0.111996
[62]	train-mlogloss:0.124758+0.00437854	test-mlogloss:0.80625+0.112338
[63]	train-mlogloss:0.122266+0.00417165	test-mlogloss:0.807176+0.114749
[64]	train-mlogloss:0.119902+0.00415319	test-mlogloss:0.807443+0.115587
[65]	train-mlogloss:0.117627+0.00421374	test-mlogloss:0.808088+0.116971
[66]	train-mlogloss:0.115565+0.00410047	test-mlogloss:0.808365+0.118181
[67]	train-mlogloss:0.113507+0.00391097	test-mlogloss:0.810244+0.119331
[68]	train-mlogloss:0.111417+0.00382575	test-mlogloss:0.808665+0.118886
[69]	train-mlogloss:0.109495+0.00372091	test-mlogloss:0.809536+0.118719
[70]	train-mlogloss:0.107666+0.00363091	test-mlogloss:0.810302+0.119796
[71]	train-mlogloss:0.106056+0.00351905	test-mlogloss:0.809423+0.118909
[72]	train-mlogloss:0.104316+0.00341356	test-mlogloss:0.808814+0.116984
[73]	train-mlogloss:0.102633+0.00334548	test-mlogloss:0.808801+0.117233
[74]	train-mlogloss:0.100993+0.00326803	test-mlogloss:0.810072+0.116646
[75]	train-mlogloss:0.0994227+0.00323567	test-mlogloss:0.811628+0.117174
[76]	train-mlogloss:0.0979823+0.00309594	test-mlogloss:0.811568+0.116661
[77]	train-mlogloss:0.0965745+0.00307779	test-mlogloss:0.81113+0.115597
[78]	train-mlogloss:0.0951368+0.00300283	test-mlogloss:0.81238+0.116266
[79]	train-mlogloss:0.0938635+0.00290202	test-mlogloss:0.813426+0.117047
[80]	train-mlogloss:0.092556+0.00294916	test-mlogloss:0.814028+0.117206
[81]	train-mlogloss:0.091364+0.00297435	test-mlogloss:0.814316+0.117963
[82]	train-mlogloss:0.0902338+0.00287944	test-mlogloss:0.814737+0.119372
[83]	train-mlogloss:0.0890687+0.00288233	test-mlogloss:0.814962+0.119904
[84]	train-mlogloss:0.0880548+0.00286199	test-mlogloss:0.816726+0.120016
[85]	train-mlogloss:0.0870237+0.00291005	test-mlogloss:0.819052+0.120566
[86]	train-mlogloss:0.0859907+0.00283729	test-mlogloss:0.819463+0.121239
[87]	train-mlogloss:0.0849967+0.00280069	test-mlogloss:0.819424+0.122047
[88]	train-mlogloss:0.0840575+0.00279593	test-mlogloss:0.820461+0.121677
[89]	train-mlogloss:0.0830388+0.00272883	test-mlogloss:0.821069+0.122228
[90]	train-mlogloss:0.0821945+0.00264592	test-mlogloss:0.8219+0.121936
[91]	train-mlogloss:0.0813015+0.0025941	test-mlogloss:0.822468+0.122923
[92]	train-mlogloss:0.0804017+0.00252739	test-mlogloss:0.823542+0.122833
[93]	train-mlogloss:0.0795398+0.0024839	test-mlogloss:0.824301+0.123613
[94]	train-mlogloss:0.078762+0.00240302	test-mlogloss:0.825021+0.123302
[95]	train-mlogloss:0.0778965+0.00240555	test-mlogloss:0.824955+0.124485
[96]	train-mlogloss:0.0770832+0.00236041	test-mlogloss:0.825367+0.12404
[97]	train-mlogloss:0.0763612+0.0023365	test-mlogloss:0.827141+0.124237
[98]	train-mlogloss:0.0756075+0.00232264	test-mlogloss:0.826532+0.124795
[99]	train-mlogloss:0.074899+0.00234086	test-mlogloss:0.82665+0.125859
[100]	train-mlogloss:0.0741228+0.00239761	test-mlogloss:0.827535+0.125509
[101]	train-mlogloss:0.0735008+0.00244454	test-mlogloss:0.828972+0.127237
[102]	train-mlogloss:0.0728167+0.00241282	test-mlogloss:0.83046+0.127694
[103]	train-mlogloss:0.0722113+0.00240714	test-mlogloss:0.83117+0.128832
[104]	train-mlogloss:0.0716417+0.00237698	test-mlogloss:0.831717+0.128678
[105]	train-mlogloss:0.070931+0.00222909	test-mlogloss:0.832336+0.127999
[106]	train-mlogloss:0.0702787+0.00223615	test-mlogloss:0.83241+0.128394
[107]	train-mlogloss:0.0696645+0.00217989	test-mlogloss:0.833459+0.129213

Model Report
Accuracy (Train): 1
n_estimators : 59
In [26]:
from sklearn.model_selection import GridSearchCV

max_depth and min_child_weight

In [27]:
param = {'max_depth':range(0,15,1),
         'min_child_weight':range(0,15,2)}
In [28]:
gsrch = GridSearchCV(estimator=XGBClassifier(learning_rate=0.1, n_estimators=n_estimators, subsample=0.8, colsample_bytree=0.8,
                                             objective=objective, num_class=num_class, seed=7), 
                     param_grid=param, scoring='accuracy', n_jobs=-1,cv=kfold)
gsrch.fit(x_train_best, y_train)
max_depth = gsrch.best_params_['max_depth']
min_child_weight = gsrch.best_params_['min_child_weight']
gsrch.best_params_, gsrch.best_score_
Out[28]:
({'max_depth': 5, 'min_child_weight': 2}, 0.736863711001642)

gamma

In [29]:
param = {'gamma':[i/10.0 for i in range(0,101)]}
In [30]:
gsrch = GridSearchCV(estimator=XGBClassifier(learning_rate=0.1, n_estimators=n_estimators, subsample=0.8, colsample_bytree=0.8, 
                                             objective=objective, num_class=num_class, seed=7, max_depth=max_depth, 
                                             min_child_weight=min_child_weight), param_grid=param, scoring='accuracy', 
                     n_jobs=-1, cv=kfold)
gsrch.fit(x_train_best, y_train)
gamma = gsrch.best_params_['gamma']
gsrch.best_params_, gsrch.best_score_
Out[30]:
({'gamma': 0.1}, 0.7485632183908045)

subsample and colsample_bytree

In [31]:
param = {'subsample':[i/10.0 for i in range(6,11)],
        'colsample_bytree':[i/10.0 for i in range(6,11)]}
In [32]:
gsrch = GridSearchCV(estimator=XGBClassifier(learning_rate=0.1, n_estimators=n_estimators, objective=objective, 
                                             num_class=num_class, seed=7, max_depth=max_depth, 
                                             min_child_weight=min_child_weight, gamma=gamma), 
                     param_grid=param, scoring='accuracy', n_jobs=-1, cv=kfold)
gsrch.fit(x_train_best, y_train)
colsample_bytree = gsrch.best_params_['colsample_bytree']
subsample = gsrch.best_params_['subsample']
gsrch.best_params_, gsrch.best_score_
Out[32]:
({'colsample_bytree': 1.0, 'subsample': 1.0}, 0.7536945812807883)

reg_alpha

In [33]:
param = {'reg_alpha':[1e-5, 1e-2, 0.1, 1, 100, 0, 0.001, 0.005, 0.01, 0.05]}
In [34]:
gsrch = GridSearchCV(estimator=XGBClassifier(learning_rate=0.1, n_estimators=n_estimators, objective=objective, 
                                             num_class=num_class, seed=7, max_depth=max_depth, 
                                             min_child_weight=min_child_weight, gamma=gamma, 
                                             colsample_bytree=colsample_bytree, subsample=subsample), param_grid=param, 
                     scoring='accuracy', n_jobs=-1, cv=kfold)
gsrch.fit(x_train_best, y_train)
reg_alpha = gsrch.best_params_['reg_alpha']
gsrch.best_params_, gsrch.best_score_
Out[34]:
({'reg_alpha': 1e-05}, 0.7536945812807883)

estimators for learning rate 0.01

In [35]:
xgb = XGBClassifier(learning_rate=0.01, n_estimators=100000, objective=objective, num_class=num_class,seed=7, 
                    max_depth=max_depth, min_child_weight=min_child_weight, gamma=gamma, colsample_bytree=colsample_bytree, 
                    subsample=subsample)
n_estimators = modelfit(xgb)
[0]	train-mlogloss:1.77549+0.000330388	test-mlogloss:1.77971+0.0014522
[1]	train-mlogloss:1.75955+0.000650631	test-mlogloss:1.76791+0.00279767
[2]	train-mlogloss:1.74392+0.000964537	test-mlogloss:1.75612+0.00395197
[3]	train-mlogloss:1.72848+0.00121826	test-mlogloss:1.74463+0.00528464
[4]	train-mlogloss:1.7133+0.00144858	test-mlogloss:1.73346+0.00654851
[5]	train-mlogloss:1.69838+0.00165162	test-mlogloss:1.7224+0.00771661
[6]	train-mlogloss:1.6837+0.00182988	test-mlogloss:1.71166+0.00914919
[7]	train-mlogloss:1.66924+0.00202451	test-mlogloss:1.70114+0.010187
[8]	train-mlogloss:1.65503+0.00223748	test-mlogloss:1.69086+0.0113683
[9]	train-mlogloss:1.64106+0.00243852	test-mlogloss:1.68088+0.0124913
[10]	train-mlogloss:1.62731+0.00261661	test-mlogloss:1.67079+0.0133727
[11]	train-mlogloss:1.61382+0.00281726	test-mlogloss:1.66116+0.0145591
[12]	train-mlogloss:1.60054+0.00299714	test-mlogloss:1.65143+0.0153599
[13]	train-mlogloss:1.58735+0.00314258	test-mlogloss:1.64204+0.0163397
[14]	train-mlogloss:1.57442+0.00332181	test-mlogloss:1.63277+0.017156
[15]	train-mlogloss:1.56172+0.00351335	test-mlogloss:1.62337+0.0177365
[16]	train-mlogloss:1.5492+0.0036916	test-mlogloss:1.61447+0.0184543
[17]	train-mlogloss:1.53688+0.00381876	test-mlogloss:1.60544+0.0191327
[18]	train-mlogloss:1.52471+0.00396424	test-mlogloss:1.59696+0.0199494
[19]	train-mlogloss:1.51281+0.00416905	test-mlogloss:1.58825+0.0205819
[20]	train-mlogloss:1.501+0.00429774	test-mlogloss:1.5797+0.0211697
[21]	train-mlogloss:1.4894+0.00444276	test-mlogloss:1.57166+0.0219017
[22]	train-mlogloss:1.47791+0.00456279	test-mlogloss:1.56335+0.0225894
[23]	train-mlogloss:1.46657+0.00465859	test-mlogloss:1.55543+0.0231519
[24]	train-mlogloss:1.45542+0.00473769	test-mlogloss:1.5475+0.0237848
[25]	train-mlogloss:1.4444+0.0048722	test-mlogloss:1.53995+0.0245543
[26]	train-mlogloss:1.43359+0.00505311	test-mlogloss:1.53208+0.0249659
[27]	train-mlogloss:1.42285+0.00512426	test-mlogloss:1.5247+0.0254336
[28]	train-mlogloss:1.41218+0.00518898	test-mlogloss:1.51729+0.026102
[29]	train-mlogloss:1.40166+0.00524579	test-mlogloss:1.50996+0.0264887
[30]	train-mlogloss:1.39129+0.00532373	test-mlogloss:1.50265+0.0269854
[31]	train-mlogloss:1.38111+0.00534579	test-mlogloss:1.49536+0.0276396
[32]	train-mlogloss:1.37096+0.00530118	test-mlogloss:1.48802+0.0280624
[33]	train-mlogloss:1.36107+0.00532954	test-mlogloss:1.4814+0.0285104
[34]	train-mlogloss:1.35128+0.00536172	test-mlogloss:1.47442+0.0289853
[35]	train-mlogloss:1.34162+0.00542778	test-mlogloss:1.46757+0.0292859
[36]	train-mlogloss:1.33207+0.00551286	test-mlogloss:1.46115+0.0297565
[37]	train-mlogloss:1.32266+0.00560387	test-mlogloss:1.45432+0.030143
[38]	train-mlogloss:1.31325+0.00554763	test-mlogloss:1.44789+0.0305394
[39]	train-mlogloss:1.30398+0.00555255	test-mlogloss:1.44111+0.0309639
[40]	train-mlogloss:1.29489+0.00561393	test-mlogloss:1.43491+0.0313377
[41]	train-mlogloss:1.2859+0.00570312	test-mlogloss:1.42855+0.0318113
[42]	train-mlogloss:1.27699+0.00571718	test-mlogloss:1.42234+0.0322397
[43]	train-mlogloss:1.26821+0.00577448	test-mlogloss:1.4163+0.0327466
[44]	train-mlogloss:1.25949+0.00596303	test-mlogloss:1.41042+0.0332404
[45]	train-mlogloss:1.25092+0.00604698	test-mlogloss:1.4046+0.0336683
[46]	train-mlogloss:1.24238+0.00608606	test-mlogloss:1.39857+0.0341654
[47]	train-mlogloss:1.2339+0.00608672	test-mlogloss:1.39317+0.0347626
[48]	train-mlogloss:1.22543+0.00623746	test-mlogloss:1.38748+0.0354476
[49]	train-mlogloss:1.21721+0.00631968	test-mlogloss:1.38198+0.0359615
[50]	train-mlogloss:1.20895+0.00644247	test-mlogloss:1.37665+0.0366794
[51]	train-mlogloss:1.20093+0.00648694	test-mlogloss:1.37123+0.0372134
[52]	train-mlogloss:1.19289+0.00658319	test-mlogloss:1.36581+0.0378193
[53]	train-mlogloss:1.185+0.00668874	test-mlogloss:1.36051+0.038354
[54]	train-mlogloss:1.17716+0.0067887	test-mlogloss:1.35532+0.0388969
[55]	train-mlogloss:1.1693+0.00695309	test-mlogloss:1.35025+0.0395153
[56]	train-mlogloss:1.16164+0.00707233	test-mlogloss:1.34534+0.0400261
[57]	train-mlogloss:1.15399+0.0072528	test-mlogloss:1.34016+0.040553
[58]	train-mlogloss:1.14648+0.00730232	test-mlogloss:1.33512+0.0412043
[59]	train-mlogloss:1.13901+0.00744926	test-mlogloss:1.33045+0.0415738
[60]	train-mlogloss:1.13165+0.00755931	test-mlogloss:1.32567+0.0419682
[61]	train-mlogloss:1.12426+0.00778069	test-mlogloss:1.32084+0.0422065
[62]	train-mlogloss:1.11685+0.00798462	test-mlogloss:1.31606+0.0426905
[63]	train-mlogloss:1.10957+0.00824614	test-mlogloss:1.31121+0.0427833
[64]	train-mlogloss:1.10242+0.00847898	test-mlogloss:1.30647+0.0429759
[65]	train-mlogloss:1.09529+0.00873174	test-mlogloss:1.30201+0.0433362
[66]	train-mlogloss:1.08829+0.00893248	test-mlogloss:1.29757+0.0437674
[67]	train-mlogloss:1.08134+0.00910524	test-mlogloss:1.29294+0.0438717
[68]	train-mlogloss:1.07435+0.00945433	test-mlogloss:1.28822+0.0441476
[69]	train-mlogloss:1.06743+0.00975649	test-mlogloss:1.28363+0.0444717
[70]	train-mlogloss:1.06078+0.0100167	test-mlogloss:1.27918+0.0446562
[71]	train-mlogloss:1.05412+0.0103658	test-mlogloss:1.27477+0.044891
[72]	train-mlogloss:1.04743+0.0107032	test-mlogloss:1.27012+0.0451044
[73]	train-mlogloss:1.04095+0.0109589	test-mlogloss:1.26608+0.0453087
[74]	train-mlogloss:1.03443+0.0112599	test-mlogloss:1.26156+0.0453796
[75]	train-mlogloss:1.02793+0.0115747	test-mlogloss:1.25719+0.0455871
[76]	train-mlogloss:1.02148+0.0119049	test-mlogloss:1.25314+0.0458188
[77]	train-mlogloss:1.01508+0.0121171	test-mlogloss:1.24873+0.0460824
[78]	train-mlogloss:1.00891+0.0123104	test-mlogloss:1.24473+0.0462713
[79]	train-mlogloss:1.00263+0.0124792	test-mlogloss:1.24059+0.0463368
[80]	train-mlogloss:0.996477+0.0127935	test-mlogloss:1.23632+0.046333
[81]	train-mlogloss:0.990377+0.0129776	test-mlogloss:1.23224+0.0465734
[82]	train-mlogloss:0.984312+0.013112	test-mlogloss:1.22828+0.0467174
[83]	train-mlogloss:0.978262+0.0133189	test-mlogloss:1.22418+0.0468042
[84]	train-mlogloss:0.972411+0.0135121	test-mlogloss:1.22016+0.0470515
[85]	train-mlogloss:0.966603+0.0137279	test-mlogloss:1.21651+0.0472121
[86]	train-mlogloss:0.960747+0.0138868	test-mlogloss:1.21263+0.0474318
[87]	train-mlogloss:0.955056+0.0140496	test-mlogloss:1.20881+0.0477448
[88]	train-mlogloss:0.949405+0.0141467	test-mlogloss:1.20512+0.0479968
[89]	train-mlogloss:0.943727+0.014311	test-mlogloss:1.20114+0.0479155
[90]	train-mlogloss:0.938208+0.014482	test-mlogloss:1.19736+0.0480653
[91]	train-mlogloss:0.932634+0.0147273	test-mlogloss:1.19372+0.0482001
[92]	train-mlogloss:0.927198+0.0148773	test-mlogloss:1.19002+0.0483008
[93]	train-mlogloss:0.921692+0.0149679	test-mlogloss:1.18632+0.0483005
[94]	train-mlogloss:0.91619+0.0150787	test-mlogloss:1.18281+0.0486515
[95]	train-mlogloss:0.910799+0.0152126	test-mlogloss:1.17903+0.0485191
[96]	train-mlogloss:0.905425+0.0152498	test-mlogloss:1.17534+0.0487905
[97]	train-mlogloss:0.900066+0.0151883	test-mlogloss:1.1719+0.0489112
[98]	train-mlogloss:0.894703+0.0152392	test-mlogloss:1.1683+0.0492743
[99]	train-mlogloss:0.889449+0.0152958	test-mlogloss:1.16499+0.0494988
[100]	train-mlogloss:0.884196+0.0152485	test-mlogloss:1.16144+0.049677
[101]	train-mlogloss:0.879053+0.0153958	test-mlogloss:1.15809+0.049939
[102]	train-mlogloss:0.87393+0.0154458	test-mlogloss:1.15454+0.0501506
[103]	train-mlogloss:0.868946+0.0153483	test-mlogloss:1.15128+0.0506179
[104]	train-mlogloss:0.863901+0.0154143	test-mlogloss:1.14794+0.0506781
[105]	train-mlogloss:0.858927+0.0154604	test-mlogloss:1.14458+0.050774
[106]	train-mlogloss:0.854011+0.0154892	test-mlogloss:1.14145+0.051142
[107]	train-mlogloss:0.849094+0.0155592	test-mlogloss:1.13815+0.0512634
[108]	train-mlogloss:0.844274+0.0155276	test-mlogloss:1.13489+0.0513236
[109]	train-mlogloss:0.83948+0.0155991	test-mlogloss:1.13145+0.0515388
[110]	train-mlogloss:0.834741+0.0155705	test-mlogloss:1.12841+0.0516888
[111]	train-mlogloss:0.830054+0.0156491	test-mlogloss:1.12525+0.0517213
[112]	train-mlogloss:0.825374+0.01566	test-mlogloss:1.12198+0.0519754
[113]	train-mlogloss:0.820667+0.0156549	test-mlogloss:1.11883+0.0521564
[114]	train-mlogloss:0.816106+0.0157792	test-mlogloss:1.11577+0.0523469
[115]	train-mlogloss:0.811622+0.0157023	test-mlogloss:1.11281+0.0525987
[116]	train-mlogloss:0.807156+0.0158195	test-mlogloss:1.1101+0.0525378
[117]	train-mlogloss:0.802668+0.0158795	test-mlogloss:1.1069+0.0528126
[118]	train-mlogloss:0.798304+0.015918	test-mlogloss:1.10417+0.0527897
[119]	train-mlogloss:0.793942+0.0159785	test-mlogloss:1.10149+0.0529164
[120]	train-mlogloss:0.789579+0.0160793	test-mlogloss:1.09856+0.0531571
[121]	train-mlogloss:0.785302+0.0161146	test-mlogloss:1.09583+0.0531712
[122]	train-mlogloss:0.781076+0.016163	test-mlogloss:1.0932+0.0534169
[123]	train-mlogloss:0.776825+0.0162367	test-mlogloss:1.0903+0.0535744
[124]	train-mlogloss:0.772667+0.0162281	test-mlogloss:1.08751+0.0537765
[125]	train-mlogloss:0.768484+0.0162636	test-mlogloss:1.08465+0.0539257
[126]	train-mlogloss:0.764389+0.0163272	test-mlogloss:1.08207+0.0538877
[127]	train-mlogloss:0.760268+0.0164542	test-mlogloss:1.07938+0.0542225
[128]	train-mlogloss:0.756219+0.016495	test-mlogloss:1.0767+0.0543366
[129]	train-mlogloss:0.75219+0.0165606	test-mlogloss:1.07407+0.0545539
[130]	train-mlogloss:0.7482+0.0164935	test-mlogloss:1.07136+0.054938
[131]	train-mlogloss:0.744126+0.0165751	test-mlogloss:1.06867+0.0550982
[132]	train-mlogloss:0.740211+0.016529	test-mlogloss:1.06616+0.0553141
[133]	train-mlogloss:0.736288+0.0165212	test-mlogloss:1.0635+0.0554511
[134]	train-mlogloss:0.732409+0.0164843	test-mlogloss:1.06085+0.0558345
[135]	train-mlogloss:0.728519+0.0163908	test-mlogloss:1.05829+0.0558928
[136]	train-mlogloss:0.724663+0.0163124	test-mlogloss:1.05591+0.0560741
[137]	train-mlogloss:0.72077+0.0162721	test-mlogloss:1.05354+0.0566125
[138]	train-mlogloss:0.716977+0.0161285	test-mlogloss:1.05121+0.0567197
[139]	train-mlogloss:0.713144+0.0161146	test-mlogloss:1.04896+0.0571676
[140]	train-mlogloss:0.709374+0.0160776	test-mlogloss:1.04663+0.0574925
[141]	train-mlogloss:0.705657+0.0160471	test-mlogloss:1.04446+0.0578366
[142]	train-mlogloss:0.702015+0.0159553	test-mlogloss:1.04245+0.0582492
[143]	train-mlogloss:0.698353+0.0159713	test-mlogloss:1.0403+0.0586258
[144]	train-mlogloss:0.694729+0.015915	test-mlogloss:1.03827+0.0590314
[145]	train-mlogloss:0.691042+0.015839	test-mlogloss:1.03619+0.0595021
[146]	train-mlogloss:0.687421+0.0157283	test-mlogloss:1.03396+0.0600292
[147]	train-mlogloss:0.683812+0.0156781	test-mlogloss:1.03188+0.0604587
[148]	train-mlogloss:0.680267+0.0156611	test-mlogloss:1.0299+0.0610587
[149]	train-mlogloss:0.676673+0.0155777	test-mlogloss:1.0279+0.0613964
[150]	train-mlogloss:0.673138+0.015451	test-mlogloss:1.02569+0.0616818
[151]	train-mlogloss:0.669581+0.0153434	test-mlogloss:1.02365+0.0621714
[152]	train-mlogloss:0.666093+0.0152648	test-mlogloss:1.02175+0.0624746
[153]	train-mlogloss:0.66262+0.0151487	test-mlogloss:1.01967+0.0629278
[154]	train-mlogloss:0.659172+0.0150895	test-mlogloss:1.01765+0.0630142
[155]	train-mlogloss:0.655823+0.0149683	test-mlogloss:1.01577+0.063314
[156]	train-mlogloss:0.652489+0.0149097	test-mlogloss:1.0136+0.0636563
[157]	train-mlogloss:0.649204+0.0148508	test-mlogloss:1.0117+0.0639255
[158]	train-mlogloss:0.64589+0.0147332	test-mlogloss:1.00981+0.0643884
[159]	train-mlogloss:0.642605+0.014636	test-mlogloss:1.00769+0.0644289
[160]	train-mlogloss:0.639344+0.0145951	test-mlogloss:1.0059+0.0648842
[161]	train-mlogloss:0.63614+0.0144918	test-mlogloss:1.00378+0.0652217
[162]	train-mlogloss:0.632954+0.0144814	test-mlogloss:1.0019+0.0656334
[163]	train-mlogloss:0.62979+0.0143941	test-mlogloss:1+0.0662472
[164]	train-mlogloss:0.626711+0.0143837	test-mlogloss:0.998056+0.0664501
[165]	train-mlogloss:0.623622+0.0143727	test-mlogloss:0.996136+0.0667865
[166]	train-mlogloss:0.62058+0.0143638	test-mlogloss:0.994292+0.0672096
[167]	train-mlogloss:0.617544+0.0143492	test-mlogloss:0.992289+0.067584
[168]	train-mlogloss:0.614455+0.014318	test-mlogloss:0.990566+0.0680319
[169]	train-mlogloss:0.611463+0.0143505	test-mlogloss:0.988689+0.0680579
[170]	train-mlogloss:0.608447+0.0142617	test-mlogloss:0.987008+0.0684073
[171]	train-mlogloss:0.60547+0.0142948	test-mlogloss:0.985304+0.0687084
[172]	train-mlogloss:0.602557+0.0142771	test-mlogloss:0.98354+0.0691697
[173]	train-mlogloss:0.599569+0.0142534	test-mlogloss:0.981888+0.0695271
[174]	train-mlogloss:0.596632+0.0141815	test-mlogloss:0.980196+0.069609
[175]	train-mlogloss:0.593756+0.0141143	test-mlogloss:0.978736+0.0698177
[176]	train-mlogloss:0.59089+0.0140329	test-mlogloss:0.97704+0.0702468
[177]	train-mlogloss:0.588044+0.0140215	test-mlogloss:0.975366+0.0704566
[178]	train-mlogloss:0.58519+0.0139879	test-mlogloss:0.973808+0.0706715
[179]	train-mlogloss:0.58238+0.0139415	test-mlogloss:0.97208+0.0706214
[180]	train-mlogloss:0.579593+0.0138998	test-mlogloss:0.970552+0.071067
[181]	train-mlogloss:0.57681+0.0138304	test-mlogloss:0.969014+0.0713175
[182]	train-mlogloss:0.574042+0.0138203	test-mlogloss:0.967368+0.0715873
[183]	train-mlogloss:0.571326+0.0137359	test-mlogloss:0.965992+0.0719651
[184]	train-mlogloss:0.568539+0.0136722	test-mlogloss:0.964269+0.0720293
[185]	train-mlogloss:0.565852+0.0136113	test-mlogloss:0.962823+0.0724068
[186]	train-mlogloss:0.563186+0.0135339	test-mlogloss:0.961327+0.0725491
[187]	train-mlogloss:0.560499+0.0134413	test-mlogloss:0.959811+0.0728482
[188]	train-mlogloss:0.557814+0.0133729	test-mlogloss:0.958459+0.0732122
[189]	train-mlogloss:0.555171+0.0132738	test-mlogloss:0.95701+0.0734963
[190]	train-mlogloss:0.552536+0.0131995	test-mlogloss:0.955575+0.0736867
[191]	train-mlogloss:0.549957+0.0131212	test-mlogloss:0.954111+0.073933
[192]	train-mlogloss:0.54735+0.0130405	test-mlogloss:0.95274+0.0743388
[193]	train-mlogloss:0.544794+0.0129686	test-mlogloss:0.95132+0.07463
[194]	train-mlogloss:0.542213+0.0128882	test-mlogloss:0.94985+0.0749149
[195]	train-mlogloss:0.539689+0.0128531	test-mlogloss:0.948605+0.0752734
[196]	train-mlogloss:0.537192+0.0127785	test-mlogloss:0.947175+0.0753786
[197]	train-mlogloss:0.534708+0.0127119	test-mlogloss:0.946081+0.0756482
[198]	train-mlogloss:0.532247+0.0126719	test-mlogloss:0.944725+0.0759991
[199]	train-mlogloss:0.529763+0.0125882	test-mlogloss:0.943384+0.0763465
[200]	train-mlogloss:0.527341+0.0125799	test-mlogloss:0.942143+0.0765635
[201]	train-mlogloss:0.524889+0.0125353	test-mlogloss:0.940623+0.0766519
[202]	train-mlogloss:0.52255+0.0124913	test-mlogloss:0.93946+0.0769679
[203]	train-mlogloss:0.520205+0.0124789	test-mlogloss:0.93822+0.0770537
[204]	train-mlogloss:0.517884+0.0124868	test-mlogloss:0.93705+0.0772289
[205]	train-mlogloss:0.515534+0.0124063	test-mlogloss:0.936028+0.0775347
[206]	train-mlogloss:0.513162+0.0123144	test-mlogloss:0.934769+0.0777008
[207]	train-mlogloss:0.510848+0.0122396	test-mlogloss:0.933511+0.0778395
[208]	train-mlogloss:0.508533+0.0121356	test-mlogloss:0.932494+0.0779128
[209]	train-mlogloss:0.506267+0.0120742	test-mlogloss:0.931361+0.0779921
[210]	train-mlogloss:0.504003+0.0120232	test-mlogloss:0.930075+0.0782249
[211]	train-mlogloss:0.501749+0.0120071	test-mlogloss:0.929036+0.0784201
[212]	train-mlogloss:0.49958+0.0119971	test-mlogloss:0.927807+0.0784837
[213]	train-mlogloss:0.497364+0.0119452	test-mlogloss:0.926505+0.0788591
[214]	train-mlogloss:0.49517+0.0118882	test-mlogloss:0.925521+0.0790424
[215]	train-mlogloss:0.492997+0.0118556	test-mlogloss:0.924471+0.0791597
[216]	train-mlogloss:0.490861+0.0118143	test-mlogloss:0.923437+0.079502
[217]	train-mlogloss:0.488726+0.0118153	test-mlogloss:0.922285+0.0797352
[218]	train-mlogloss:0.486642+0.0117922	test-mlogloss:0.921259+0.0799253
[219]	train-mlogloss:0.484547+0.0117849	test-mlogloss:0.920248+0.0801625
[220]	train-mlogloss:0.482445+0.0117451	test-mlogloss:0.919189+0.080346
[221]	train-mlogloss:0.480384+0.011731	test-mlogloss:0.918205+0.0806184
[222]	train-mlogloss:0.478254+0.0117214	test-mlogloss:0.917183+0.0808707
[223]	train-mlogloss:0.476218+0.0116801	test-mlogloss:0.916065+0.0811279
[224]	train-mlogloss:0.474163+0.0116587	test-mlogloss:0.915059+0.0812139
[225]	train-mlogloss:0.472122+0.0116405	test-mlogloss:0.914115+0.0815041
[226]	train-mlogloss:0.470102+0.0116066	test-mlogloss:0.913267+0.0815826
[227]	train-mlogloss:0.468073+0.0116073	test-mlogloss:0.912364+0.0816902
[228]	train-mlogloss:0.466086+0.0115128	test-mlogloss:0.911403+0.0818545
[229]	train-mlogloss:0.464067+0.0114784	test-mlogloss:0.910301+0.0817576
[230]	train-mlogloss:0.462103+0.0114858	test-mlogloss:0.909341+0.0820326
[231]	train-mlogloss:0.460144+0.0114712	test-mlogloss:0.908296+0.0820208
[232]	train-mlogloss:0.458198+0.0114564	test-mlogloss:0.907436+0.0821796
[233]	train-mlogloss:0.456263+0.0114279	test-mlogloss:0.906299+0.0820803
[234]	train-mlogloss:0.454337+0.0114048	test-mlogloss:0.905365+0.0819124
[235]	train-mlogloss:0.452409+0.0113956	test-mlogloss:0.904501+0.0820722
[236]	train-mlogloss:0.450518+0.0113724	test-mlogloss:0.903511+0.0817578
[237]	train-mlogloss:0.448655+0.0113868	test-mlogloss:0.902601+0.0821304
[238]	train-mlogloss:0.446772+0.0113236	test-mlogloss:0.901389+0.0819192
[239]	train-mlogloss:0.44488+0.0112754	test-mlogloss:0.900486+0.0819991
[240]	train-mlogloss:0.443046+0.0112649	test-mlogloss:0.899531+0.0820009
[241]	train-mlogloss:0.44121+0.0112341	test-mlogloss:0.898636+0.0819816
[242]	train-mlogloss:0.439364+0.011221	test-mlogloss:0.897639+0.0821118
[243]	train-mlogloss:0.43754+0.0111984	test-mlogloss:0.896647+0.0820236
[244]	train-mlogloss:0.435696+0.0111557	test-mlogloss:0.895677+0.0821978
[245]	train-mlogloss:0.43391+0.0111204	test-mlogloss:0.894838+0.0824083
[246]	train-mlogloss:0.432072+0.0110975	test-mlogloss:0.893763+0.0821486
[247]	train-mlogloss:0.430332+0.0110754	test-mlogloss:0.892818+0.0823278
[248]	train-mlogloss:0.428517+0.0110531	test-mlogloss:0.891967+0.0824812
[249]	train-mlogloss:0.426711+0.0110743	test-mlogloss:0.891037+0.0826405
[250]	train-mlogloss:0.424972+0.0110604	test-mlogloss:0.889931+0.0826767
[251]	train-mlogloss:0.423178+0.011045	test-mlogloss:0.888913+0.0825378
[252]	train-mlogloss:0.421438+0.0110349	test-mlogloss:0.887999+0.0828047
[253]	train-mlogloss:0.419706+0.0109892	test-mlogloss:0.887066+0.0827028
[254]	train-mlogloss:0.417968+0.0110055	test-mlogloss:0.886153+0.0829624
[255]	train-mlogloss:0.41628+0.0109172	test-mlogloss:0.885416+0.0831482
[256]	train-mlogloss:0.414545+0.0108998	test-mlogloss:0.884345+0.0831236
[257]	train-mlogloss:0.412812+0.0109045	test-mlogloss:0.883606+0.083419
[258]	train-mlogloss:0.411173+0.0109025	test-mlogloss:0.882722+0.0836411
[259]	train-mlogloss:0.409496+0.010871	test-mlogloss:0.881815+0.0838658
[260]	train-mlogloss:0.407842+0.0108807	test-mlogloss:0.881027+0.0839734
[261]	train-mlogloss:0.406195+0.0109338	test-mlogloss:0.88029+0.0843815
[262]	train-mlogloss:0.40459+0.0109341	test-mlogloss:0.879426+0.0845169
[263]	train-mlogloss:0.402951+0.0109133	test-mlogloss:0.878449+0.0843208
[264]	train-mlogloss:0.40129+0.0109281	test-mlogloss:0.877751+0.084674
[265]	train-mlogloss:0.399648+0.0109182	test-mlogloss:0.877054+0.0847602
[266]	train-mlogloss:0.39802+0.0108912	test-mlogloss:0.876077+0.084986
[267]	train-mlogloss:0.396465+0.0109061	test-mlogloss:0.875299+0.0852196
[268]	train-mlogloss:0.394839+0.0108783	test-mlogloss:0.874608+0.0853647
[269]	train-mlogloss:0.393276+0.0108905	test-mlogloss:0.87374+0.0852826
[270]	train-mlogloss:0.391702+0.0108652	test-mlogloss:0.872987+0.0853709
[271]	train-mlogloss:0.390156+0.0108597	test-mlogloss:0.872426+0.0856387
[272]	train-mlogloss:0.388607+0.0108217	test-mlogloss:0.871692+0.0858874
[273]	train-mlogloss:0.387069+0.0108067	test-mlogloss:0.871224+0.086238
[274]	train-mlogloss:0.385547+0.0107567	test-mlogloss:0.870607+0.0863853
[275]	train-mlogloss:0.384028+0.0107657	test-mlogloss:0.869793+0.0864497
[276]	train-mlogloss:0.382486+0.0107426	test-mlogloss:0.869332+0.0867931
[277]	train-mlogloss:0.381009+0.0107024	test-mlogloss:0.868537+0.0869615
[278]	train-mlogloss:0.379531+0.0106929	test-mlogloss:0.867951+0.0871407
[279]	train-mlogloss:0.378033+0.0106897	test-mlogloss:0.867351+0.0873018
[280]	train-mlogloss:0.376568+0.0106556	test-mlogloss:0.866741+0.0874714
[281]	train-mlogloss:0.375101+0.0106416	test-mlogloss:0.866179+0.0877485
[282]	train-mlogloss:0.373668+0.0106282	test-mlogloss:0.865674+0.0879939
[283]	train-mlogloss:0.372246+0.0105684	test-mlogloss:0.864961+0.0880928
[284]	train-mlogloss:0.370796+0.0105131	test-mlogloss:0.864422+0.0882847
[285]	train-mlogloss:0.369403+0.0105106	test-mlogloss:0.863896+0.0883712
[286]	train-mlogloss:0.36799+0.0105173	test-mlogloss:0.863476+0.0888158
[287]	train-mlogloss:0.366636+0.0104686	test-mlogloss:0.863029+0.088982
[288]	train-mlogloss:0.365222+0.010419	test-mlogloss:0.862461+0.0891132
[289]	train-mlogloss:0.363812+0.0104141	test-mlogloss:0.862144+0.0893215
[290]	train-mlogloss:0.362463+0.0103886	test-mlogloss:0.861458+0.0893903
[291]	train-mlogloss:0.361083+0.0103719	test-mlogloss:0.861094+0.0896442
[292]	train-mlogloss:0.359742+0.0103321	test-mlogloss:0.860821+0.0899243
[293]	train-mlogloss:0.358389+0.0103149	test-mlogloss:0.860084+0.0900469
[294]	train-mlogloss:0.357062+0.0103134	test-mlogloss:0.859775+0.090154
[295]	train-mlogloss:0.35576+0.0103015	test-mlogloss:0.859329+0.0905376
[296]	train-mlogloss:0.35445+0.0102909	test-mlogloss:0.858845+0.0906023
[297]	train-mlogloss:0.353123+0.0102949	test-mlogloss:0.858466+0.0910874
[298]	train-mlogloss:0.351817+0.0102904	test-mlogloss:0.858072+0.0911594
[299]	train-mlogloss:0.350534+0.0102602	test-mlogloss:0.857648+0.0912807
[300]	train-mlogloss:0.349271+0.0102596	test-mlogloss:0.8572+0.0915634
[301]	train-mlogloss:0.347997+0.0102518	test-mlogloss:0.856747+0.0916008
[302]	train-mlogloss:0.346715+0.0102651	test-mlogloss:0.856152+0.0916837
[303]	train-mlogloss:0.345454+0.010217	test-mlogloss:0.855929+0.0919349
[304]	train-mlogloss:0.344202+0.0102342	test-mlogloss:0.855421+0.0921514
[305]	train-mlogloss:0.342972+0.0102398	test-mlogloss:0.855008+0.0921329
[306]	train-mlogloss:0.341698+0.0101996	test-mlogloss:0.85442+0.0921767
[307]	train-mlogloss:0.340453+0.0102261	test-mlogloss:0.854108+0.0924771
[308]	train-mlogloss:0.339223+0.0102561	test-mlogloss:0.853643+0.0923388
[309]	train-mlogloss:0.33799+0.0102237	test-mlogloss:0.853105+0.0924106
[310]	train-mlogloss:0.336768+0.0102513	test-mlogloss:0.852771+0.0926793
[311]	train-mlogloss:0.335532+0.0102348	test-mlogloss:0.852468+0.0927356
[312]	train-mlogloss:0.334337+0.0102485	test-mlogloss:0.851892+0.0928323
[313]	train-mlogloss:0.333154+0.0102664	test-mlogloss:0.851533+0.0930744
[314]	train-mlogloss:0.331978+0.0102383	test-mlogloss:0.851136+0.0930546
[315]	train-mlogloss:0.330794+0.0102585	test-mlogloss:0.850624+0.0929538
[316]	train-mlogloss:0.329638+0.0102656	test-mlogloss:0.850262+0.0930296
[317]	train-mlogloss:0.328466+0.010281	test-mlogloss:0.849954+0.0931427
[318]	train-mlogloss:0.327303+0.0102714	test-mlogloss:0.849507+0.0932338
[319]	train-mlogloss:0.3262+0.0102915	test-mlogloss:0.84901+0.0932064
[320]	train-mlogloss:0.325017+0.0102959	test-mlogloss:0.848747+0.0932166
[321]	train-mlogloss:0.323889+0.0102955	test-mlogloss:0.848339+0.0931669
[322]	train-mlogloss:0.322717+0.0103106	test-mlogloss:0.847935+0.0933112
[323]	train-mlogloss:0.321581+0.0103379	test-mlogloss:0.847632+0.0933738
[324]	train-mlogloss:0.320428+0.0103322	test-mlogloss:0.847219+0.0933334
[325]	train-mlogloss:0.319346+0.0103308	test-mlogloss:0.847034+0.0934669
[326]	train-mlogloss:0.318229+0.0103554	test-mlogloss:0.846632+0.0933954
[327]	train-mlogloss:0.317099+0.0103803	test-mlogloss:0.846237+0.0933549
[328]	train-mlogloss:0.315996+0.0103913	test-mlogloss:0.84593+0.0932723
[329]	train-mlogloss:0.314926+0.0104059	test-mlogloss:0.845558+0.0932394
[330]	train-mlogloss:0.313837+0.010422	test-mlogloss:0.845209+0.0933914
[331]	train-mlogloss:0.312779+0.0104265	test-mlogloss:0.844947+0.0933437
[332]	train-mlogloss:0.311726+0.0104359	test-mlogloss:0.844506+0.0933509
[333]	train-mlogloss:0.310693+0.0104148	test-mlogloss:0.844248+0.0934774
[334]	train-mlogloss:0.309638+0.0104199	test-mlogloss:0.84406+0.0936312
[335]	train-mlogloss:0.308631+0.0104558	test-mlogloss:0.843669+0.0937124
[336]	train-mlogloss:0.307578+0.010442	test-mlogloss:0.843242+0.0937085
[337]	train-mlogloss:0.306563+0.0104318	test-mlogloss:0.842969+0.0936845
[338]	train-mlogloss:0.305556+0.0104479	test-mlogloss:0.842558+0.0937162
[339]	train-mlogloss:0.304513+0.0104046	test-mlogloss:0.842217+0.0938593
[340]	train-mlogloss:0.30346+0.0104014	test-mlogloss:0.841952+0.0938249
[341]	train-mlogloss:0.302475+0.0103677	test-mlogloss:0.8414+0.0936874
[342]	train-mlogloss:0.301441+0.010357	test-mlogloss:0.84112+0.0936612
[343]	train-mlogloss:0.300444+0.0103383	test-mlogloss:0.840663+0.0935431
[344]	train-mlogloss:0.299438+0.010351	test-mlogloss:0.84041+0.0937212
[345]	train-mlogloss:0.298411+0.0103155	test-mlogloss:0.839869+0.093684
[346]	train-mlogloss:0.297434+0.0102751	test-mlogloss:0.839638+0.0936624
[347]	train-mlogloss:0.296457+0.0102612	test-mlogloss:0.839247+0.0936711
[348]	train-mlogloss:0.295471+0.0102586	test-mlogloss:0.839004+0.0937747
[349]	train-mlogloss:0.294501+0.010232	test-mlogloss:0.838743+0.0938336
[350]	train-mlogloss:0.293572+0.0101985	test-mlogloss:0.838462+0.0940199
[351]	train-mlogloss:0.292618+0.0101856	test-mlogloss:0.838371+0.0941809
[352]	train-mlogloss:0.291665+0.0101575	test-mlogloss:0.838164+0.0944332
[353]	train-mlogloss:0.290705+0.0101476	test-mlogloss:0.837926+0.0945655
[354]	train-mlogloss:0.289792+0.0100952	test-mlogloss:0.837656+0.0947996
[355]	train-mlogloss:0.288867+0.0100582	test-mlogloss:0.837283+0.0946683
[356]	train-mlogloss:0.287928+0.0100462	test-mlogloss:0.836914+0.0946885
[357]	train-mlogloss:0.286972+0.009997	test-mlogloss:0.83658+0.0947668
[358]	train-mlogloss:0.286043+0.00995692	test-mlogloss:0.836346+0.0948452
[359]	train-mlogloss:0.285108+0.00990858	test-mlogloss:0.836115+0.0950656
[360]	train-mlogloss:0.284203+0.00986821	test-mlogloss:0.83576+0.0950192
[361]	train-mlogloss:0.28327+0.00982638	test-mlogloss:0.835529+0.0951286
[362]	train-mlogloss:0.282389+0.00978035	test-mlogloss:0.835166+0.0951225
[363]	train-mlogloss:0.281484+0.00974923	test-mlogloss:0.834869+0.095242
[364]	train-mlogloss:0.280606+0.00969249	test-mlogloss:0.83478+0.0955053
[365]	train-mlogloss:0.279684+0.00965708	test-mlogloss:0.834496+0.0954541
[366]	train-mlogloss:0.278795+0.00961632	test-mlogloss:0.83417+0.0954208
[367]	train-mlogloss:0.277867+0.00958984	test-mlogloss:0.834034+0.0955889
[368]	train-mlogloss:0.277022+0.0095373	test-mlogloss:0.833747+0.0955545
[369]	train-mlogloss:0.276139+0.00951266	test-mlogloss:0.833708+0.0957895
[370]	train-mlogloss:0.27529+0.00945277	test-mlogloss:0.833459+0.0956807
[371]	train-mlogloss:0.27442+0.00943374	test-mlogloss:0.833359+0.0958147
[372]	train-mlogloss:0.273567+0.00940054	test-mlogloss:0.833009+0.0957844
[373]	train-mlogloss:0.272721+0.00933923	test-mlogloss:0.832855+0.0959818
[374]	train-mlogloss:0.271847+0.00932031	test-mlogloss:0.832773+0.0960639
[375]	train-mlogloss:0.271022+0.00930881	test-mlogloss:0.83255+0.0961272
[376]	train-mlogloss:0.27021+0.0092528	test-mlogloss:0.832321+0.0961546
[377]	train-mlogloss:0.269392+0.0092057	test-mlogloss:0.832079+0.0962395
[378]	train-mlogloss:0.268554+0.0092163	test-mlogloss:0.831909+0.0964932
[379]	train-mlogloss:0.267772+0.00917819	test-mlogloss:0.831606+0.0965967
[380]	train-mlogloss:0.266966+0.00916713	test-mlogloss:0.831353+0.0966345
[381]	train-mlogloss:0.266171+0.00916069	test-mlogloss:0.831267+0.0968954
[382]	train-mlogloss:0.265361+0.00913595	test-mlogloss:0.831055+0.0969737
[383]	train-mlogloss:0.26458+0.00910745	test-mlogloss:0.830768+0.0970071
[384]	train-mlogloss:0.263768+0.00908857	test-mlogloss:0.830535+0.097096
[385]	train-mlogloss:0.263004+0.00906573	test-mlogloss:0.830256+0.0972844
[386]	train-mlogloss:0.26224+0.00903919	test-mlogloss:0.830112+0.0974753
[387]	train-mlogloss:0.261471+0.00901837	test-mlogloss:0.829952+0.0977965
[388]	train-mlogloss:0.260674+0.00903015	test-mlogloss:0.829675+0.0979617
[389]	train-mlogloss:0.259899+0.00905169	test-mlogloss:0.829463+0.0979849
[390]	train-mlogloss:0.259125+0.00902888	test-mlogloss:0.829239+0.0981049
[391]	train-mlogloss:0.258372+0.00900631	test-mlogloss:0.829104+0.0982335
[392]	train-mlogloss:0.257653+0.00896265	test-mlogloss:0.828859+0.098414
[393]	train-mlogloss:0.2569+0.00893776	test-mlogloss:0.828728+0.0986383
[394]	train-mlogloss:0.256149+0.00892523	test-mlogloss:0.828512+0.0987465
[395]	train-mlogloss:0.255403+0.00888797	test-mlogloss:0.828251+0.0987251
[396]	train-mlogloss:0.254675+0.00886559	test-mlogloss:0.828126+0.0989587
[397]	train-mlogloss:0.253936+0.00884839	test-mlogloss:0.827923+0.0990841
[398]	train-mlogloss:0.253181+0.00883065	test-mlogloss:0.827567+0.0991445
[399]	train-mlogloss:0.252449+0.00882688	test-mlogloss:0.827324+0.0992455
[400]	train-mlogloss:0.251719+0.00880264	test-mlogloss:0.827127+0.0993875
[401]	train-mlogloss:0.250985+0.00878745	test-mlogloss:0.826918+0.0994805
[402]	train-mlogloss:0.250265+0.00876142	test-mlogloss:0.826557+0.0995216
[403]	train-mlogloss:0.249546+0.00875954	test-mlogloss:0.826295+0.0996908
[404]	train-mlogloss:0.24886+0.00872976	test-mlogloss:0.826141+0.0996713
[405]	train-mlogloss:0.248141+0.00871482	test-mlogloss:0.825827+0.0999017
[406]	train-mlogloss:0.247442+0.00871155	test-mlogloss:0.825701+0.100026
[407]	train-mlogloss:0.246743+0.00867762	test-mlogloss:0.825505+0.10026
[408]	train-mlogloss:0.246063+0.00863646	test-mlogloss:0.825275+0.10036
[409]	train-mlogloss:0.245356+0.00862626	test-mlogloss:0.825116+0.100441
[410]	train-mlogloss:0.24468+0.00859084	test-mlogloss:0.824898+0.100454
[411]	train-mlogloss:0.244006+0.00856161	test-mlogloss:0.824729+0.100624
[412]	train-mlogloss:0.243317+0.00855935	test-mlogloss:0.824545+0.100816
[413]	train-mlogloss:0.242652+0.00854116	test-mlogloss:0.824385+0.100944
[414]	train-mlogloss:0.241954+0.00852893	test-mlogloss:0.824149+0.101005
[415]	train-mlogloss:0.241286+0.00853135	test-mlogloss:0.823945+0.1011
[416]	train-mlogloss:0.240634+0.0085336	test-mlogloss:0.823657+0.101064
[417]	train-mlogloss:0.239988+0.00851831	test-mlogloss:0.823427+0.101065
[418]	train-mlogloss:0.239325+0.00852312	test-mlogloss:0.823225+0.101197
[419]	train-mlogloss:0.238673+0.00850475	test-mlogloss:0.823057+0.10115
[420]	train-mlogloss:0.238021+0.00849936	test-mlogloss:0.822825+0.101209
[421]	train-mlogloss:0.237394+0.00850222	test-mlogloss:0.822554+0.101225
[422]	train-mlogloss:0.236751+0.00850148	test-mlogloss:0.822296+0.101187
[423]	train-mlogloss:0.236108+0.00850549	test-mlogloss:0.8221+0.101143
[424]	train-mlogloss:0.235479+0.00849383	test-mlogloss:0.821922+0.101123
[425]	train-mlogloss:0.234856+0.00850316	test-mlogloss:0.821693+0.101232
[426]	train-mlogloss:0.234226+0.00848994	test-mlogloss:0.821467+0.101287
[427]	train-mlogloss:0.233609+0.00848259	test-mlogloss:0.821248+0.101259
[428]	train-mlogloss:0.233017+0.00848007	test-mlogloss:0.821098+0.101305
[429]	train-mlogloss:0.232419+0.00848752	test-mlogloss:0.820951+0.101359
[430]	train-mlogloss:0.231797+0.00847101	test-mlogloss:0.82077+0.101357
[431]	train-mlogloss:0.23121+0.00845988	test-mlogloss:0.82067+0.101477
[432]	train-mlogloss:0.230608+0.00844896	test-mlogloss:0.82046+0.101554
[433]	train-mlogloss:0.23002+0.00844974	test-mlogloss:0.820295+0.101602
[434]	train-mlogloss:0.22941+0.00844564	test-mlogloss:0.820142+0.101595
[435]	train-mlogloss:0.228815+0.00842636	test-mlogloss:0.820088+0.101661
[436]	train-mlogloss:0.228243+0.00843755	test-mlogloss:0.819871+0.101604
[437]	train-mlogloss:0.22768+0.00841891	test-mlogloss:0.819752+0.101612
[438]	train-mlogloss:0.227086+0.00840672	test-mlogloss:0.81962+0.101585
[439]	train-mlogloss:0.226531+0.00839084	test-mlogloss:0.819481+0.10161
[440]	train-mlogloss:0.225961+0.00839324	test-mlogloss:0.81931+0.101585
[441]	train-mlogloss:0.225399+0.00836307	test-mlogloss:0.819255+0.101652
[442]	train-mlogloss:0.224842+0.00835675	test-mlogloss:0.819045+0.101665
[443]	train-mlogloss:0.224287+0.00835372	test-mlogloss:0.818846+0.101636
[444]	train-mlogloss:0.223721+0.0083454	test-mlogloss:0.818819+0.101687
[445]	train-mlogloss:0.223163+0.00835034	test-mlogloss:0.818545+0.101688
[446]	train-mlogloss:0.222599+0.00833782	test-mlogloss:0.818385+0.10169
[447]	train-mlogloss:0.222052+0.0083155	test-mlogloss:0.818263+0.101812
[448]	train-mlogloss:0.221494+0.00830234	test-mlogloss:0.818157+0.101777
[449]	train-mlogloss:0.220954+0.00829604	test-mlogloss:0.817949+0.10186
[450]	train-mlogloss:0.220421+0.00830471	test-mlogloss:0.817987+0.101875
[451]	train-mlogloss:0.219878+0.00829422	test-mlogloss:0.817767+0.101917
[452]	train-mlogloss:0.219346+0.00827215	test-mlogloss:0.817691+0.102
[453]	train-mlogloss:0.218805+0.00824406	test-mlogloss:0.817542+0.102077
[454]	train-mlogloss:0.218271+0.00822274	test-mlogloss:0.817501+0.102221
[455]	train-mlogloss:0.217758+0.00822085	test-mlogloss:0.817304+0.10229
[456]	train-mlogloss:0.217233+0.00820284	test-mlogloss:0.817243+0.102412
[457]	train-mlogloss:0.216709+0.00819505	test-mlogloss:0.817241+0.10249
[458]	train-mlogloss:0.216195+0.00818115	test-mlogloss:0.817124+0.102687
[459]	train-mlogloss:0.215676+0.00818238	test-mlogloss:0.817151+0.10283
[460]	train-mlogloss:0.215159+0.00815736	test-mlogloss:0.816922+0.102849
[461]	train-mlogloss:0.214648+0.00815244	test-mlogloss:0.816851+0.10302
[462]	train-mlogloss:0.21415+0.00814807	test-mlogloss:0.816738+0.103131
[463]	train-mlogloss:0.213656+0.00813379	test-mlogloss:0.816647+0.103208
[464]	train-mlogloss:0.213149+0.00813601	test-mlogloss:0.816607+0.103362
[465]	train-mlogloss:0.212653+0.00812213	test-mlogloss:0.816559+0.103362
[466]	train-mlogloss:0.212152+0.00812596	test-mlogloss:0.816455+0.103528
[467]	train-mlogloss:0.211654+0.00810159	test-mlogloss:0.816293+0.103597
[468]	train-mlogloss:0.211152+0.0080982	test-mlogloss:0.816217+0.103794
[469]	train-mlogloss:0.210663+0.00807619	test-mlogloss:0.816097+0.103923
[470]	train-mlogloss:0.210173+0.00807621	test-mlogloss:0.815958+0.103921
[471]	train-mlogloss:0.209697+0.0080683	test-mlogloss:0.815947+0.104119
[472]	train-mlogloss:0.209215+0.00805608	test-mlogloss:0.815917+0.104246
[473]	train-mlogloss:0.208746+0.00805052	test-mlogloss:0.815691+0.104303
[474]	train-mlogloss:0.208258+0.00803391	test-mlogloss:0.815598+0.104413
[475]	train-mlogloss:0.207788+0.00803296	test-mlogloss:0.815605+0.104644
[476]	train-mlogloss:0.207313+0.00801664	test-mlogloss:0.815444+0.10463
[477]	train-mlogloss:0.206838+0.00800133	test-mlogloss:0.815421+0.104915
[478]	train-mlogloss:0.206386+0.0080044	test-mlogloss:0.815356+0.105039
[479]	train-mlogloss:0.205913+0.00798212	test-mlogloss:0.815075+0.105129
[480]	train-mlogloss:0.205429+0.00796783	test-mlogloss:0.815219+0.105225
[481]	train-mlogloss:0.204975+0.0079562	test-mlogloss:0.815021+0.10524
[482]	train-mlogloss:0.204496+0.00794569	test-mlogloss:0.815131+0.105379
[483]	train-mlogloss:0.204047+0.0079328	test-mlogloss:0.81503+0.1055
[484]	train-mlogloss:0.203579+0.00793567	test-mlogloss:0.814916+0.10568
[485]	train-mlogloss:0.20312+0.0079212	test-mlogloss:0.814865+0.105851
[486]	train-mlogloss:0.202674+0.00791959	test-mlogloss:0.814896+0.106082
[487]	train-mlogloss:0.202227+0.00791495	test-mlogloss:0.814794+0.106093
[488]	train-mlogloss:0.201766+0.00791666	test-mlogloss:0.814762+0.106201
[489]	train-mlogloss:0.201322+0.00790668	test-mlogloss:0.814856+0.106226
[490]	train-mlogloss:0.200871+0.00791049	test-mlogloss:0.814734+0.106286
[491]	train-mlogloss:0.200436+0.00789243	test-mlogloss:0.814672+0.106495
[492]	train-mlogloss:0.19999+0.00789119	test-mlogloss:0.814629+0.106467
[493]	train-mlogloss:0.199561+0.00790008	test-mlogloss:0.81458+0.106658
[494]	train-mlogloss:0.199127+0.0078806	test-mlogloss:0.814593+0.106706
[495]	train-mlogloss:0.198695+0.00788245	test-mlogloss:0.81447+0.106876
[496]	train-mlogloss:0.198256+0.00787168	test-mlogloss:0.814487+0.106965
[497]	train-mlogloss:0.197839+0.00788354	test-mlogloss:0.814473+0.107058
[498]	train-mlogloss:0.197416+0.00786242	test-mlogloss:0.814398+0.107171
[499]	train-mlogloss:0.196989+0.00786899	test-mlogloss:0.814291+0.107137
[500]	train-mlogloss:0.196583+0.00786642	test-mlogloss:0.814138+0.107346
[501]	train-mlogloss:0.196162+0.00786809	test-mlogloss:0.814154+0.107316
[502]	train-mlogloss:0.195744+0.00786764	test-mlogloss:0.814256+0.107416
[503]	train-mlogloss:0.195327+0.00784653	test-mlogloss:0.813974+0.107543
[504]	train-mlogloss:0.194918+0.0078537	test-mlogloss:0.814031+0.10755
[505]	train-mlogloss:0.194512+0.00784862	test-mlogloss:0.813991+0.107639
[506]	train-mlogloss:0.1941+0.00784257	test-mlogloss:0.813761+0.107723
[507]	train-mlogloss:0.193678+0.00783576	test-mlogloss:0.813783+0.107831
[508]	train-mlogloss:0.193272+0.00781589	test-mlogloss:0.813747+0.107789
[509]	train-mlogloss:0.192863+0.00781016	test-mlogloss:0.813671+0.108004
[510]	train-mlogloss:0.192481+0.00781739	test-mlogloss:0.813535+0.108
[511]	train-mlogloss:0.192065+0.00780919	test-mlogloss:0.813561+0.108084
[512]	train-mlogloss:0.191685+0.00780374	test-mlogloss:0.813528+0.10824
[513]	train-mlogloss:0.191262+0.00779432	test-mlogloss:0.813301+0.108287
[514]	train-mlogloss:0.19086+0.0077854	test-mlogloss:0.813393+0.108445
[515]	train-mlogloss:0.190473+0.00779841	test-mlogloss:0.813292+0.108533
[516]	train-mlogloss:0.190085+0.0077781	test-mlogloss:0.813123+0.108592
[517]	train-mlogloss:0.189702+0.00778337	test-mlogloss:0.813136+0.108572
[518]	train-mlogloss:0.189295+0.00775943	test-mlogloss:0.813229+0.108729
[519]	train-mlogloss:0.188928+0.00775698	test-mlogloss:0.813209+0.108815
[520]	train-mlogloss:0.188535+0.0077569	test-mlogloss:0.813129+0.108991
[521]	train-mlogloss:0.188154+0.00774499	test-mlogloss:0.813145+0.108963
[522]	train-mlogloss:0.187772+0.00773548	test-mlogloss:0.813264+0.109079
[523]	train-mlogloss:0.187396+0.00771658	test-mlogloss:0.813204+0.109301
[524]	train-mlogloss:0.18701+0.00771884	test-mlogloss:0.813063+0.10924
[525]	train-mlogloss:0.186644+0.00770573	test-mlogloss:0.812962+0.109346
[526]	train-mlogloss:0.18626+0.00769367	test-mlogloss:0.813002+0.109352
[527]	train-mlogloss:0.185883+0.0076993	test-mlogloss:0.812747+0.109391
[528]	train-mlogloss:0.185501+0.00768259	test-mlogloss:0.812858+0.109458
[529]	train-mlogloss:0.185125+0.0076723	test-mlogloss:0.812669+0.109591
[530]	train-mlogloss:0.184761+0.00767954	test-mlogloss:0.812657+0.109654
[531]	train-mlogloss:0.184394+0.00765586	test-mlogloss:0.812618+0.109814
[532]	train-mlogloss:0.184023+0.00765645	test-mlogloss:0.812462+0.109948
[533]	train-mlogloss:0.183681+0.00765366	test-mlogloss:0.812349+0.109921
[534]	train-mlogloss:0.183291+0.0076447	test-mlogloss:0.812268+0.110109
[535]	train-mlogloss:0.182937+0.00762987	test-mlogloss:0.812186+0.110168
[536]	train-mlogloss:0.182578+0.00765035	test-mlogloss:0.812088+0.110284
[537]	train-mlogloss:0.18223+0.00764223	test-mlogloss:0.812081+0.110289
[538]	train-mlogloss:0.181865+0.00764344	test-mlogloss:0.812037+0.110312
[539]	train-mlogloss:0.181493+0.00761625	test-mlogloss:0.811857+0.110507
[540]	train-mlogloss:0.181134+0.00762619	test-mlogloss:0.811659+0.110451
[541]	train-mlogloss:0.180795+0.0076053	test-mlogloss:0.811698+0.110562
[542]	train-mlogloss:0.180445+0.00760523	test-mlogloss:0.81167+0.110464
[543]	train-mlogloss:0.180088+0.00759271	test-mlogloss:0.81158+0.110456
[544]	train-mlogloss:0.179737+0.00758236	test-mlogloss:0.811456+0.110539
[545]	train-mlogloss:0.179382+0.00757788	test-mlogloss:0.811259+0.110474
[546]	train-mlogloss:0.179041+0.00758111	test-mlogloss:0.81133+0.110641
[547]	train-mlogloss:0.1787+0.00756714	test-mlogloss:0.811075+0.110616
[548]	train-mlogloss:0.178355+0.00755176	test-mlogloss:0.81096+0.110597
[549]	train-mlogloss:0.178015+0.00756226	test-mlogloss:0.810899+0.110611
[550]	train-mlogloss:0.177672+0.00754158	test-mlogloss:0.810687+0.110805
[551]	train-mlogloss:0.177336+0.0075378	test-mlogloss:0.810567+0.110709
[552]	train-mlogloss:0.177001+0.00753277	test-mlogloss:0.810458+0.11076
[553]	train-mlogloss:0.176667+0.0075177	test-mlogloss:0.810391+0.110785
[554]	train-mlogloss:0.176331+0.00751488	test-mlogloss:0.810231+0.110709
[555]	train-mlogloss:0.176001+0.00750896	test-mlogloss:0.810148+0.110748
[556]	train-mlogloss:0.175687+0.0074958	test-mlogloss:0.810091+0.110848
[557]	train-mlogloss:0.175358+0.00747475	test-mlogloss:0.810043+0.110898
[558]	train-mlogloss:0.175035+0.00746734	test-mlogloss:0.80997+0.110943
[559]	train-mlogloss:0.17472+0.00745186	test-mlogloss:0.809815+0.110903
[560]	train-mlogloss:0.174413+0.00744899	test-mlogloss:0.809805+0.110954
[561]	train-mlogloss:0.174104+0.0074437	test-mlogloss:0.809647+0.110937
[562]	train-mlogloss:0.1738+0.00740394	test-mlogloss:0.809475+0.110941
[563]	train-mlogloss:0.173495+0.00739383	test-mlogloss:0.809662+0.110987
[564]	train-mlogloss:0.173197+0.00737753	test-mlogloss:0.809492+0.110895
[565]	train-mlogloss:0.172896+0.00738747	test-mlogloss:0.809599+0.110942
[566]	train-mlogloss:0.172589+0.00737939	test-mlogloss:0.809345+0.110833
[567]	train-mlogloss:0.172288+0.0073612	test-mlogloss:0.809366+0.110756
[568]	train-mlogloss:0.171993+0.00735534	test-mlogloss:0.809362+0.110818
[569]	train-mlogloss:0.171702+0.00734652	test-mlogloss:0.809227+0.110789
[570]	train-mlogloss:0.171402+0.00735649	test-mlogloss:0.809206+0.110717
[571]	train-mlogloss:0.171109+0.00732407	test-mlogloss:0.809089+0.110638
[572]	train-mlogloss:0.170829+0.00734341	test-mlogloss:0.809156+0.110539
[573]	train-mlogloss:0.17053+0.00732869	test-mlogloss:0.808995+0.110486
[574]	train-mlogloss:0.170252+0.00733476	test-mlogloss:0.808841+0.110406
[575]	train-mlogloss:0.169976+0.00732143	test-mlogloss:0.808861+0.110445
[576]	train-mlogloss:0.169684+0.00732213	test-mlogloss:0.808755+0.110315
[577]	train-mlogloss:0.169408+0.00732306	test-mlogloss:0.808758+0.110386
[578]	train-mlogloss:0.169118+0.00732531	test-mlogloss:0.808678+0.110468
[579]	train-mlogloss:0.168833+0.00733148	test-mlogloss:0.808553+0.110325
[580]	train-mlogloss:0.168562+0.00732558	test-mlogloss:0.808436+0.110243
[581]	train-mlogloss:0.168285+0.00730503	test-mlogloss:0.808375+0.110189
[582]	train-mlogloss:0.168022+0.00731651	test-mlogloss:0.808343+0.110127
[583]	train-mlogloss:0.167762+0.00731728	test-mlogloss:0.808226+0.109932
[584]	train-mlogloss:0.167491+0.00730766	test-mlogloss:0.808176+0.109971
[585]	train-mlogloss:0.167213+0.00730051	test-mlogloss:0.808275+0.109898
[586]	train-mlogloss:0.166949+0.00732016	test-mlogloss:0.808151+0.109795
[587]	train-mlogloss:0.166676+0.00732902	test-mlogloss:0.808217+0.109827
[588]	train-mlogloss:0.166399+0.00732744	test-mlogloss:0.808168+0.109743
[589]	train-mlogloss:0.166153+0.00731943	test-mlogloss:0.808053+0.109639
[590]	train-mlogloss:0.165881+0.00731448	test-mlogloss:0.807898+0.109657
[591]	train-mlogloss:0.165614+0.00730805	test-mlogloss:0.807933+0.10967
[592]	train-mlogloss:0.165357+0.00731106	test-mlogloss:0.807734+0.109586
[593]	train-mlogloss:0.165082+0.0073182	test-mlogloss:0.807837+0.109619
[594]	train-mlogloss:0.164828+0.00730358	test-mlogloss:0.807729+0.109566
[595]	train-mlogloss:0.164566+0.00731078	test-mlogloss:0.807566+0.109491
[596]	train-mlogloss:0.164298+0.0073097	test-mlogloss:0.807591+0.109599
[597]	train-mlogloss:0.164053+0.00732062	test-mlogloss:0.807656+0.109557
[598]	train-mlogloss:0.16378+0.00729326	test-mlogloss:0.807615+0.109563
[599]	train-mlogloss:0.163528+0.00729377	test-mlogloss:0.807509+0.10944
[600]	train-mlogloss:0.163257+0.00728633	test-mlogloss:0.807526+0.109535
[601]	train-mlogloss:0.163011+0.00729774	test-mlogloss:0.807396+0.109565
[602]	train-mlogloss:0.162774+0.00728463	test-mlogloss:0.807365+0.109499
[603]	train-mlogloss:0.162518+0.00728683	test-mlogloss:0.807457+0.109587
[604]	train-mlogloss:0.162254+0.00727695	test-mlogloss:0.807444+0.10959
[605]	train-mlogloss:0.161997+0.00727452	test-mlogloss:0.807499+0.109676
[606]	train-mlogloss:0.16175+0.00725793	test-mlogloss:0.807408+0.10965
[607]	train-mlogloss:0.16151+0.0072683	test-mlogloss:0.807396+0.109706
[608]	train-mlogloss:0.161252+0.00725985	test-mlogloss:0.807528+0.109706
[609]	train-mlogloss:0.161016+0.00725315	test-mlogloss:0.807399+0.109653
[610]	train-mlogloss:0.160776+0.00723767	test-mlogloss:0.807359+0.109669
[611]	train-mlogloss:0.160515+0.00722977	test-mlogloss:0.807518+0.109849
[612]	train-mlogloss:0.160287+0.00723162	test-mlogloss:0.807277+0.109722
[613]	train-mlogloss:0.160035+0.00720729	test-mlogloss:0.807424+0.109802
[614]	train-mlogloss:0.159797+0.00718111	test-mlogloss:0.807345+0.109777
[615]	train-mlogloss:0.15956+0.00716374	test-mlogloss:0.807362+0.109829
[616]	train-mlogloss:0.159318+0.00716678	test-mlogloss:0.807387+0.109855
[617]	train-mlogloss:0.159086+0.00716153	test-mlogloss:0.807278+0.109883
[618]	train-mlogloss:0.15885+0.00714119	test-mlogloss:0.807264+0.109911
[619]	train-mlogloss:0.158615+0.0071145	test-mlogloss:0.80736+0.109941
[620]	train-mlogloss:0.158385+0.0070948	test-mlogloss:0.80731+0.109944
[621]	train-mlogloss:0.158165+0.00709045	test-mlogloss:0.807287+0.109999
[622]	train-mlogloss:0.157926+0.00710096	test-mlogloss:0.807464+0.110004
[623]	train-mlogloss:0.157692+0.00707236	test-mlogloss:0.807447+0.110026
[624]	train-mlogloss:0.157463+0.00706949	test-mlogloss:0.80741+0.11001
[625]	train-mlogloss:0.157231+0.00705433	test-mlogloss:0.807448+0.110034
[626]	train-mlogloss:0.157014+0.00705393	test-mlogloss:0.807534+0.110205
[627]	train-mlogloss:0.156777+0.00703824	test-mlogloss:0.80734+0.110025
[628]	train-mlogloss:0.156555+0.00702847	test-mlogloss:0.807546+0.110045
[629]	train-mlogloss:0.156347+0.00700952	test-mlogloss:0.807372+0.109955
[630]	train-mlogloss:0.156126+0.00700587	test-mlogloss:0.807382+0.110021
[631]	train-mlogloss:0.155892+0.00698685	test-mlogloss:0.807373+0.109995
[632]	train-mlogloss:0.155687+0.00698323	test-mlogloss:0.807431+0.109963
[633]	train-mlogloss:0.155462+0.00695624	test-mlogloss:0.807357+0.109997
[634]	train-mlogloss:0.155256+0.00695039	test-mlogloss:0.807494+0.110029
[635]	train-mlogloss:0.15503+0.0069359	test-mlogloss:0.807446+0.109992
[636]	train-mlogloss:0.154829+0.00693142	test-mlogloss:0.807437+0.110094
[637]	train-mlogloss:0.154616+0.00692665	test-mlogloss:0.807485+0.110157
[638]	train-mlogloss:0.154394+0.00691239	test-mlogloss:0.807528+0.110185
[639]	train-mlogloss:0.154198+0.00689513	test-mlogloss:0.807445+0.1101
[640]	train-mlogloss:0.153987+0.00688283	test-mlogloss:0.807345+0.109984
[641]	train-mlogloss:0.153777+0.00687623	test-mlogloss:0.807382+0.110093
[642]	train-mlogloss:0.153577+0.00686395	test-mlogloss:0.807308+0.109908
[643]	train-mlogloss:0.15336+0.00685031	test-mlogloss:0.807343+0.110002
[644]	train-mlogloss:0.153162+0.00684029	test-mlogloss:0.807353+0.11001
[645]	train-mlogloss:0.152936+0.00680659	test-mlogloss:0.807502+0.110003
[646]	train-mlogloss:0.15273+0.00679375	test-mlogloss:0.807583+0.110064
[647]	train-mlogloss:0.152518+0.00677898	test-mlogloss:0.80756+0.109979
[648]	train-mlogloss:0.152296+0.00675871	test-mlogloss:0.807437+0.109979
[649]	train-mlogloss:0.152095+0.00674357	test-mlogloss:0.807563+0.110007
[650]	train-mlogloss:0.151886+0.00672107	test-mlogloss:0.807494+0.10993
[651]	train-mlogloss:0.151691+0.00671392	test-mlogloss:0.807371+0.109954
[652]	train-mlogloss:0.151489+0.00670219	test-mlogloss:0.80739+0.110022
[653]	train-mlogloss:0.151276+0.00668243	test-mlogloss:0.807401+0.10999
[654]	train-mlogloss:0.151072+0.00667321	test-mlogloss:0.807473+0.110073
[655]	train-mlogloss:0.150867+0.00665993	test-mlogloss:0.807637+0.110156
[656]	train-mlogloss:0.150663+0.00664577	test-mlogloss:0.807676+0.11004
[657]	train-mlogloss:0.15047+0.00664254	test-mlogloss:0.807714+0.110104
[658]	train-mlogloss:0.150277+0.00663045	test-mlogloss:0.807719+0.110126
[659]	train-mlogloss:0.150076+0.00661566	test-mlogloss:0.807869+0.110192
[660]	train-mlogloss:0.149889+0.00660544	test-mlogloss:0.807832+0.110271
[661]	train-mlogloss:0.149707+0.00660679	test-mlogloss:0.807831+0.110248
[662]	train-mlogloss:0.149501+0.00659098	test-mlogloss:0.807926+0.110364
[663]	train-mlogloss:0.149299+0.00657791	test-mlogloss:0.807901+0.110417
[664]	train-mlogloss:0.149108+0.00657619	test-mlogloss:0.808037+0.110446
[665]	train-mlogloss:0.148908+0.00655913	test-mlogloss:0.808105+0.110487
[666]	train-mlogloss:0.148717+0.0065459	test-mlogloss:0.808128+0.110534
[667]	train-mlogloss:0.148517+0.0065311	test-mlogloss:0.808104+0.110568

Model Report
Accuracy (Train): 0.9942
n_estimators : 619

Predicting

In [36]:
xgb = XGBClassifier(learning_rate=0.01, n_estimators=n_estimators, objective=objective, num_class=num_class, seed=7, 
                    max_depth=max_depth, min_child_weight=min_child_weight, gamma=gamma, colsample_bytree=colsample_bytree, 
                    subsample=subsample, reg_alpha=reg_alpha)
In [37]:
final_score = cross_val_score(xgb, x_train_best, y_train, cv=kfold, scoring='accuracy').mean()
print("Final train accuracy : {} ".format(final_score))
Final train accuracy : 0.7247536945812807 
In [38]:
xgb.fit(x_train_best, y_train)
y_pred = xgb.predict(x_test_best)
score = accuracy_score(y_test, y_pred)
print("Final test accuracy : {} ".format(score))
Final test accuracy : 0.7441860465116279 
In [39]:
xgb2 = XGBClassifier()
xgb2.fit(x_train_best, y_train)
y_pred = xgb.predict(x_test_best)
score = accuracy_score(y_test, y_pred)
print("Test accuracy without tuning : {} ".format(score))
Test accuracy without tuning : 0.7441860465116279 

Tuning again...

In [40]:
param = {'max_depth':range(0,15,1),
         'min_child_weight':range(0,15,2)}
gsrch = GridSearchCV(estimator=XGBClassifier(learning_rate=0.01, n_estimators=n_estimators, objective=objective, 
                                             num_class=num_class, seed=7, gamma=gamma, colsample_bytree=colsample_bytree, 
                    subsample=subsample), param_grid=param, scoring='accuracy',n_jobs=-1, cv=kfold)
gsrch.fit(x_train_best, y_train)
max_depth = gsrch.best_params_['max_depth']
min_child_weight = gsrch.best_params_['min_child_weight']
gsrch.best_params_, gsrch.best_score_
Out[40]:
({'max_depth': 8, 'min_child_weight': 2}, 0.7653940886699507)
In [41]:
param = {'gamma':[i/10.0 for i in range(0,101)]}
gsrch = GridSearchCV(estimator=XGBClassifier(learning_rate=0.01, n_estimators=n_estimators, objective=objective, 
                                             num_class=num_class, seed=7, max_depth=max_depth, 
                                             min_child_weight=min_child_weight, colsample_bytree=colsample_bytree,
                                             subsample=subsample), param_grid=param, scoring='accuracy',
                     n_jobs=-1, cv=kfold)
gsrch.fit(x_train_best, y_train)
gamma = gsrch.best_params_['gamma']
gsrch.best_params_, gsrch.best_score_
Out[41]:
({'gamma': 0.1}, 0.7653940886699507)
In [42]:
param = {'subsample':[i/10.0 for i in range(6,11)],
        'colsample_bytree':[i/10.0 for i in range(6,11)]}
gsrch = GridSearchCV(estimator=XGBClassifier(learning_rate=0.01, n_estimators=n_estimators, objective=objective, 
                                             num_class=num_class, seed=7, max_depth=max_depth, 
                                             min_child_weight=min_child_weight, gamma=gamma), param_grid=param, 
                     scoring='accuracy', n_jobs=-1, cv=kfold)
gsrch.fit(x_train_best, y_train)
colsample_bytree = gsrch.best_params_['colsample_bytree']
subsample = gsrch.best_params_['subsample']
gsrch.best_params_, gsrch.best_score_
Out[42]:
({'colsample_bytree': 0.9, 'subsample': 0.9}, 0.7658045977011495)
In [43]:
param = {'reg_alpha':[1e-5, 1e-2, 0.1, 1, 100, 0, 0.001, 0.005, 0.01, 0.05]}
gsrch = GridSearchCV(estimator=XGBClassifier(learning_rate=0.01, n_estimators=n_estimators, objective=objective, 
                                             num_class=num_class, seed=7, max_depth=max_depth, 
                                             min_child_weight=min_child_weight, gamma=gamma, colsample_bytree=colsample_bytree,
                                             subsample=subsample), param_grid=param, 
                     scoring='accuracy', n_jobs=-1, cv=kfold)
gsrch.fit(x_train_best, y_train)
reg_alpha = gsrch.best_params_['reg_alpha']
gsrch.best_params_, gsrch.best_score_
Out[43]:
({'reg_alpha': 1e-05}, 0.7658045977011495)
In [44]:
xgb = XGBClassifier(learning_rate=0.01, n_estimators=100000, objective=objective, num_class=num_class,seed=7, 
                    max_depth=max_depth, min_child_weight=min_child_weight, gamma=gamma, colsample_bytree=colsample_bytree, 
                    subsample=subsample, reg_alpha=reg_alpha)
n_estimators = modelfit(xgb)
[0]	train-mlogloss:1.7765+0.000332796	test-mlogloss:1.78007+0.00116252
[1]	train-mlogloss:1.76162+0.00078474	test-mlogloss:1.7689+0.00335053
[2]	train-mlogloss:1.74705+0.00115485	test-mlogloss:1.75751+0.00458878
[3]	train-mlogloss:1.73223+0.00125853	test-mlogloss:1.7465+0.00463757
[4]	train-mlogloss:1.71781+0.0014414	test-mlogloss:1.73513+0.0055258
[5]	train-mlogloss:1.70336+0.00190767	test-mlogloss:1.72475+0.0064864
[6]	train-mlogloss:1.68959+0.0024641	test-mlogloss:1.71435+0.00725387
[7]	train-mlogloss:1.67585+0.00288404	test-mlogloss:1.70425+0.00826407
[8]	train-mlogloss:1.66225+0.0031634	test-mlogloss:1.69348+0.00948881
[9]	train-mlogloss:1.64922+0.00353673	test-mlogloss:1.68318+0.0104594
[10]	train-mlogloss:1.63578+0.00390428	test-mlogloss:1.67329+0.011181
[11]	train-mlogloss:1.62263+0.00376896	test-mlogloss:1.66331+0.0119613
[12]	train-mlogloss:1.60976+0.00410091	test-mlogloss:1.65424+0.0127538
[13]	train-mlogloss:1.59723+0.00415334	test-mlogloss:1.64497+0.0140105
[14]	train-mlogloss:1.58474+0.00437675	test-mlogloss:1.6359+0.0148801
[15]	train-mlogloss:1.57255+0.00451776	test-mlogloss:1.62683+0.0151041
[16]	train-mlogloss:1.56058+0.00463148	test-mlogloss:1.61789+0.016194
[17]	train-mlogloss:1.54864+0.00509114	test-mlogloss:1.6095+0.0175656
[18]	train-mlogloss:1.53678+0.00532851	test-mlogloss:1.60106+0.0184551
[19]	train-mlogloss:1.52516+0.00517769	test-mlogloss:1.59288+0.0191173
[20]	train-mlogloss:1.51367+0.00528623	test-mlogloss:1.58387+0.0193823
[21]	train-mlogloss:1.50251+0.00559526	test-mlogloss:1.57613+0.0209097
[22]	train-mlogloss:1.49125+0.00578861	test-mlogloss:1.5683+0.021504
[23]	train-mlogloss:1.48024+0.00618249	test-mlogloss:1.56082+0.0224487
[24]	train-mlogloss:1.46927+0.00643601	test-mlogloss:1.55325+0.0232443
[25]	train-mlogloss:1.45843+0.00661094	test-mlogloss:1.54568+0.0238291
[26]	train-mlogloss:1.44758+0.00675771	test-mlogloss:1.53795+0.0245217
[27]	train-mlogloss:1.4369+0.00675107	test-mlogloss:1.53055+0.0246811
[28]	train-mlogloss:1.42658+0.00706847	test-mlogloss:1.52365+0.0253392
[29]	train-mlogloss:1.41603+0.00736501	test-mlogloss:1.5159+0.0253783
[30]	train-mlogloss:1.40586+0.00736385	test-mlogloss:1.50859+0.0258202
[31]	train-mlogloss:1.39601+0.0077937	test-mlogloss:1.50131+0.026826
[32]	train-mlogloss:1.38617+0.00794485	test-mlogloss:1.49411+0.027594
[33]	train-mlogloss:1.37644+0.00835068	test-mlogloss:1.4872+0.0276548
[34]	train-mlogloss:1.36639+0.00833087	test-mlogloss:1.47971+0.0283559
[35]	train-mlogloss:1.35679+0.00859738	test-mlogloss:1.47289+0.0290104
[36]	train-mlogloss:1.34755+0.00871329	test-mlogloss:1.46656+0.0291179
[37]	train-mlogloss:1.33815+0.00884808	test-mlogloss:1.45962+0.0290912
[38]	train-mlogloss:1.32906+0.00894911	test-mlogloss:1.45293+0.0291411
[39]	train-mlogloss:1.31968+0.00922392	test-mlogloss:1.44606+0.029177
[40]	train-mlogloss:1.31066+0.0094529	test-mlogloss:1.4393+0.0297449
[41]	train-mlogloss:1.30176+0.00980286	test-mlogloss:1.43283+0.0303044
[42]	train-mlogloss:1.29274+0.0098656	test-mlogloss:1.42629+0.0311333
[43]	train-mlogloss:1.2842+0.00986248	test-mlogloss:1.41995+0.0312617
[44]	train-mlogloss:1.2756+0.0100203	test-mlogloss:1.41413+0.0320977
[45]	train-mlogloss:1.26705+0.0100529	test-mlogloss:1.40794+0.0328556
[46]	train-mlogloss:1.25855+0.0103274	test-mlogloss:1.40213+0.0333417
[47]	train-mlogloss:1.2501+0.0106949	test-mlogloss:1.3965+0.0341762
[48]	train-mlogloss:1.24209+0.0106424	test-mlogloss:1.39039+0.0344648
[49]	train-mlogloss:1.23356+0.0107951	test-mlogloss:1.38439+0.0350704
[50]	train-mlogloss:1.22543+0.0109685	test-mlogloss:1.37904+0.0354893
[51]	train-mlogloss:1.21723+0.0111331	test-mlogloss:1.37313+0.036189
[52]	train-mlogloss:1.20919+0.011223	test-mlogloss:1.36802+0.0370061
[53]	train-mlogloss:1.2013+0.0113262	test-mlogloss:1.36299+0.0377993
[54]	train-mlogloss:1.19318+0.0115862	test-mlogloss:1.3573+0.038217
[55]	train-mlogloss:1.18553+0.0117105	test-mlogloss:1.35184+0.0384588
[56]	train-mlogloss:1.17781+0.0117066	test-mlogloss:1.34647+0.038469
[57]	train-mlogloss:1.16995+0.0119086	test-mlogloss:1.34108+0.0390216
[58]	train-mlogloss:1.16224+0.0119019	test-mlogloss:1.33601+0.0398134
[59]	train-mlogloss:1.15475+0.0118752	test-mlogloss:1.33101+0.0404593
[60]	train-mlogloss:1.14747+0.0118437	test-mlogloss:1.3258+0.0406785
[61]	train-mlogloss:1.13999+0.012186	test-mlogloss:1.32104+0.0415203
[62]	train-mlogloss:1.13276+0.012386	test-mlogloss:1.31598+0.0418365
[63]	train-mlogloss:1.12555+0.012466	test-mlogloss:1.3109+0.0427889
[64]	train-mlogloss:1.11839+0.0124338	test-mlogloss:1.30595+0.0432244
[65]	train-mlogloss:1.11148+0.0122878	test-mlogloss:1.30149+0.0438042
[66]	train-mlogloss:1.10445+0.0122291	test-mlogloss:1.29635+0.0443922
[67]	train-mlogloss:1.09736+0.0122942	test-mlogloss:1.29182+0.044883
[68]	train-mlogloss:1.09048+0.0125124	test-mlogloss:1.28747+0.0450567
[69]	train-mlogloss:1.0838+0.0126684	test-mlogloss:1.28339+0.045937
[70]	train-mlogloss:1.07713+0.0127047	test-mlogloss:1.27949+0.0468628
[71]	train-mlogloss:1.07048+0.0128704	test-mlogloss:1.27487+0.0471747
[72]	train-mlogloss:1.0638+0.0128448	test-mlogloss:1.27016+0.0477797
[73]	train-mlogloss:1.05727+0.0129482	test-mlogloss:1.26601+0.0481267
[74]	train-mlogloss:1.0507+0.0130823	test-mlogloss:1.26192+0.0486036
[75]	train-mlogloss:1.04418+0.0132084	test-mlogloss:1.25726+0.0489381
[76]	train-mlogloss:1.03767+0.0133092	test-mlogloss:1.25301+0.0492184
[77]	train-mlogloss:1.03139+0.0131039	test-mlogloss:1.24879+0.0498583
[78]	train-mlogloss:1.02498+0.0131822	test-mlogloss:1.24456+0.050487
[79]	train-mlogloss:1.01862+0.013131	test-mlogloss:1.24044+0.0508527
[80]	train-mlogloss:1.01238+0.0132025	test-mlogloss:1.23603+0.0514598
[81]	train-mlogloss:1.00632+0.0132958	test-mlogloss:1.23198+0.0516286
[82]	train-mlogloss:1.00012+0.0132351	test-mlogloss:1.22808+0.0517899
[83]	train-mlogloss:0.994221+0.0132115	test-mlogloss:1.22432+0.052236
[84]	train-mlogloss:0.988246+0.0133847	test-mlogloss:1.22065+0.0526808
[85]	train-mlogloss:0.982382+0.0133885	test-mlogloss:1.21738+0.0529863
[86]	train-mlogloss:0.976533+0.0135395	test-mlogloss:1.21347+0.0534286
[87]	train-mlogloss:0.97056+0.0135814	test-mlogloss:1.20965+0.0534403
[88]	train-mlogloss:0.964677+0.0136613	test-mlogloss:1.20573+0.0534708
[89]	train-mlogloss:0.959042+0.0137546	test-mlogloss:1.20195+0.0535073
[90]	train-mlogloss:0.953257+0.0137384	test-mlogloss:1.19848+0.054081
[91]	train-mlogloss:0.9476+0.0136613	test-mlogloss:1.19486+0.0545925
[92]	train-mlogloss:0.942087+0.0138264	test-mlogloss:1.19146+0.055156
[93]	train-mlogloss:0.936458+0.0139643	test-mlogloss:1.18799+0.0557577
[94]	train-mlogloss:0.931037+0.014051	test-mlogloss:1.18477+0.0560881
[95]	train-mlogloss:0.925525+0.0143513	test-mlogloss:1.18138+0.0565685
[96]	train-mlogloss:0.920145+0.014387	test-mlogloss:1.17755+0.0569464
[97]	train-mlogloss:0.914784+0.0144599	test-mlogloss:1.17385+0.0570797
[98]	train-mlogloss:0.909406+0.0144965	test-mlogloss:1.17033+0.0572694
[99]	train-mlogloss:0.904126+0.0144828	test-mlogloss:1.16691+0.0573754
[100]	train-mlogloss:0.898773+0.0145291	test-mlogloss:1.16338+0.0575056
[101]	train-mlogloss:0.893591+0.0147588	test-mlogloss:1.16003+0.0581045
[102]	train-mlogloss:0.888367+0.014816	test-mlogloss:1.1566+0.0584405
[103]	train-mlogloss:0.883292+0.0149497	test-mlogloss:1.15349+0.0587387
[104]	train-mlogloss:0.878141+0.0149345	test-mlogloss:1.15016+0.0590054
[105]	train-mlogloss:0.873047+0.0150024	test-mlogloss:1.14681+0.0590262
[106]	train-mlogloss:0.867947+0.015184	test-mlogloss:1.14339+0.0591614
[107]	train-mlogloss:0.863063+0.0153054	test-mlogloss:1.14034+0.0591913
[108]	train-mlogloss:0.858115+0.0153471	test-mlogloss:1.13753+0.0597221
[109]	train-mlogloss:0.853186+0.0152983	test-mlogloss:1.13452+0.0599251
[110]	train-mlogloss:0.848259+0.0153113	test-mlogloss:1.13132+0.0597769
[111]	train-mlogloss:0.843399+0.015307	test-mlogloss:1.12865+0.0604005
[112]	train-mlogloss:0.838578+0.01533	test-mlogloss:1.12553+0.0606878
[113]	train-mlogloss:0.833935+0.0153107	test-mlogloss:1.12259+0.06081
[114]	train-mlogloss:0.829416+0.0153103	test-mlogloss:1.11974+0.0608994
[115]	train-mlogloss:0.824833+0.0153005	test-mlogloss:1.11708+0.0611825
[116]	train-mlogloss:0.820209+0.0152739	test-mlogloss:1.11369+0.0614588
[117]	train-mlogloss:0.81556+0.0152252	test-mlogloss:1.11059+0.061544
[118]	train-mlogloss:0.810993+0.0150741	test-mlogloss:1.10757+0.061975
[119]	train-mlogloss:0.806656+0.0151989	test-mlogloss:1.10484+0.0623888
[120]	train-mlogloss:0.802195+0.0152253	test-mlogloss:1.10211+0.062672
[121]	train-mlogloss:0.797716+0.0153177	test-mlogloss:1.09957+0.0631563
[122]	train-mlogloss:0.793327+0.015169	test-mlogloss:1.09718+0.0634892
[123]	train-mlogloss:0.78887+0.0151837	test-mlogloss:1.09477+0.0636381
[124]	train-mlogloss:0.784586+0.0152819	test-mlogloss:1.09228+0.06376
[125]	train-mlogloss:0.780286+0.0151416	test-mlogloss:1.08954+0.0638798
[126]	train-mlogloss:0.776122+0.0151072	test-mlogloss:1.08718+0.0644004
[127]	train-mlogloss:0.771849+0.0149461	test-mlogloss:1.08459+0.0645866
[128]	train-mlogloss:0.767563+0.0148818	test-mlogloss:1.082+0.0647004
[129]	train-mlogloss:0.763477+0.0148517	test-mlogloss:1.07971+0.0648375
[130]	train-mlogloss:0.759325+0.0148332	test-mlogloss:1.07682+0.0651061
[131]	train-mlogloss:0.755163+0.0147842	test-mlogloss:1.07435+0.0651389
[132]	train-mlogloss:0.751129+0.0147985	test-mlogloss:1.0721+0.0651368
[133]	train-mlogloss:0.747046+0.0147567	test-mlogloss:1.06963+0.0652175
[134]	train-mlogloss:0.743084+0.0146804	test-mlogloss:1.06747+0.0654259
[135]	train-mlogloss:0.739109+0.0146541	test-mlogloss:1.06525+0.0656339
[136]	train-mlogloss:0.735168+0.0147475	test-mlogloss:1.06274+0.0660954
[137]	train-mlogloss:0.731223+0.0147698	test-mlogloss:1.06053+0.066155
[138]	train-mlogloss:0.727395+0.0148005	test-mlogloss:1.05814+0.0663702
[139]	train-mlogloss:0.723524+0.0147349	test-mlogloss:1.05565+0.0665194
[140]	train-mlogloss:0.719622+0.0145792	test-mlogloss:1.05314+0.0669204
[141]	train-mlogloss:0.715795+0.0144723	test-mlogloss:1.05108+0.0675169
[142]	train-mlogloss:0.712014+0.0144123	test-mlogloss:1.04891+0.0678152
[143]	train-mlogloss:0.708451+0.0145628	test-mlogloss:1.04682+0.0680518
[144]	train-mlogloss:0.704768+0.0143828	test-mlogloss:1.04444+0.0685394
[145]	train-mlogloss:0.701115+0.0144792	test-mlogloss:1.04238+0.0686837
[146]	train-mlogloss:0.697638+0.0144718	test-mlogloss:1.04023+0.0685953
[147]	train-mlogloss:0.694072+0.0145065	test-mlogloss:1.03792+0.0686472
[148]	train-mlogloss:0.690382+0.0145168	test-mlogloss:1.0357+0.0687002
[149]	train-mlogloss:0.686869+0.0145734	test-mlogloss:1.03379+0.0686688
[150]	train-mlogloss:0.683261+0.0146267	test-mlogloss:1.0318+0.069199
[151]	train-mlogloss:0.679793+0.0146709	test-mlogloss:1.02995+0.0693464
[152]	train-mlogloss:0.676219+0.0146021	test-mlogloss:1.02788+0.0696122
[153]	train-mlogloss:0.672766+0.0146511	test-mlogloss:1.02574+0.0700002
[154]	train-mlogloss:0.669335+0.0146251	test-mlogloss:1.02355+0.0703293
[155]	train-mlogloss:0.665834+0.0146892	test-mlogloss:1.02156+0.0700996
[156]	train-mlogloss:0.662461+0.014657	test-mlogloss:1.01976+0.0704675
[157]	train-mlogloss:0.659092+0.0146016	test-mlogloss:1.01795+0.0706864
[158]	train-mlogloss:0.655719+0.014454	test-mlogloss:1.01629+0.0711471
[159]	train-mlogloss:0.652453+0.0143514	test-mlogloss:1.01424+0.0711503
[160]	train-mlogloss:0.649111+0.0143549	test-mlogloss:1.01245+0.071064
[161]	train-mlogloss:0.645918+0.0144694	test-mlogloss:1.01059+0.0714549
[162]	train-mlogloss:0.642694+0.0143985	test-mlogloss:1.00878+0.0716453
[163]	train-mlogloss:0.639461+0.0144049	test-mlogloss:1.00678+0.0718304
[164]	train-mlogloss:0.636252+0.0141993	test-mlogloss:1.00499+0.0720151
[165]	train-mlogloss:0.633029+0.0141095	test-mlogloss:1.00325+0.0723289
[166]	train-mlogloss:0.629939+0.0141162	test-mlogloss:1.0013+0.0722456
[167]	train-mlogloss:0.626767+0.0140492	test-mlogloss:0.99936+0.0721695
[168]	train-mlogloss:0.623688+0.0140344	test-mlogloss:0.997551+0.0722876
[169]	train-mlogloss:0.620508+0.0140179	test-mlogloss:0.995803+0.0722719
[170]	train-mlogloss:0.617481+0.0139981	test-mlogloss:0.994255+0.0724849
[171]	train-mlogloss:0.61442+0.0139038	test-mlogloss:0.992329+0.0726213
[172]	train-mlogloss:0.611407+0.0138414	test-mlogloss:0.990657+0.0726883
[173]	train-mlogloss:0.608391+0.0136948	test-mlogloss:0.988891+0.0729044
[174]	train-mlogloss:0.605394+0.0136568	test-mlogloss:0.987216+0.0730303
[175]	train-mlogloss:0.602436+0.013631	test-mlogloss:0.985266+0.073312
[176]	train-mlogloss:0.599476+0.013572	test-mlogloss:0.983544+0.0734648
[177]	train-mlogloss:0.596595+0.0135814	test-mlogloss:0.981973+0.073846
[178]	train-mlogloss:0.593687+0.0135108	test-mlogloss:0.980357+0.0739473
[179]	train-mlogloss:0.590871+0.0133651	test-mlogloss:0.978188+0.0740038
[180]	train-mlogloss:0.588026+0.0133496	test-mlogloss:0.976587+0.0740349
[181]	train-mlogloss:0.585215+0.0132961	test-mlogloss:0.975174+0.0740641
[182]	train-mlogloss:0.582391+0.0132728	test-mlogloss:0.973637+0.0743421
[183]	train-mlogloss:0.57953+0.0132835	test-mlogloss:0.971749+0.074538
[184]	train-mlogloss:0.576745+0.0132126	test-mlogloss:0.970078+0.0748515
[185]	train-mlogloss:0.574038+0.0131629	test-mlogloss:0.968583+0.074889
[186]	train-mlogloss:0.571402+0.0131224	test-mlogloss:0.967235+0.0750985
[187]	train-mlogloss:0.568721+0.0130838	test-mlogloss:0.965694+0.0748278
[188]	train-mlogloss:0.566022+0.0130332	test-mlogloss:0.964106+0.0748772
[189]	train-mlogloss:0.563311+0.0130141	test-mlogloss:0.962307+0.0747698
[190]	train-mlogloss:0.560665+0.012999	test-mlogloss:0.960532+0.0748297
[191]	train-mlogloss:0.558063+0.0129176	test-mlogloss:0.95905+0.0750189
[192]	train-mlogloss:0.555508+0.0128794	test-mlogloss:0.957806+0.075254
[193]	train-mlogloss:0.55293+0.0128245	test-mlogloss:0.956523+0.0751735
[194]	train-mlogloss:0.550315+0.0127912	test-mlogloss:0.954871+0.0751183
[195]	train-mlogloss:0.547671+0.0127807	test-mlogloss:0.953281+0.0753043
[196]	train-mlogloss:0.545166+0.0127154	test-mlogloss:0.952061+0.0755519
[197]	train-mlogloss:0.542698+0.0126274	test-mlogloss:0.950584+0.0754144
[198]	train-mlogloss:0.540173+0.0126336	test-mlogloss:0.949095+0.0758647
[199]	train-mlogloss:0.53773+0.0126296	test-mlogloss:0.947865+0.0761771
[200]	train-mlogloss:0.535327+0.0125792	test-mlogloss:0.946443+0.0759498
[201]	train-mlogloss:0.532924+0.0125132	test-mlogloss:0.945354+0.0759163
[202]	train-mlogloss:0.530424+0.0124687	test-mlogloss:0.943895+0.076007
[203]	train-mlogloss:0.528062+0.0123591	test-mlogloss:0.94273+0.0761736
[204]	train-mlogloss:0.525654+0.0123752	test-mlogloss:0.941571+0.076233
[205]	train-mlogloss:0.523319+0.0123635	test-mlogloss:0.940613+0.0763777
[206]	train-mlogloss:0.52104+0.0123985	test-mlogloss:0.939435+0.0767831
[207]	train-mlogloss:0.518699+0.0123355	test-mlogloss:0.937955+0.0769245
[208]	train-mlogloss:0.516347+0.0122397	test-mlogloss:0.936739+0.0772377
[209]	train-mlogloss:0.514011+0.0121933	test-mlogloss:0.935433+0.0773868
[210]	train-mlogloss:0.51173+0.0122226	test-mlogloss:0.933859+0.0772561
[211]	train-mlogloss:0.509497+0.0121659	test-mlogloss:0.932517+0.0775567
[212]	train-mlogloss:0.507208+0.0121439	test-mlogloss:0.931643+0.0776544
[213]	train-mlogloss:0.504926+0.012109	test-mlogloss:0.930477+0.0777158
[214]	train-mlogloss:0.502669+0.012136	test-mlogloss:0.929201+0.0777969
[215]	train-mlogloss:0.500476+0.0122105	test-mlogloss:0.928168+0.0780878
[216]	train-mlogloss:0.498279+0.0121195	test-mlogloss:0.926896+0.0782123
[217]	train-mlogloss:0.496113+0.0121442	test-mlogloss:0.925942+0.0782086
[218]	train-mlogloss:0.493855+0.0121572	test-mlogloss:0.924812+0.0784063
[219]	train-mlogloss:0.491714+0.012255	test-mlogloss:0.923606+0.0784811
[220]	train-mlogloss:0.489594+0.0121783	test-mlogloss:0.922485+0.0786347
[221]	train-mlogloss:0.487442+0.0121094	test-mlogloss:0.921583+0.078815
[222]	train-mlogloss:0.485322+0.0121887	test-mlogloss:0.920336+0.0786942
[223]	train-mlogloss:0.483252+0.0120809	test-mlogloss:0.919162+0.0786528
[224]	train-mlogloss:0.481098+0.0119508	test-mlogloss:0.918027+0.0787861
[225]	train-mlogloss:0.479073+0.0119083	test-mlogloss:0.917105+0.0793032
[226]	train-mlogloss:0.477002+0.0119042	test-mlogloss:0.916041+0.0792412
[227]	train-mlogloss:0.474903+0.0118951	test-mlogloss:0.914794+0.0791691
[228]	train-mlogloss:0.472863+0.0118261	test-mlogloss:0.913659+0.0795957
[229]	train-mlogloss:0.470848+0.011817	test-mlogloss:0.912887+0.0797314
[230]	train-mlogloss:0.468794+0.0117716	test-mlogloss:0.912244+0.0799623
[231]	train-mlogloss:0.466749+0.0117307	test-mlogloss:0.911334+0.0800487
[232]	train-mlogloss:0.464782+0.0117471	test-mlogloss:0.910505+0.0802533
[233]	train-mlogloss:0.462798+0.0117895	test-mlogloss:0.90962+0.0804267
[234]	train-mlogloss:0.460819+0.0117632	test-mlogloss:0.908741+0.0807262
[235]	train-mlogloss:0.458931+0.0117843	test-mlogloss:0.907521+0.0804215
[236]	train-mlogloss:0.45699+0.0117298	test-mlogloss:0.906428+0.0803053
[237]	train-mlogloss:0.455005+0.0117931	test-mlogloss:0.905273+0.0804927
[238]	train-mlogloss:0.453113+0.0117401	test-mlogloss:0.90397+0.0804402
[239]	train-mlogloss:0.451129+0.0116934	test-mlogloss:0.903184+0.0805756
[240]	train-mlogloss:0.449244+0.0116576	test-mlogloss:0.902135+0.0807872
[241]	train-mlogloss:0.447364+0.0116432	test-mlogloss:0.901493+0.0808152
[242]	train-mlogloss:0.445502+0.0115918	test-mlogloss:0.900826+0.0809365
[243]	train-mlogloss:0.44359+0.0115526	test-mlogloss:0.89969+0.0808143
[244]	train-mlogloss:0.441776+0.0115762	test-mlogloss:0.898692+0.0808835
[245]	train-mlogloss:0.43995+0.0115169	test-mlogloss:0.89788+0.0810084
[246]	train-mlogloss:0.438167+0.0114259	test-mlogloss:0.896929+0.0812476
[247]	train-mlogloss:0.43639+0.0114052	test-mlogloss:0.896052+0.0813621
[248]	train-mlogloss:0.434617+0.0114171	test-mlogloss:0.895026+0.0812399
[249]	train-mlogloss:0.432813+0.0113634	test-mlogloss:0.894281+0.0811366
[250]	train-mlogloss:0.431083+0.0113087	test-mlogloss:0.893311+0.0813716
[251]	train-mlogloss:0.429321+0.0112591	test-mlogloss:0.892294+0.0813583
[252]	train-mlogloss:0.427568+0.0112279	test-mlogloss:0.890996+0.0814046
[253]	train-mlogloss:0.425853+0.0111648	test-mlogloss:0.890048+0.081432
[254]	train-mlogloss:0.424123+0.0110885	test-mlogloss:0.889413+0.0815408
[255]	train-mlogloss:0.42247+0.0111089	test-mlogloss:0.888671+0.0814769
[256]	train-mlogloss:0.420737+0.0110616	test-mlogloss:0.887946+0.0815785
[257]	train-mlogloss:0.419068+0.0110737	test-mlogloss:0.887076+0.0818207
[258]	train-mlogloss:0.41738+0.0110515	test-mlogloss:0.886078+0.0817453
[259]	train-mlogloss:0.415789+0.0110214	test-mlogloss:0.884921+0.0816068
[260]	train-mlogloss:0.414166+0.0110439	test-mlogloss:0.884073+0.0817007
[261]	train-mlogloss:0.412519+0.011007	test-mlogloss:0.883416+0.0816428
[262]	train-mlogloss:0.410921+0.0109669	test-mlogloss:0.882594+0.0817683
[263]	train-mlogloss:0.409338+0.0108474	test-mlogloss:0.882023+0.0820853
[264]	train-mlogloss:0.40775+0.010796	test-mlogloss:0.881324+0.0821579
[265]	train-mlogloss:0.406164+0.010767	test-mlogloss:0.880414+0.0820794
[266]	train-mlogloss:0.404535+0.0107772	test-mlogloss:0.879452+0.0820156
[267]	train-mlogloss:0.402954+0.0107183	test-mlogloss:0.878475+0.0820263
[268]	train-mlogloss:0.401408+0.0107	test-mlogloss:0.877824+0.0821234
[269]	train-mlogloss:0.399762+0.0106624	test-mlogloss:0.87707+0.0823549
[270]	train-mlogloss:0.398171+0.0106251	test-mlogloss:0.876367+0.0826154
[271]	train-mlogloss:0.39665+0.0105603	test-mlogloss:0.8754+0.0825534
[272]	train-mlogloss:0.395155+0.0105442	test-mlogloss:0.874918+0.0826247
[273]	train-mlogloss:0.393629+0.0104531	test-mlogloss:0.874206+0.0829391
[274]	train-mlogloss:0.39211+0.0104386	test-mlogloss:0.873397+0.0829585
[275]	train-mlogloss:0.390541+0.0103746	test-mlogloss:0.872714+0.0829425
[276]	train-mlogloss:0.389047+0.0103826	test-mlogloss:0.871828+0.0829877
[277]	train-mlogloss:0.387523+0.0103289	test-mlogloss:0.871241+0.0832673
[278]	train-mlogloss:0.386061+0.010291	test-mlogloss:0.870598+0.0831838
[279]	train-mlogloss:0.384526+0.0102749	test-mlogloss:0.869732+0.0831804
[280]	train-mlogloss:0.383043+0.0102328	test-mlogloss:0.868831+0.0832013
[281]	train-mlogloss:0.381607+0.0102073	test-mlogloss:0.868259+0.083107
[282]	train-mlogloss:0.380179+0.0102112	test-mlogloss:0.867958+0.083179
[283]	train-mlogloss:0.378736+0.0101272	test-mlogloss:0.867252+0.0834264
[284]	train-mlogloss:0.377309+0.0100789	test-mlogloss:0.866754+0.0835604
[285]	train-mlogloss:0.375896+0.0100199	test-mlogloss:0.865892+0.0834307
[286]	train-mlogloss:0.374543+0.0100464	test-mlogloss:0.86518+0.0835236
[287]	train-mlogloss:0.373164+0.0100384	test-mlogloss:0.864666+0.08374
[288]	train-mlogloss:0.371756+0.00999605	test-mlogloss:0.864258+0.0839806
[289]	train-mlogloss:0.370372+0.0100141	test-mlogloss:0.863492+0.0838696
[290]	train-mlogloss:0.36901+0.00998251	test-mlogloss:0.862929+0.0841445
[291]	train-mlogloss:0.367693+0.00993616	test-mlogloss:0.862339+0.0842344
[292]	train-mlogloss:0.366332+0.00993746	test-mlogloss:0.861708+0.0840398
[293]	train-mlogloss:0.364981+0.00988567	test-mlogloss:0.860907+0.0840917
[294]	train-mlogloss:0.363719+0.00990033	test-mlogloss:0.860386+0.0842284
[295]	train-mlogloss:0.362407+0.00983602	test-mlogloss:0.859655+0.0841363
[296]	train-mlogloss:0.361065+0.00981557	test-mlogloss:0.859009+0.084127
[297]	train-mlogloss:0.359745+0.00977386	test-mlogloss:0.858426+0.0842715
[298]	train-mlogloss:0.358472+0.0097414	test-mlogloss:0.857446+0.084089
[299]	train-mlogloss:0.357137+0.00977095	test-mlogloss:0.856957+0.0844935
[300]	train-mlogloss:0.355808+0.00974796	test-mlogloss:0.856523+0.0846693
[301]	train-mlogloss:0.354473+0.0097055	test-mlogloss:0.855871+0.0845634
[302]	train-mlogloss:0.353219+0.00968893	test-mlogloss:0.855169+0.084591
[303]	train-mlogloss:0.351952+0.00963717	test-mlogloss:0.85464+0.0847892
[304]	train-mlogloss:0.350659+0.00964452	test-mlogloss:0.85415+0.0849115
[305]	train-mlogloss:0.349402+0.00963814	test-mlogloss:0.853515+0.084891
[306]	train-mlogloss:0.348144+0.0096173	test-mlogloss:0.853166+0.0849307
[307]	train-mlogloss:0.346861+0.00960007	test-mlogloss:0.852536+0.0850775
[308]	train-mlogloss:0.345641+0.00959534	test-mlogloss:0.852058+0.084963
[309]	train-mlogloss:0.344474+0.00956224	test-mlogloss:0.85149+0.0849233
[310]	train-mlogloss:0.343295+0.00959346	test-mlogloss:0.851001+0.0849023
[311]	train-mlogloss:0.342061+0.00954853	test-mlogloss:0.850598+0.0850953
[312]	train-mlogloss:0.340902+0.00951412	test-mlogloss:0.850103+0.0849662
[313]	train-mlogloss:0.339729+0.00949475	test-mlogloss:0.849491+0.0848914
[314]	train-mlogloss:0.338538+0.00948558	test-mlogloss:0.848855+0.0849933
[315]	train-mlogloss:0.33738+0.00945392	test-mlogloss:0.848277+0.0849409
[316]	train-mlogloss:0.336255+0.00947941	test-mlogloss:0.848086+0.085
[317]	train-mlogloss:0.335117+0.00947675	test-mlogloss:0.847528+0.0851589
[318]	train-mlogloss:0.333992+0.00946909	test-mlogloss:0.847096+0.0852051
[319]	train-mlogloss:0.332835+0.00944841	test-mlogloss:0.846849+0.0853688
[320]	train-mlogloss:0.331686+0.0094199	test-mlogloss:0.846452+0.0853962
[321]	train-mlogloss:0.330524+0.0093796	test-mlogloss:0.845797+0.0855475
[322]	train-mlogloss:0.329416+0.00939164	test-mlogloss:0.845175+0.0854272
[323]	train-mlogloss:0.328283+0.00934663	test-mlogloss:0.844529+0.0854461
[324]	train-mlogloss:0.32719+0.00929762	test-mlogloss:0.843957+0.0854948
[325]	train-mlogloss:0.326075+0.00928066	test-mlogloss:0.843612+0.085731
[326]	train-mlogloss:0.325065+0.00926807	test-mlogloss:0.843313+0.0857981
[327]	train-mlogloss:0.323995+0.00925506	test-mlogloss:0.842962+0.0858824
[328]	train-mlogloss:0.32292+0.00924654	test-mlogloss:0.842288+0.0855112
[329]	train-mlogloss:0.321765+0.00923416	test-mlogloss:0.841543+0.0854226
[330]	train-mlogloss:0.320678+0.0092202	test-mlogloss:0.841039+0.085464
[331]	train-mlogloss:0.319648+0.00922075	test-mlogloss:0.840657+0.0854329
[332]	train-mlogloss:0.318611+0.00920033	test-mlogloss:0.840373+0.0854991
[333]	train-mlogloss:0.317563+0.00916981	test-mlogloss:0.839949+0.0854305
[334]	train-mlogloss:0.31653+0.00918677	test-mlogloss:0.839333+0.085543
[335]	train-mlogloss:0.315493+0.00916297	test-mlogloss:0.838741+0.0856009
[336]	train-mlogloss:0.314448+0.00914597	test-mlogloss:0.838323+0.0856578
[337]	train-mlogloss:0.313425+0.00909048	test-mlogloss:0.837787+0.0857246
[338]	train-mlogloss:0.312433+0.00907739	test-mlogloss:0.837242+0.0856791
[339]	train-mlogloss:0.311413+0.00905935	test-mlogloss:0.836857+0.0857849
[340]	train-mlogloss:0.310437+0.00901433	test-mlogloss:0.836588+0.0858723
[341]	train-mlogloss:0.309454+0.00902497	test-mlogloss:0.836165+0.0859535
[342]	train-mlogloss:0.308431+0.00898069	test-mlogloss:0.835434+0.0856502
[343]	train-mlogloss:0.307412+0.00895302	test-mlogloss:0.835059+0.0856397
[344]	train-mlogloss:0.306437+0.00890157	test-mlogloss:0.834457+0.0857689
[345]	train-mlogloss:0.305462+0.00888907	test-mlogloss:0.833917+0.0856779
[346]	train-mlogloss:0.304525+0.00889976	test-mlogloss:0.833388+0.0855583
[347]	train-mlogloss:0.303564+0.00888434	test-mlogloss:0.832769+0.0855938
[348]	train-mlogloss:0.302632+0.00887464	test-mlogloss:0.832288+0.085555
[349]	train-mlogloss:0.301703+0.00887226	test-mlogloss:0.831954+0.0855291
[350]	train-mlogloss:0.300751+0.00885798	test-mlogloss:0.831427+0.0855056
[351]	train-mlogloss:0.29981+0.00884729	test-mlogloss:0.83103+0.0857875
[352]	train-mlogloss:0.298917+0.00886657	test-mlogloss:0.830798+0.0858737
[353]	train-mlogloss:0.298006+0.00886168	test-mlogloss:0.830439+0.0860033
[354]	train-mlogloss:0.297081+0.00885826	test-mlogloss:0.830183+0.0860751
[355]	train-mlogloss:0.296139+0.0088668	test-mlogloss:0.829796+0.086129
[356]	train-mlogloss:0.295218+0.00885827	test-mlogloss:0.829496+0.0861918
[357]	train-mlogloss:0.294316+0.00888425	test-mlogloss:0.829113+0.0860817
[358]	train-mlogloss:0.293413+0.00889542	test-mlogloss:0.828701+0.0860404
[359]	train-mlogloss:0.292532+0.00890196	test-mlogloss:0.828283+0.0860556
[360]	train-mlogloss:0.291634+0.00888222	test-mlogloss:0.827893+0.086168
[361]	train-mlogloss:0.290731+0.00887555	test-mlogloss:0.827709+0.0862973
[362]	train-mlogloss:0.289873+0.00884325	test-mlogloss:0.827248+0.086378
[363]	train-mlogloss:0.289001+0.00882009	test-mlogloss:0.826967+0.0863712
[364]	train-mlogloss:0.288104+0.00881706	test-mlogloss:0.826716+0.0864737
[365]	train-mlogloss:0.287193+0.00877391	test-mlogloss:0.826334+0.0866957
[366]	train-mlogloss:0.286282+0.00876102	test-mlogloss:0.82601+0.08681
[367]	train-mlogloss:0.285409+0.00873252	test-mlogloss:0.825583+0.0866944
[368]	train-mlogloss:0.284574+0.0087406	test-mlogloss:0.825255+0.0865096
[369]	train-mlogloss:0.283706+0.00873563	test-mlogloss:0.825004+0.0864351
[370]	train-mlogloss:0.282851+0.00871802	test-mlogloss:0.824595+0.0863092
[371]	train-mlogloss:0.281998+0.00871538	test-mlogloss:0.824521+0.0866081
[372]	train-mlogloss:0.281111+0.00867164	test-mlogloss:0.824209+0.0866911
[373]	train-mlogloss:0.280306+0.00864125	test-mlogloss:0.823972+0.0868619
[374]	train-mlogloss:0.279522+0.00862431	test-mlogloss:0.823575+0.087083
[375]	train-mlogloss:0.278703+0.00860509	test-mlogloss:0.823234+0.0868899
[376]	train-mlogloss:0.27788+0.00860466	test-mlogloss:0.823108+0.0871927
[377]	train-mlogloss:0.277082+0.00856388	test-mlogloss:0.822835+0.0871837
[378]	train-mlogloss:0.276307+0.0085621	test-mlogloss:0.822381+0.0871164
[379]	train-mlogloss:0.275485+0.0085603	test-mlogloss:0.82232+0.0872605
[380]	train-mlogloss:0.274679+0.00857168	test-mlogloss:0.822159+0.0873741
[381]	train-mlogloss:0.273884+0.00855376	test-mlogloss:0.821964+0.087574
[382]	train-mlogloss:0.273092+0.00852474	test-mlogloss:0.821627+0.0876432
[383]	train-mlogloss:0.272297+0.00848547	test-mlogloss:0.821238+0.0876088
[384]	train-mlogloss:0.27148+0.00847391	test-mlogloss:0.820838+0.0876293
[385]	train-mlogloss:0.270685+0.0084465	test-mlogloss:0.820568+0.0876134
[386]	train-mlogloss:0.269931+0.00843415	test-mlogloss:0.820308+0.0875422
[387]	train-mlogloss:0.269172+0.00839188	test-mlogloss:0.819926+0.0875393
[388]	train-mlogloss:0.268401+0.00838481	test-mlogloss:0.819583+0.0875186
[389]	train-mlogloss:0.267644+0.00837153	test-mlogloss:0.819443+0.0875245
[390]	train-mlogloss:0.266858+0.00835563	test-mlogloss:0.819213+0.0875565
[391]	train-mlogloss:0.266123+0.00836611	test-mlogloss:0.818833+0.0874921
[392]	train-mlogloss:0.26539+0.00835681	test-mlogloss:0.818504+0.0874136
[393]	train-mlogloss:0.264692+0.00837277	test-mlogloss:0.818273+0.0872818
[394]	train-mlogloss:0.263966+0.00835445	test-mlogloss:0.81803+0.0873656
[395]	train-mlogloss:0.263211+0.00837061	test-mlogloss:0.817829+0.0874037
[396]	train-mlogloss:0.262505+0.00835727	test-mlogloss:0.817667+0.0873652
[397]	train-mlogloss:0.261804+0.00835176	test-mlogloss:0.817463+0.087421
[398]	train-mlogloss:0.261122+0.00832511	test-mlogloss:0.817109+0.0874021
[399]	train-mlogloss:0.260401+0.00830892	test-mlogloss:0.816936+0.0873919
[400]	train-mlogloss:0.259687+0.00831524	test-mlogloss:0.816785+0.0875626
[401]	train-mlogloss:0.258974+0.00831759	test-mlogloss:0.816568+0.0875915
[402]	train-mlogloss:0.258273+0.00833195	test-mlogloss:0.816341+0.0877142
[403]	train-mlogloss:0.257532+0.00830973	test-mlogloss:0.816123+0.087579
[404]	train-mlogloss:0.256864+0.00830401	test-mlogloss:0.815914+0.087538
[405]	train-mlogloss:0.256188+0.00828696	test-mlogloss:0.81594+0.0875631
[406]	train-mlogloss:0.255498+0.008275	test-mlogloss:0.815554+0.0874793
[407]	train-mlogloss:0.254858+0.00825171	test-mlogloss:0.815328+0.0876923
[408]	train-mlogloss:0.254216+0.0082489	test-mlogloss:0.815297+0.0879151
[409]	train-mlogloss:0.253516+0.00823238	test-mlogloss:0.815182+0.0879803
[410]	train-mlogloss:0.252819+0.00822913	test-mlogloss:0.814814+0.0879883
[411]	train-mlogloss:0.252134+0.00820474	test-mlogloss:0.814603+0.0880798
[412]	train-mlogloss:0.251493+0.00818618	test-mlogloss:0.814296+0.0881179
[413]	train-mlogloss:0.250876+0.00817022	test-mlogloss:0.814339+0.0879895
[414]	train-mlogloss:0.250233+0.0081555	test-mlogloss:0.813888+0.0878227
[415]	train-mlogloss:0.249592+0.00813115	test-mlogloss:0.813553+0.0878088
[416]	train-mlogloss:0.248946+0.00812497	test-mlogloss:0.813286+0.0878486
[417]	train-mlogloss:0.248309+0.00810668	test-mlogloss:0.812918+0.0878199
[418]	train-mlogloss:0.247693+0.00810328	test-mlogloss:0.812696+0.0876762
[419]	train-mlogloss:0.24706+0.00808279	test-mlogloss:0.812453+0.0876503
[420]	train-mlogloss:0.246413+0.00806068	test-mlogloss:0.812313+0.0876984
[421]	train-mlogloss:0.245748+0.00803623	test-mlogloss:0.81203+0.0878316
[422]	train-mlogloss:0.245098+0.00802653	test-mlogloss:0.811848+0.0876575
[423]	train-mlogloss:0.244452+0.00799434	test-mlogloss:0.811532+0.0877581
[424]	train-mlogloss:0.243834+0.00798899	test-mlogloss:0.811364+0.0878999
[425]	train-mlogloss:0.243205+0.00795845	test-mlogloss:0.811123+0.0879609
[426]	train-mlogloss:0.242613+0.00793035	test-mlogloss:0.811185+0.0880215
[427]	train-mlogloss:0.242033+0.00790623	test-mlogloss:0.811044+0.0881037
[428]	train-mlogloss:0.241421+0.00789696	test-mlogloss:0.811099+0.088301
[429]	train-mlogloss:0.240843+0.00790276	test-mlogloss:0.810914+0.0882882
[430]	train-mlogloss:0.240224+0.00787789	test-mlogloss:0.810715+0.0881758
[431]	train-mlogloss:0.239625+0.00787686	test-mlogloss:0.810633+0.0882789
[432]	train-mlogloss:0.239033+0.00785079	test-mlogloss:0.810547+0.0883803
[433]	train-mlogloss:0.238451+0.00784307	test-mlogloss:0.810319+0.0884717
[434]	train-mlogloss:0.237893+0.00782272	test-mlogloss:0.810218+0.0885045
[435]	train-mlogloss:0.237304+0.00776389	test-mlogloss:0.809941+0.0885617
[436]	train-mlogloss:0.236739+0.00775307	test-mlogloss:0.809688+0.0885604
[437]	train-mlogloss:0.236116+0.00775109	test-mlogloss:0.80948+0.0885822
[438]	train-mlogloss:0.235558+0.00772187	test-mlogloss:0.809396+0.0885087
[439]	train-mlogloss:0.235006+0.00774992	test-mlogloss:0.809113+0.0884572
[440]	train-mlogloss:0.234433+0.00775745	test-mlogloss:0.808879+0.0885722
[441]	train-mlogloss:0.23387+0.00772978	test-mlogloss:0.808494+0.0885582
[442]	train-mlogloss:0.23332+0.00769937	test-mlogloss:0.808315+0.088563
[443]	train-mlogloss:0.232757+0.00770562	test-mlogloss:0.808137+0.0886236
[444]	train-mlogloss:0.232185+0.00767262	test-mlogloss:0.808179+0.0887898
[445]	train-mlogloss:0.231625+0.00763042	test-mlogloss:0.808199+0.088969
[446]	train-mlogloss:0.231067+0.00761785	test-mlogloss:0.808162+0.0889791
[447]	train-mlogloss:0.230529+0.00761911	test-mlogloss:0.807935+0.0889511
[448]	train-mlogloss:0.229982+0.00758307	test-mlogloss:0.807876+0.089018
[449]	train-mlogloss:0.229394+0.00756166	test-mlogloss:0.807811+0.0892
[450]	train-mlogloss:0.228846+0.00759442	test-mlogloss:0.807625+0.0891354
[451]	train-mlogloss:0.228314+0.00757832	test-mlogloss:0.807464+0.0891592
[452]	train-mlogloss:0.227758+0.00757485	test-mlogloss:0.807363+0.0891405
[453]	train-mlogloss:0.22722+0.00753774	test-mlogloss:0.807173+0.0891968
[454]	train-mlogloss:0.226686+0.00753324	test-mlogloss:0.807074+0.0891721
[455]	train-mlogloss:0.226149+0.00752123	test-mlogloss:0.806824+0.0892932
[456]	train-mlogloss:0.22563+0.00750122	test-mlogloss:0.806596+0.089183
[457]	train-mlogloss:0.225123+0.00750278	test-mlogloss:0.806546+0.0891249
[458]	train-mlogloss:0.224626+0.00749922	test-mlogloss:0.806442+0.0892342
[459]	train-mlogloss:0.22412+0.00749609	test-mlogloss:0.806167+0.0892132
[460]	train-mlogloss:0.223626+0.00748254	test-mlogloss:0.805891+0.0891409
[461]	train-mlogloss:0.223111+0.00749403	test-mlogloss:0.805723+0.089167
[462]	train-mlogloss:0.222619+0.00750142	test-mlogloss:0.805626+0.0890858
[463]	train-mlogloss:0.222111+0.00749428	test-mlogloss:0.805649+0.0891838
[464]	train-mlogloss:0.221585+0.00746733	test-mlogloss:0.805481+0.0891391
[465]	train-mlogloss:0.221077+0.00744756	test-mlogloss:0.80547+0.0891121
[466]	train-mlogloss:0.220603+0.00744481	test-mlogloss:0.80523+0.0890578
[467]	train-mlogloss:0.220071+0.00742575	test-mlogloss:0.805234+0.0893568
[468]	train-mlogloss:0.219569+0.0074086	test-mlogloss:0.805243+0.0894774
[469]	train-mlogloss:0.219103+0.00740385	test-mlogloss:0.805266+0.0896252
[470]	train-mlogloss:0.218631+0.00740749	test-mlogloss:0.805102+0.0895033
[471]	train-mlogloss:0.218151+0.00738034	test-mlogloss:0.804821+0.0894506
[472]	train-mlogloss:0.217639+0.00737959	test-mlogloss:0.804666+0.0895071
[473]	train-mlogloss:0.217152+0.0073375	test-mlogloss:0.804618+0.0896043
[474]	train-mlogloss:0.216704+0.00732611	test-mlogloss:0.804495+0.0899602
[475]	train-mlogloss:0.216225+0.00728364	test-mlogloss:0.804379+0.0900831
[476]	train-mlogloss:0.215749+0.00728092	test-mlogloss:0.804286+0.0901498
[477]	train-mlogloss:0.215304+0.0072798	test-mlogloss:0.804148+0.0902626
[478]	train-mlogloss:0.214832+0.00726499	test-mlogloss:0.804019+0.0905586
[479]	train-mlogloss:0.21438+0.00725422	test-mlogloss:0.804013+0.0903788
[480]	train-mlogloss:0.213921+0.00725908	test-mlogloss:0.803766+0.0903104
[481]	train-mlogloss:0.213448+0.0072508	test-mlogloss:0.803675+0.0903197
[482]	train-mlogloss:0.212981+0.00725516	test-mlogloss:0.803495+0.0903077
[483]	train-mlogloss:0.212534+0.00725429	test-mlogloss:0.803402+0.0903572
[484]	train-mlogloss:0.212091+0.00723647	test-mlogloss:0.803276+0.0903592
[485]	train-mlogloss:0.21164+0.00721543	test-mlogloss:0.803249+0.0904437
[486]	train-mlogloss:0.211193+0.0072004	test-mlogloss:0.803389+0.0906656
[487]	train-mlogloss:0.210753+0.00719664	test-mlogloss:0.803289+0.0907347
[488]	train-mlogloss:0.210304+0.00716742	test-mlogloss:0.803223+0.0906324
[489]	train-mlogloss:0.209845+0.00713266	test-mlogloss:0.802986+0.0904973
[490]	train-mlogloss:0.209423+0.00712614	test-mlogloss:0.802837+0.0904664
[491]	train-mlogloss:0.208973+0.00711498	test-mlogloss:0.802748+0.0907149
[492]	train-mlogloss:0.208525+0.00711409	test-mlogloss:0.80272+0.0908431
[493]	train-mlogloss:0.208114+0.0071056	test-mlogloss:0.80258+0.0909106
[494]	train-mlogloss:0.207677+0.0070805	test-mlogloss:0.802286+0.0908156
[495]	train-mlogloss:0.207262+0.00705517	test-mlogloss:0.80212+0.0908402
[496]	train-mlogloss:0.206833+0.00703276	test-mlogloss:0.802023+0.0910757
[497]	train-mlogloss:0.206434+0.00703281	test-mlogloss:0.801957+0.0912082
[498]	train-mlogloss:0.20602+0.00702286	test-mlogloss:0.802032+0.0913202
[499]	train-mlogloss:0.205596+0.00700681	test-mlogloss:0.801743+0.0914298
[500]	train-mlogloss:0.205181+0.00699711	test-mlogloss:0.801653+0.0915645
[501]	train-mlogloss:0.204754+0.00695556	test-mlogloss:0.801661+0.0917147
[502]	train-mlogloss:0.204334+0.00692105	test-mlogloss:0.801673+0.0918954
[503]	train-mlogloss:0.203923+0.00691362	test-mlogloss:0.801397+0.0917939
[504]	train-mlogloss:0.203497+0.00691149	test-mlogloss:0.80159+0.0919179
[505]	train-mlogloss:0.203086+0.00688166	test-mlogloss:0.80154+0.0919389
[506]	train-mlogloss:0.202705+0.00686919	test-mlogloss:0.801398+0.0920027
[507]	train-mlogloss:0.202296+0.00684843	test-mlogloss:0.801138+0.0919964
[508]	train-mlogloss:0.201884+0.006817	test-mlogloss:0.801195+0.0920061
[509]	train-mlogloss:0.201476+0.00681474	test-mlogloss:0.800972+0.0922185
[510]	train-mlogloss:0.201052+0.00679714	test-mlogloss:0.800905+0.0921627
[511]	train-mlogloss:0.200666+0.00678039	test-mlogloss:0.80092+0.09222
[512]	train-mlogloss:0.200259+0.00677892	test-mlogloss:0.800953+0.0923814
[513]	train-mlogloss:0.199861+0.00676554	test-mlogloss:0.80084+0.0923452
[514]	train-mlogloss:0.19946+0.00673239	test-mlogloss:0.800744+0.0923641
[515]	train-mlogloss:0.199083+0.00671773	test-mlogloss:0.800662+0.0925139
[516]	train-mlogloss:0.198696+0.0067417	test-mlogloss:0.800656+0.0923902
[517]	train-mlogloss:0.198315+0.00676112	test-mlogloss:0.800543+0.0923348
[518]	train-mlogloss:0.197936+0.00677668	test-mlogloss:0.800543+0.0925169
[519]	train-mlogloss:0.197556+0.00678889	test-mlogloss:0.800435+0.0926432
[520]	train-mlogloss:0.197209+0.00676215	test-mlogloss:0.800419+0.0926848
[521]	train-mlogloss:0.196836+0.00676509	test-mlogloss:0.800265+0.0928434
[522]	train-mlogloss:0.196445+0.00676132	test-mlogloss:0.800055+0.0926818
[523]	train-mlogloss:0.19607+0.00676061	test-mlogloss:0.80006+0.0928793
[524]	train-mlogloss:0.195686+0.00676394	test-mlogloss:0.800019+0.0927855
[525]	train-mlogloss:0.195321+0.00674393	test-mlogloss:0.799879+0.0929019
[526]	train-mlogloss:0.194989+0.00674671	test-mlogloss:0.799953+0.0929957
[527]	train-mlogloss:0.194622+0.00674408	test-mlogloss:0.799832+0.0928947
[528]	train-mlogloss:0.194252+0.00672959	test-mlogloss:0.79968+0.0929447
[529]	train-mlogloss:0.193891+0.00668902	test-mlogloss:0.7997+0.0930562
[530]	train-mlogloss:0.193539+0.00667995	test-mlogloss:0.7997+0.093139
[531]	train-mlogloss:0.193166+0.00666003	test-mlogloss:0.799678+0.0932657
[532]	train-mlogloss:0.192802+0.00664795	test-mlogloss:0.799576+0.0931698
[533]	train-mlogloss:0.192444+0.00663641	test-mlogloss:0.7994+0.0931446
[534]	train-mlogloss:0.19209+0.00663081	test-mlogloss:0.799394+0.0931598
[535]	train-mlogloss:0.191757+0.00663534	test-mlogloss:0.799239+0.0930453
[536]	train-mlogloss:0.191404+0.00663439	test-mlogloss:0.799301+0.0931203
[537]	train-mlogloss:0.191021+0.00660659	test-mlogloss:0.799206+0.0932321
[538]	train-mlogloss:0.190653+0.00660487	test-mlogloss:0.799107+0.0932184
[539]	train-mlogloss:0.190292+0.00659626	test-mlogloss:0.799112+0.0932555
[540]	train-mlogloss:0.189954+0.00657518	test-mlogloss:0.798948+0.0932654
[541]	train-mlogloss:0.189631+0.0066073	test-mlogloss:0.799168+0.093472
[542]	train-mlogloss:0.189292+0.00657893	test-mlogloss:0.799028+0.0936867
[543]	train-mlogloss:0.188941+0.00657121	test-mlogloss:0.798863+0.0934785
[544]	train-mlogloss:0.188594+0.0065703	test-mlogloss:0.798962+0.0937641
[545]	train-mlogloss:0.188276+0.00654325	test-mlogloss:0.798942+0.0936583
[546]	train-mlogloss:0.187939+0.0065371	test-mlogloss:0.798821+0.0935953
[547]	train-mlogloss:0.187611+0.00654034	test-mlogloss:0.798686+0.0936487
[548]	train-mlogloss:0.187273+0.00653473	test-mlogloss:0.798803+0.0937282
[549]	train-mlogloss:0.186944+0.00652885	test-mlogloss:0.798654+0.093802
[550]	train-mlogloss:0.186619+0.00649593	test-mlogloss:0.798643+0.0938475
[551]	train-mlogloss:0.186291+0.00647891	test-mlogloss:0.798516+0.0938398
[552]	train-mlogloss:0.18595+0.00647357	test-mlogloss:0.79846+0.0939278
[553]	train-mlogloss:0.185632+0.00649029	test-mlogloss:0.798409+0.094009
[554]	train-mlogloss:0.185305+0.00648118	test-mlogloss:0.798543+0.0940276
[555]	train-mlogloss:0.185002+0.00648845	test-mlogloss:0.798651+0.0939175
[556]	train-mlogloss:0.184683+0.00645732	test-mlogloss:0.79884+0.0940485
[557]	train-mlogloss:0.184376+0.00645553	test-mlogloss:0.798895+0.0941026
[558]	train-mlogloss:0.184054+0.00645712	test-mlogloss:0.798936+0.0940427
[559]	train-mlogloss:0.18375+0.0064395	test-mlogloss:0.798888+0.09392
[560]	train-mlogloss:0.183428+0.00644972	test-mlogloss:0.798824+0.0938448
[561]	train-mlogloss:0.183105+0.00642055	test-mlogloss:0.798918+0.093866
[562]	train-mlogloss:0.182796+0.00641665	test-mlogloss:0.798856+0.0938951
[563]	train-mlogloss:0.182503+0.00641246	test-mlogloss:0.7989+0.093801
[564]	train-mlogloss:0.182208+0.00639482	test-mlogloss:0.798795+0.093788
[565]	train-mlogloss:0.181926+0.00639703	test-mlogloss:0.798905+0.0938315
[566]	train-mlogloss:0.181624+0.00638532	test-mlogloss:0.79904+0.0938546
[567]	train-mlogloss:0.181337+0.00637655	test-mlogloss:0.799044+0.0939675
[568]	train-mlogloss:0.181033+0.00637145	test-mlogloss:0.799006+0.0939505
[569]	train-mlogloss:0.180727+0.00637911	test-mlogloss:0.798978+0.0940401
[570]	train-mlogloss:0.18044+0.00635367	test-mlogloss:0.798823+0.0940992
[571]	train-mlogloss:0.180155+0.0063407	test-mlogloss:0.798785+0.0939364
[572]	train-mlogloss:0.179864+0.00633249	test-mlogloss:0.798757+0.0938835
[573]	train-mlogloss:0.179565+0.00631546	test-mlogloss:0.798753+0.0939358
[574]	train-mlogloss:0.179301+0.0062988	test-mlogloss:0.798775+0.0938597
[575]	train-mlogloss:0.179024+0.00628591	test-mlogloss:0.798811+0.0939067
[576]	train-mlogloss:0.178751+0.00629305	test-mlogloss:0.798729+0.0940141
[577]	train-mlogloss:0.178486+0.00628808	test-mlogloss:0.798686+0.0942093
[578]	train-mlogloss:0.178197+0.00626273	test-mlogloss:0.798717+0.0943059
[579]	train-mlogloss:0.177908+0.0062498	test-mlogloss:0.798764+0.094254
[580]	train-mlogloss:0.17764+0.00624706	test-mlogloss:0.79876+0.0942924
[581]	train-mlogloss:0.177378+0.00624029	test-mlogloss:0.798668+0.0941455
[582]	train-mlogloss:0.177128+0.0062228	test-mlogloss:0.798738+0.0939758
[583]	train-mlogloss:0.176854+0.00621474	test-mlogloss:0.798829+0.0940487
[584]	train-mlogloss:0.176567+0.00622624	test-mlogloss:0.79885+0.0940887
[585]	train-mlogloss:0.17629+0.00622265	test-mlogloss:0.798985+0.0941042
[586]	train-mlogloss:0.176027+0.00621135	test-mlogloss:0.798984+0.094016
[587]	train-mlogloss:0.175766+0.00621436	test-mlogloss:0.79899+0.0940062
[588]	train-mlogloss:0.175497+0.00620528	test-mlogloss:0.798932+0.094015
[589]	train-mlogloss:0.17523+0.00617967	test-mlogloss:0.798986+0.0940204
[590]	train-mlogloss:0.174959+0.00619512	test-mlogloss:0.79913+0.094097
[591]	train-mlogloss:0.174681+0.0061862	test-mlogloss:0.799123+0.0940339
[592]	train-mlogloss:0.174426+0.00618458	test-mlogloss:0.799176+0.0941865
[593]	train-mlogloss:0.17417+0.00616498	test-mlogloss:0.799101+0.0940453
[594]	train-mlogloss:0.173918+0.00615591	test-mlogloss:0.798902+0.0940369
[595]	train-mlogloss:0.173651+0.00616893	test-mlogloss:0.798909+0.0938814
[596]	train-mlogloss:0.173416+0.00615551	test-mlogloss:0.798796+0.0938787
[597]	train-mlogloss:0.173166+0.00615325	test-mlogloss:0.798872+0.0938765
[598]	train-mlogloss:0.172933+0.0061259	test-mlogloss:0.798862+0.0936679
[599]	train-mlogloss:0.17269+0.00610734	test-mlogloss:0.798901+0.0936206
[600]	train-mlogloss:0.17243+0.00610672	test-mlogloss:0.798854+0.0937632
[601]	train-mlogloss:0.172183+0.00608665	test-mlogloss:0.798789+0.0937739
[602]	train-mlogloss:0.171943+0.00609819	test-mlogloss:0.798702+0.0936576

Model Report
Accuracy (Train): 0.9883
n_estimators : 554

Predicting...

In [45]:
xgb = XGBClassifier(learning_rate=0.01, n_estimators=n_estimators, objective=objective, num_class=num_class, seed=7, 
                    max_depth=max_depth, min_child_weight=min_child_weight, gamma=gamma, colsample_bytree=colsample_bytree, 
                    subsample=subsample)
In [46]:
final_score = cross_val_score(xgb, x_train_best, y_train, cv=kfold, scoring='accuracy').mean()
print("Final train accuracy : {} ".format(final_score))
Final train accuracy : 0.7598522167487686 
In [47]:
xgb.fit(x_train_best, y_train)
y_pred = xgb.predict(x_test_best)
score = accuracy_score(y_test, y_pred)
print("Final test accuracy : {} ".format(score))
Final test accuracy : 0.7674418604651163 

Note:

Tuning improvements are minimal. Also repeating tuning process for XGBoost moel seems to improve accuracy


Improvement

  • Create a function to repeat tuning process

Contact Me

www.linkedin.com/in/billygustave

billygustave.com