AdaBoost

We use the R package fastAdaboost, which uses full decision trees as weak classifiers, whereas the paper uses decision stumps as weak learners. The paper reports an accuracy of 95.8%; we obtain a test accuracy of 31/34 = 91.2%.
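
To make the weak-learner difference concrete: fastAdaboost does not appear to expose the depth of its trees, but the stump version can be illustrated with a small manual AdaBoost.M1 loop that fits depth-one rpart trees on weighted bootstrap resamples (the resampling variant of the weight update). This is only a sketch under those assumptions: adaboost_stumps and predict_stumps are hypothetical helpers, not part of fastAdaboost, and they expect a data frame with a two-level factor response such as the r_train built below.

In [ ]:
library(rpart)

# AdaBoost.M1 with decision stumps, resampling variant (illustrative sketch)
adaboost_stumps = function(formula, data, n_iter = 100) {
    y = model.response(model.frame(formula, data))  # two-level factor response
    n = nrow(data)
    w = rep(1 / n, n)                               # uniform starting weights
    stumps = vector("list", n_iter)
    alphas = numeric(n_iter)
    for (m in seq_len(n_iter)) {
        # fit a depth-one tree (a stump) on a weighted bootstrap sample
        idx = sample(n, n, replace = TRUE, prob = w)
        fit = rpart(formula, data = data[idx, , drop = FALSE],
                    control = rpart.control(maxdepth = 1, cp = 0,
                                            minsplit = 2, xval = 0))
        pred = predict(fit, data, type = "class")
        err = sum(w * (pred != y))                  # weighted training error
        if (err >= 0.5) break                       # stump no better than chance
        alphas[m] = 0.5 * log((1 - err) / max(err, 1e-10))
        stumps[[m]] = fit
        w = w * exp(alphas[m] * (pred != y))        # up-weight the mistakes
        w = w / sum(w)
    }
    keep = which(alphas != 0)
    list(stumps = stumps[keep], alphas = alphas[keep], levels = levels(y))
}

predict_stumps = function(model, newdata) {
    # weighted vote of the stumps; scores in {-1, +1} map back to the two labels
    score = rep(0, nrow(newdata))
    for (m in seq_along(model$stumps)) {
        pred = predict(model$stumps[[m]], newdata, type = "class")
        score = score + model$alphas[m] * ifelse(pred == model$levels[2], 1, -1)
    }
    factor(ifelse(score > 0, model$levels[2], model$levels[1]), levels = model$levels)
}

With the r_train and r_test data frames built below, this could be fit as adaboost_stumps(Y ~ ., r_train, 100) and evaluated with predict_stumps on r_test.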


In [1]:
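# load the boosting package, fix the random seed, and load the preprocessed data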
library(fastAdaboost)
set.seed(201702)
load("../transformed data/paper3.rda")
load("DP.rda")

In [2]:
# build the data for R functions
r_train = data.frame(train_cl, Y = factor(golub_train_l))
r_test = data.frame(test_cl, Y = factor(golub_test_l))
# build the classifier with 100 boosting iterations
ada_cl = adaboost(Y ~ ., data = r_train, 100)

# prediction and result
ada_train_pr = predict(ada_cl, r_train)
ada_test_pr = predict(ada_cl, newdata = r_test)
table(Train_Predict = ada_train_pr$class, Train_Actual = golub_train_l)
table(Test_Predict = ada_test_pr$class, Test_Actual = golub_test_l)


             Train_Actual
Train_Predict  0  1
            0 27  0
            1  0 11

            Test_Actual
Test_Predict  0  1
           0 18  1
           1  2 13

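For reference, the accuracies quoted above can be recovered from these confusion matrices; this assumes the cells above have been run so that ada_train_pr, ada_test_pr, and the label vectors are still in the workspace.

In [ ]:
# overall accuracy from the confusion matrices above
conf_train = table(ada_train_pr$class, golub_train_l)
conf_test = table(ada_test_pr$class, golub_test_l)
sum(diag(conf_train)) / sum(conf_train)   # training accuracy: 38/38 = 1
sum(diag(conf_test)) / sum(conf_test)     # test accuracy: (18 + 13)/34 = 91.2%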