Leren: Programming assignment 2

This assignment can be done in teams of 2

Student 1: Roan de Jong (10791930)
Student 2: Ghislaine van den Boogerd (student_id)


This notebook provides a template for your programming assignment 2. You may want to use parts of your code from the previous assignment(s) as a starting point for this assignment.

The code you hand-in should follow the structure from this document. Write down your functions in the cells they belong to. Note that the structure corresponds with the structure from the actual programming assignment. Make sure you read this for the full explanation of what is expected of you.

Submission:

  • Make sure your code can be run from top to bottom without errors.
  • Include your data files in the zip file.
  • Comment your code

One way be sure you code can be run without errors is by quiting iPython completely and then restart iPython and run all cells again (you can do this by going to the menu bar above: Cell > Run all). This way you make sure that no old definitions of functions or values of variables are left (that your program might still be using).


If you have any questions ask your teaching assistent. We are here for you.


Multivariate Linear Regression


In [1]:
from __future__ import division
import numpy as np
import pandas as pd
import csv
import matplotlib.pylab as plt

class linReg:

    df = None
    input_vars = None
    output_vars = None
    thetas = None
    alpha = 0.0

    # formats the self.df properly
    def __init__(self, fileName, alpha):
        self.df = pd.read_csv(fileName, header=None)
        length_col = len(self.df[self.df.columns[-1]])
        # normalize the values
        x = self.df[self.df.columns[0:-1]].as_matrix()
        y = self.df[self.df.columns[-1]].as_matrix().reshape(length_col, 1)
        self.output_vars = y / y.max(0)
        theta_0 = np.ones((length_col, 1))
        self.input_vars = np.hstack((theta_0, x))
        # add a fake x_0 to make matrix multiplications possible
        thet0 = np.ones((length_col, 1))
        self.input_vars = np.hstack((thet0, x))
        self.thetas = np.ones((5, 1))
        self.alpha = alpha


    @property
    def grad_vec(self):
        return np.dot(self.input_vars, self.thetas)

    @property
    def update(self):
        x = self.output_vars - self.grad_vec
        y = np.dot(self.input_vars.T, x)
        self.thetas = self.thetas + self.alpha * y
        return self.thetas

    @property
    def cost(self):
        summation = (self.grad_vec - self.output_vars)
        return 0.5 * np.dot(summation.T, summation)

    def train(self, iterations):
        for i in range(iterations):
            self.update
            print(self.cost)

1) Reading in data


In [4]:
if __name__ == '__main__':
    trainer = linReg('housesRegr.csv', 0.0000000000001)

2) Gradient function


In [5]:
if __name__ == '__main__':
    trainer = linReg('housesRegr.csv', 0.0000000000001)
    print(trainer.grad_vec)


[[ 135220.]
 [ 137190.]
 [ 138181.]
 [ 139261.]
 [ 137536.]
 [ 138238.]
 [ 138646.]
 [ 138546.]
 [ 140527.]
 [ 138899.]
 [ 139809.]
 [ 140136.]
 [ 140553.]
 [ 140833.]
 [ 141067.]
 [ 141463.]
 [ 141775.]
 [ 142889.]
 [ 141769.]
 [ 141552.]
 [ 144253.]
 [ 143967.]
 [ 141117.]
 [ 141540.]
 [ 141545.]
 [ 142711.]
 [ 142648.]
 [ 142086.]
 [ 142474.]
 [ 143618.]
 [ 142939.]
 [ 143135.]
 [ 142597.]
 [ 143158.]
 [ 142962.]
 [ 142670.]
 [ 143321.]
 [ 144949.]
 [ 144801.]
 [ 143885.]
 [ 143752.]
 [ 145220.]
 [ 145190.]
 [ 145826.]
 [ 144626.]
 [ 148487.]
 [ 144895.]
 [ 145261.]
 [ 145758.]
 [ 145148.]
 [ 145124.]
 [ 144709.]
 [ 144260.]
 [ 149944.]
 [ 144731.]
 [ 144422.]
 [ 145641.]
 [ 145889.]
 [ 146059.]
 [ 146986.]
 [ 145503.]
 [ 145444.]
 [ 146570.]
 [ 146751.]
 [ 146423.]
 [ 145873.]
 [ 146177.]
 [ 145604.]
 [ 145560.]
 [ 146751.]
 [ 146781.]
 [ 146793.]
 [ 146224.]
 [ 147509.]
 [ 146750.]
 [ 148172.]
 [ 146629.]
 [ 147234.]
 [ 147521.]
 [ 147342.]
 [ 147502.]
 [ 147414.]
 [ 146825.]
 [ 147176.]
 [ 147606.]
 [ 146964.]
 [ 148170.]
 [ 149332.]
 [ 148405.]
 [ 147868.]
 [ 147357.]
 [ 147846.]
 [ 149203.]
 [ 147842.]
 [ 147983.]
 [ 147616.]
 [ 148732.]
 [ 148075.]
 [ 147542.]
 [ 148256.]
 [ 148459.]
 [ 148499.]
 [ 147197.]
 [ 148836.]
 [ 148192.]
 [ 149114.]
 [ 148438.]
 [ 148319.]
 [ 148611.]
 [ 150636.]
 [ 150554.]
 [ 151488.]
 [ 150937.]
 [ 148838.]
 [ 148465.]
 [ 149971.]
 [ 150378.]
 [ 149592.]
 [ 148925.]
 [ 148837.]
 [ 149407.]
 [ 149781.]
 [ 150717.]
 [ 149909.]
 [ 149700.]
 [ 151286.]
 [ 151069.]
 [ 148901.]
 [ 149673.]
 [ 149471.]
 [ 150154.]
 [ 148859.]
 [ 149552.]
 [ 149553.]
 [ 150136.]
 [ 150497.]
 [ 150479.]
 [ 149273.]
 [ 150278.]
 [ 149426.]
 [ 150911.]
 [ 150022.]
 [ 149842.]
 [ 150272.]
 [ 150606.]
 [ 150553.]
 [ 150987.]
 [ 150549.]
 [ 150477.]
 [ 150454.]
 [ 151324.]
 [ 152444.]
 [ 150569.]
 [ 150948.]
 [ 151210.]
 [ 150244.]
 [ 150809.]
 [ 150733.]
 [ 151128.]
 [ 151388.]
 [ 151736.]
 [ 151512.]
 [ 150905.]
 [ 150484.]
 [ 150077.]
 [ 152479.]
 [ 150706.]
 [ 150821.]
 [ 151626.]
 [ 151080.]
 [ 151161.]
 [ 150890.]
 [ 152043.]
 [ 150220.]
 [ 151126.]
 [ 152525.]
 [ 150810.]
 [ 150923.]
 [ 153220.]
 [ 151221.]
 [ 151165.]
 [ 151223.]
 [ 152453.]
 [ 151532.]
 [ 150907.]
 [ 150932.]
 [ 151388.]
 [ 152399.]
 [ 150810.]
 [ 150812.]
 [ 150815.]
 [ 150816.]
 [ 150817.]
 [ 151310.]
 [ 153047.]
 [ 151281.]
 [ 151919.]
 [ 151504.]
 [ 152418.]
 [ 151656.]
 [ 150935.]
 [ 156956.]
 [ 151222.]
 [ 151870.]
 [ 151455.]
 [ 151962.]
 [ 152005.]
 [ 151017.]
 [ 151382.]
 [ 151431.]
 [ 151815.]
 [ 154152.]
 [ 151439.]
 [ 151764.]
 [ 151777.]
 [ 153788.]
 [ 151648.]
 [ 151377.]
 [ 152291.]
 [ 155860.]
 [ 151828.]
 [ 152116.]
 [ 151752.]
 [ 151763.]
 [ 151410.]
 [ 151968.]
 [ 152518.]
 [ 152300.]
 [ 153615.]
 [ 152950.]
 [ 151869.]
 [ 152234.]
 [ 151698.]
 [ 152164.]
 [ 152179.]
 [ 151774.]
 [ 151703.]
 [ 152404.]
 [ 153276.]
 [ 153269.]
 [ 152552.]
 [ 152719.]
 [ 151712.]
 [ 152250.]
 [ 152358.]
 [ 152729.]
 [ 152811.]
 [ 152111.]
 [ 155421.]
 [ 152607.]
 [ 153071.]
 [ 152286.]
 [ 152187.]
 [ 151899.]
 [ 152606.]
 [ 152344.]
 [ 152694.]
 [ 152338.]
 [ 154824.]
 [ 153082.]
 [ 153801.]
 [ 152887.]
 [ 152778.]
 [ 152754.]
 [ 151893.]
 [ 152522.]
 [ 152204.]
 [ 153033.]
 [ 154451.]
 [ 155157.]
 [ 152453.]
 [ 152419.]
 [ 153634.]
 [ 152500.]
 [ 152699.]
 [ 153147.]
 [ 153056.]
 [ 152813.]
 [ 152708.]
 [ 152409.]
 [ 152377.]
 [ 153066.]
 [ 153465.]
 [ 153030.]
 [ 152600.]
 [ 152896.]
 [ 155037.]
 [ 152559.]
 [ 153049.]
 [ 153419.]
 [ 154710.]
 [ 152987.]
 [ 152804.]
 [ 153264.]
 [ 154543.]
 [ 153390.]
 [ 152788.]
 [ 152745.]
 [ 153791.]
 [ 153792.]
 [ 153511.]
 [ 152691.]
 [ 153496.]
 [ 155016.]
 [ 153878.]
 [ 152792.]
 [ 152841.]
 [ 152993.]
 [ 153411.]
 [ 153984.]
 [ 153516.]
 [ 152830.]
 [ 154439.]
 [ 154996.]
 [ 153118.]
 [ 153034.]
 [ 153014.]
 [ 153034.]
 [ 154179.]
 [ 153026.]
 [ 153256.]
 [ 153147.]
 [ 152974.]
 [ 153016.]
 [ 152656.]
 [ 153366.]
 [ 153308.]
 [ 154293.]
 [ 153884.]
 [ 154575.]
 [ 153599.]
 [ 153395.]
 [ 153043.]
 [ 153208.]
 [ 153256.]
 [ 155014.]
 [ 153320.]
 [ 153269.]
 [ 154170.]
 [ 154494.]
 [ 152882.]
 [ 153934.]
 [ 153424.]
 [ 153459.]
 [ 153560.]
 [ 153008.]
 [ 153672.]
 [ 153234.]
 [ 152728.]
 [ 155346.]
 [ 154051.]
 [ 154162.]
 [ 153359.]
 [ 153990.]
 [ 153235.]
 [ 153598.]
 [ 154413.]
 [ 153238.]
 [ 154218.]
 [ 153549.]
 [ 154435.]
 [ 153438.]
 [ 153531.]
 [ 153391.]
 [ 154137.]
 [ 153683.]
 [ 153436.]
 [ 158405.]
 [ 153626.]
 [ 154374.]
 [ 154596.]
 [ 153327.]
 [ 153866.]
 [ 154366.]
 [ 154331.]
 [ 153812.]
 [ 153713.]
 [ 153677.]
 [ 153259.]
 [ 155886.]
 [ 153796.]
 [ 154567.]
 [ 153980.]
 [ 153397.]
 [ 155541.]
 [ 154322.]
 [ 154132.]
 [ 154691.]
 [ 153871.]
 [ 153471.]
 [ 154287.]
 [ 154292.]
 [ 153796.]
 [ 154088.]
 [ 154156.]
 [ 154677.]
 [ 153763.]
 [ 156684.]
 [ 153980.]
 [ 153633.]
 [ 154812.]
 [ 155063.]
 [ 153804.]
 [ 153922.]
 [ 154099.]
 [ 154883.]
 [ 153823.]
 [ 158085.]
 [ 154551.]
 [ 154635.]
 [ 154616.]
 [ 155153.]
 [ 153968.]
 [ 155354.]
 [ 153927.]
 [ 154821.]
 [ 154047.]
 [ 153589.]
 [ 154613.]
 [ 156571.]
 [ 154477.]
 [ 153576.]
 [ 154317.]
 [ 154010.]
 [ 154523.]
 [ 154332.]
 [ 154709.]
 [ 154998.]
 [ 154387.]
 [ 154622.]
 [ 154622.]
 [ 154232.]
 [ 154118.]
 [ 155139.]
 [ 154713.]
 [ 156546.]
 [ 154746.]
 [ 154442.]
 [ 154926.]
 [ 154272.]
 [ 154727.]
 [ 154123.]
 [ 154905.]
 [ 158116.]
 [ 154261.]
 [ 154824.]
 [ 155376.]
 [ 154231.]
 [ 154849.]
 [ 155061.]
 [ 155504.]
 [ 155188.]
 [ 155691.]
 [ 154605.]
 [ 154174.]
 [ 154188.]
 [ 154932.]
 [ 155182.]
 [ 154101.]
 [ 156516.]
 [ 154734.]
 [ 155065.]
 [ 154216.]
 [ 154588.]
 [ 156881.]
 [ 154385.]
 [ 154877.]
 [ 155475.]
 [ 154362.]
 [ 155704.]
 [ 154838.]
 [ 154509.]
 [ 154465.]
 [ 154468.]
 [ 154637.]
 [ 155256.]
 [ 154810.]
 [ 154380.]
 [ 154238.]
 [ 154360.]
 [ 155368.]
 [ 154629.]
 [ 154439.]
 [ 155318.]
 [ 154428.]
 [ 156093.]
 [ 154998.]
 [ 155277.]
 [ 155833.]
 [ 154544.]
 [ 154513.]
 [ 155485.]
 [ 155311.]
 [ 154732.]
 [ 156118.]
 [ 154626.]
 [ 155205.]
 [ 158715.]
 [ 155389.]
 [ 154757.]
 [ 154529.]
 [ 155184.]
 [ 155697.]
 [ 154780.]
 [ 155333.]
 [ 155070.]
 [ 154823.]
 [ 154645.]
 [ 154851.]
 [ 155798.]
 [ 154788.]
 [ 155569.]
 [ 154262.]
 [ 154836.]
 [ 154816.]
 [ 154870.]
 [ 155089.]
 [ 155026.]
 [ 155710.]
 [ 154938.]
 [ 156176.]
 [ 155011.]
 [ 155691.]
 [ 155850.]
 [ 155851.]
 [ 155677.]
 [ 155652.]
 [ 154897.]
 [ 154997.]
 [ 156216.]
 [ 156510.]
 [ 155536.]
 [ 155773.]
 [ 154926.]
 [ 154470.]
 [ 155131.]
 [ 154624.]
 [ 154755.]
 [ 157406.]
 [ 155590.]
 [ 154982.]
 [ 155389.]
 [ 155577.]
 [ 155589.]
 [ 154702.]
 [ 155133.]
 [ 155183.]
 [ 155289.]
 [ 156057.]
 [ 155180.]
 [ 155768.]
 [ 155319.]
 [ 155480.]
 [ 155575.]
 [ 156515.]
 [ 155538.]
 [ 155023.]
 [ 155328.]
 [ 155347.]
 [ 155633.]
 [ 156248.]
 [ 159254.]
 [ 154808.]
 [ 156393.]
 [ 155074.]
 [ 155418.]
 [ 155189.]
 [ 155420.]
 [ 156318.]
 [ 157139.]
 [ 155830.]
 [ 156047.]
 [ 155987.]
 [ 155595.]
 [ 155991.]
 [ 155992.]
 [ 156054.]
 [ 155429.]
 [ 155799.]
 [ 155451.]
 [ 155853.]
 [ 156560.]
 [ 155438.]
 [ 155561.]
 [ 155535.]
 [ 155693.]
 [ 155074.]
 [ 155584.]
 [ 155351.]
 [ 157225.]
 [ 157107.]
 [ 155862.]
 [ 155185.]
 [ 156123.]
 [ 155968.]
 [ 155507.]
 [ 155439.]
 [ 155522.]
 [ 156249.]
 [ 155450.]
 [ 156252.]
 [ 156162.]
 [ 155832.]
 [ 155522.]
 [ 155381.]
 [ 155746.]
 [ 155643.]
 [ 155490.]
 [ 156344.]
 [ 156462.]
 [ 155059.]
 [ 156286.]
 [ 156676.]
 [ 155839.]
 [ 156173.]
 [ 157451.]
 [ 155528.]
 [ 155330.]
 [ 156415.]
 [ 155534.]
 [ 155835.]
 [ 155835.]
 [ 155585.]
 [ 155738.]
 [ 155421.]
 [ 155804.]
 [ 155786.]
 [ 155567.]
 [ 155502.]
 [ 155761.]
 [ 155168.]
 [ 155056.]
 [ 155781.]
 [ 156165.]
 [ 158163.]
 [ 156863.]
 [ 155159.]
 [ 156600.]
 [ 156225.]
 [ 159286.]
 [ 155565.]
 [ 156064.]
 [ 155390.]
 [ 156451.]
 [ 157376.]
 [ 157995.]
 [ 155946.]
 [ 156238.]
 [ 156218.]
 [ 158122.]
 [ 155598.]
 [ 155903.]
 [ 155624.]
 [ 156321.]
 [ 156431.]
 [ 156333.]
 [ 156417.]
 [ 157091.]
 [ 155012.]
 [ 156191.]
 [ 155447.]
 [ 155860.]
 [ 156547.]
 [ 156401.]
 [ 155442.]
 [ 155976.]
 [ 155938.]
 [ 155486.]
 [ 155642.]
 [ 156600.]
 [ 155429.]
 [ 155640.]
 [ 155849.]
 [ 155845.]
 [ 156897.]
 [ 156251.]
 [ 156373.]
 [ 155520.]
 [ 156902.]
 [ 156127.]
 [ 155585.]
 [ 155865.]
 [ 156127.]
 [ 155963.]
 [ 155738.]
 [ 156616.]
 [ 156533.]
 [ 155945.]
 [ 156565.]
 [ 155835.]
 [ 155661.]
 [ 156307.]
 [ 158019.]
 [ 156660.]
 [ 156952.]
 [ 157265.]
 [ 156420.]
 [ 155399.]
 [ 155530.]
 [ 155915.]
 [ 156429.]
 [ 155811.]
 [ 158649.]
 [ 154589.]
 [ 157674.]
 [ 155900.]
 [ 156575.]
 [ 156781.]
 [ 156578.]
 [ 157646.]
 [ 155760.]
 [ 155350.]
 [ 155440.]
 [ 155401.]
 [ 155517.]
 [ 155841.]
 [ 156867.]
 [ 157491.]
 [ 157491.]
 [ 156329.]
 [ 155679.]
 [ 155828.]
 [ 155322.]
 [ 156596.]
 [ 155433.]
 [ 155839.]
 [ 155932.]
 [ 158185.]
 [ 154796.]
 [ 157218.]
 [ 156599.]
 [ 156078.]
 [ 157134.]
 [ 156253.]
 [ 155850.]
 [ 155548.]
 [ 156035.]
 [ 156239.]
 [ 155891.]
 [ 155657.]
 [ 155841.]
 [ 156369.]
 [ 156285.]
 [ 156153.]
 [ 157218.]
 [ 155839.]
 [ 155906.]
 [ 156827.]
 [ 156143.]
 [ 155676.]
 [ 156201.]
 [ 156072.]
 [ 155881.]
 [ 159596.]
 [ 156534.]
 [ 156071.]
 [ 155460.]
 [ 156399.]
 [ 157180.]
 [ 157938.]
 [ 156392.]
 [ 155544.]
 [ 157267.]
 [ 155537.]
 [ 156289.]
 [ 155989.]
 [ 156228.]
 [ 157358.]
 [ 155458.]
 [ 157069.]
 [ 156001.]
 [ 156802.]
 [ 157290.]
 [ 156337.]
 [ 156514.]
 [ 156287.]
 [ 156174.]
 [ 156448.]
 [ 155676.]
 [ 156556.]
 [ 156979.]]

3) Parameter updating


In [6]:
if __name__ == '__main__':
    trainer = linReg('housesRegr.csv', 0.0000000000001)
    print(trainer.update)


[[ 0.99998805]
 [-0.80805272]
 [ 0.99996242]
 [ 0.99997181]
 [ 0.97899608]]

4) Cost function


In [7]:
if __name__ == '__main__':
    trainer = linReg('housesRegr.csv', 0.0000000000001)
    print(trainer.cost)


[[  9.14566757e+12]]

5) Optimization learning rate and iterations


In [ ]:
if __name__ == '__main__':
    # the optimized learning rate
    trainer = linReg('housesRegr.csv', 0.0000000000001)
    trainer.train(1000000)

Polynomial Regression

1) Extension to polynomial regression


In [ ]:

2) Cost function


In [ ]:

3) Optimization learning rate and iterations


In [ ]:

Discussion:

[You discussion comes here]


Logistic Regression

1) Reading in data


In [2]:
from __future__ import division
import numpy as np
import pandas as pd
import csv
import math

class logReg:

    df = None
    input_vars = None
    classifying_vars = None
    thetas = None
    alpha = 0.0

    def __init__(self, fileName, alpha):
        self.df = pd.read_csv(fileName, header=None)
        length_col = len(self.df[self.df.columns[-1]])
        self.classifying_vars = self.df[self.df.columns[-1]].as_matrix()\
                                                            .reshape(length_col, 1)
        x = self.df[self.df.columns[0:-1]].as_matrix()
        # this is the column for x_0
        temp_arr = np.ones((1, len(x.T[0])))
        for column in x.T:
            if column.max(0) > 0:
                column = column / column.max(0)
            temp_arr = np.vstack((temp_arr, column))
        self.input_vars = temp_arr.T
        self.thetas = np.full((len(self.input_vars[0]), 1), 0.5)
        self.alpha = alpha

    @property
    def gradient(self):
        theta_x = np.dot(self.input_vars, self.thetas)
        # An ugly way to make a np.array
        h_x = np.array([0.0])
        for example in theta_x:
            h_x = np.vstack((h_x, 1 / (1 + math.e**(-example))))
        # We added this range to get rid of the useless 1st index: 0.0
        return h_x[1:]

    # Update the theta's as described in the lecture notes
    def update(self, classifier):
        output_vars = self.classifying_vars
        np.place(output_vars, output_vars != classifier, [0])
        np.place(output_vars, output_vars == classifier, [1])
        x = self.gradient - output_vars
        y = np.dot(self.input_vars.T, x)
        self.thetas = self.thetas - self.alpha * y
        return self.thetas

    # calculate the cost
    def cost(self, classifier):
        h_x = self.gradient
        cost = 0.0
        for training_example in zip(h_x, self.classifying_vars):
            if training_example[1] == classifier:
                cost = cost + math.log(training_example[0])
            else:
                cost = cost + math.log(1 - training_example[0])
        cost = -(1/len(self.classifying_vars)) * cost
        return cost

    # train the model on a certain number
    def train(self, classifier, iterations):
        for i in range(0, iterations):
            self.update(classifier)
        print(self.cost(classifier))

1) Reading the data


In [3]:
if __name__ == '__main__':
    trainer = logReg('digits123.csv', 0.0001)

2) Gradient calculating and parameter updating


In [4]:
if __name__ == '__main__':
    trainer = logReg('digits123.csv', 0.0001)
    print(trainer.update(1))


[[ 0.46400339]
 [ 0.5       ]
 [ 0.49528348]
 [ 0.47971354]
 [ 0.46810254]
 [ 0.47308355]
 [ 0.48880713]
 [ 0.49891342]
 [ 0.4999    ]
 [ 0.4998    ]
 [ 0.48864047]
 [ 0.47025168]
 [ 0.47632724]
 [ 0.47320255]
 [ 0.48019528]
 [ 0.49712519]
 [ 0.49985   ]
 [ 0.49990001]
 [ 0.49243789]
 [ 0.48708218]
 [ 0.49142035]
 [ 0.47341506]
 [ 0.4827763 ]
 [ 0.49834298]
 [ 0.5       ]
 [ 0.50000001]
 [ 0.49852878]
 [ 0.49641314]
 [ 0.48453322]
 [ 0.47029652]
 [ 0.48868214]
 [ 0.49926256]
 [ 0.5       ]
 [ 0.5       ]
 [ 0.49975565]
 [ 0.49744416]
 [ 0.48385802]
 [ 0.47461504]
 [ 0.48458865]
 [ 0.49624561]
 [ 0.5       ]
 [ 0.5       ]
 [ 0.49836373]
 [ 0.49247555]
 [ 0.48577659]
 [ 0.48692051]
 [ 0.48384492]
 [ 0.49171293]
 [ 0.4997    ]
 [ 0.4999    ]
 [ 0.4974376 ]
 [ 0.47949494]
 [ 0.47708333]
 [ 0.47724593]
 [ 0.47334545]
 [ 0.48527578]
 [ 0.49893854]
 [ 0.4999    ]
 [ 0.49676261]
 [ 0.4781761 ]
 [ 0.46779627]
 [ 0.46950898]
 [ 0.47703295]
 [ 0.4895132 ]
 [ 0.4977877 ]]

3) Cost function


In [7]:
if __name__ == '__main__':
    trainer = logReg('digits123.csv', 0.0001)
    trainer.train(3, 100)


0.0212673300439

4) Pairwise comparison of classess


In [8]:
if __name__ == '__main__':
    trainer = logReg('digits123.csv', 0.0001)
    trainer.train(1, 100)
    trainer.train(2, 100)
    trainer.train(3, 100)
    # the costs are quite similar right now, but it does seem to be the best for '3'


0.294619980502
0.0175256335981
0.00852889613945

5) Optimization learning rate and iterations


In [ ]:
if __name__ == '__main__':
    trainer = logReg('digits123.csv', 0.000000000001)
    trainer.train(1, 1000)

Discussion:

[You discussion comes here]