Write a program using TensorFlow to build a batch gradient descent model for logistic regression.

Batch gradient descent requires 3 parameters:

Learning Rate: Used to limit the amount each coefficient is corrected each time it is updated.

Epochs: The number of times to run through the training data while updating the coefficients.

Batch_no: The number of rows to process before the parameters are updated.

There are 3 loops we need to perform in the function (a sketch follows the list):

a. Loop over each epoch.
b. Loop over each row in the training data for an epoch.
c. Loop over each coefficient and update it for a row in an epoch.
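A minimal pure-Python sketch of these three loops is given below. It assumes each row of the training data ends with the 0/1 class label; the names `predict` and `coefficients_bgd` are illustrative choices, not a required API, and the parameters are updated every `batch_no` rows.

In [ ]:
import math

def predict(row, coef):
    # h_theta(x) = sigmoid(theta_0 + theta_1*x1 + theta_2*x2 + ...)
    z = coef[0]
    for j in range(len(row) - 1):
        z += coef[j + 1] * row[j]
    return 1.0 / (1.0 + math.exp(-z))

def coefficients_bgd(train, l_rate, n_epoch, batch_no):
    coef = [0.0] * len(train[0])
    for epoch in range(n_epoch):                      # loop (a): epochs
        grad = [0.0] * len(coef)
        count = 0
        for i, row in enumerate(train):               # loop (b): rows
            error = predict(row, coef) - row[-1]
            grad[0] += error
            for j in range(len(row) - 1):             # loop (c): coefficients
                grad[j + 1] += error * row[j]
            count += 1
            # apply the averaged gradient every batch_no rows (and at the end of the data)
            if count == batch_no or i == len(train) - 1:
                for j in range(len(coef)):
                    coef[j] -= l_rate * grad[j] / count
                grad = [0.0] * len(coef)
                count = 0
    return coef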

Part 1

The hypothesis output is bounded:

$$0 \leq h_\theta(x) \leq 1$$

The predicted class label should be either 0 or 1, obtained by thresholding $h_\theta(x)$ (conventionally at 0.5).

  1. The hypothesis $h_\theta(x)$ is given by the sigmoid function

    $$h_\theta(x) = \frac{1}{1 + e^{-\theta^T x}}$$

  2. Using the cost function (a TensorFlow sketch follows this list)

$$Cost(h_\theta(x),y) = -y\,\log(h_\theta(x)) - (1-y)\,\log(1- h_\theta(x))$$
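As a rough illustration, the hypothesis and this cost can be written in TensorFlow (2.x assumed) as below; `X`, `y`, and `theta` are placeholder names and the two rows are made-up values, not part of the assignment.

In [ ]:
import tensorflow as tf

# illustrative mini-batch: rows prefixed with 1 for the bias term theta_0
X = tf.constant([[1.0, 2.78, 2.55],
                 [1.0, 7.63, 2.76]])
y = tf.constant([[0.0], [1.0]])
theta = tf.Variable(tf.zeros([3, 1]))

h = tf.sigmoid(tf.matmul(X, theta))                  # h_theta(x) = 1 / (1 + e^{-theta^T x})
eps = 1e-7                                           # keeps log() away from 0
cost = tf.reduce_mean(-y * tf.math.log(h + eps)
                      - (1.0 - y) * tf.math.log(1.0 - h + eps))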

The update rule is given by

$$\theta_j := \theta_j - \frac{\alpha}{m}\sum_{i=1}^m \left(h_\theta(x^{(i)}) - y^{(i)}\right) x_j^{(i)}$$
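A sketch of one batch update matching this rule, assuming TF 2.x; `X` carries a leading column of ones for the bias term and `alpha` is the learning rate (these names and values are choices made for the sketch).

In [ ]:
import tensorflow as tf

X = tf.constant([[1.0, 2.78, 2.55],
                 [1.0, 7.63, 2.76]])
y = tf.constant([[0.0], [1.0]])
theta = tf.Variable(tf.zeros([3, 1]))
alpha = 0.005
m = tf.cast(tf.shape(X)[0], tf.float32)

h = tf.sigmoid(tf.matmul(X, theta))
grad = tf.matmul(tf.transpose(X), h - y) / m         # (1/m) * sum_i (h_theta(x_i) - y_i) * x_ij
theta.assign_sub(alpha * grad)                       # theta_j := theta_j - alpha * grad_j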

  1. Find the values of $\theta_1$ and $\theta_0$ by running the update with learning rates of 0.005, 0.0005, etc., and epoch values of 100, 500, and 1000 (a training-loop sketch follows this list).
  2. Stop the iterations when the error falls below a threshold.
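One possible end-to-end training loop in TensorFlow (2.x assumed) is sketched below; it uses `tf.GradientTape` to obtain the gradient, and the threshold value and the name `train_logistic` are assumptions made for the sketch, not part of the assignment.

In [ ]:
import tensorflow as tf

def train_logistic(dataset, l_rate, n_epoch, threshold=1e-3):
    data = tf.constant(dataset, dtype=tf.float32)
    y = data[:, -1:]                                          # last column is the 0/1 label
    X = tf.concat([tf.ones_like(y), data[:, :-1]], axis=1)    # prepend 1s for theta_0
    theta = tf.Variable(tf.zeros([X.shape[1], 1]))
    costs = []
    for epoch in range(n_epoch):
        with tf.GradientTape() as tape:
            h = tf.sigmoid(tf.matmul(X, theta))
            cost = tf.reduce_mean(-y * tf.math.log(h + 1e-7)
                                  - (1.0 - y) * tf.math.log(1.0 - h + 1e-7))
        grad = tape.gradient(cost, theta)
        theta.assign_sub(l_rate * grad)                       # theta := theta - alpha * gradient
        costs.append(float(cost))
        if costs[-1] < threshold:                             # stop once the error is below the threshold
            break
    return theta, costs

# e.g. sweep the suggested settings:
# for l_rate in (0.005, 0.0005):
#     for n_epoch in (100, 500, 1000):
#         theta, costs = train_logistic(dataset, l_rate, n_epoch)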

Part 2

  1. Plot the decreasing cost function with respect to the learning rate and the number of epochs (a sketch follows this list).
  2. Compare the running time of the TensorFlow API implementation with that of the user-defined function.
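A possible way to draw the cost curves and compare wall-clock times, assuming the `train_logistic` sketch above, the `coefficients_bgd` loops from Part 1, the sample `dataset` given at the end of this notebook, and matplotlib; the particular settings are only examples.

In [ ]:
import time
import matplotlib.pyplot as plt

for l_rate in (0.005, 0.0005):
    for n_epoch in (100, 500, 1000):
        _, costs = train_logistic(dataset, l_rate, n_epoch)
        plt.plot(costs, label=f"lr={l_rate}, epochs={n_epoch}")
plt.xlabel("epoch")
plt.ylabel("cost")
plt.legend()
plt.show()

# wall-clock comparison: TensorFlow loop vs. the user-defined loops
start = time.time()
train_logistic(dataset, l_rate=0.005, n_epoch=1000)
tf_time = time.time() - start

start = time.time()
coefficients_bgd(dataset, l_rate=0.005, n_epoch=1000, batch_no=len(dataset))
udf_time = time.time() - start
print(f"TensorFlow: {tf_time:.3f}s  user-defined: {udf_time:.3f}s")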

Part 3

  1. Write an analysis of how the optimization was improved.
  2. Implement 3 cost functions (illustrative candidates are sketched below).
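The assignment does not say which three cost functions to implement; binary cross-entropy, mean squared error, and hinge loss are shown below purely as common candidates (TF 2.x assumed), each taking the labels `y` and the predictions `h`.

In [ ]:
import tensorflow as tf

def cross_entropy(y, h, eps=1e-7):
    return tf.reduce_mean(-y * tf.math.log(h + eps)
                          - (1.0 - y) * tf.math.log(1.0 - h + eps))

def mean_squared_error(y, h):
    return tf.reduce_mean(tf.square(y - h))

def hinge(y, h):
    # hinge loss works with labels in {-1, +1} and a signed score
    y_signed = 2.0 * y - 1.0
    score = 2.0 * h - 1.0
    return tf.reduce_mean(tf.maximum(0.0, 1.0 - y_signed * score))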

In [ ]:
# Sample data: two feature columns followed by a 0/1 class label
dataset = [[2.7810836, 2.550537003, 0],
           [1.465489372, 2.362125076, 0],
           [3.396561688, 4.400293529, 0],
           [1.38807019, 1.850220317, 0],
           [3.06407232, 3.005305973, 0],
           [7.627531214, 2.759262235, 1],
           [5.332441248, 2.088626775, 1],
           [6.922596716, 1.77106367, 1],
           [8.675418651, -0.242068655, 1],
           [7.673756466, 3.508563011, 1]]
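A small usage sketch with this sample data, assuming the `train_logistic` helper sketched in Part 1: it learns the parameters and prints the predicted class for each row.

In [ ]:
import tensorflow as tf

theta, costs = train_logistic(dataset, l_rate=0.005, n_epoch=1000)
data = tf.constant(dataset, dtype=tf.float32)
X = tf.concat([tf.ones_like(data[:, -1:]), data[:, :-1]], axis=1)
pred = tf.cast(tf.sigmoid(tf.matmul(X, theta)) >= 0.5, tf.int32)
print("learned theta:", theta.numpy().ravel())
print("predicted classes:", pred.numpy().ravel())
print("final cost:", costs[-1])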