In this notebook, we build a logistic regression classifier in TensorFlow for MNIST.

The logistic regression classifier is defined as $y = \mathrm{sigmoid}(Wx + b)$. With the 10 MNIST classes encoded one-hot, the sigmoid generalizes to a softmax over the logits $Wx + b$.


In [1]:
from __future__ import print_function
import tensorflow as tf
import math

In [2]:
# Get the MNIST dataset
from tensorflow.examples.tutorials.mnist import input_data

In [3]:
# Step 0: Read the data into the "./data/mnist" folder
mnist = input_data.read_data_sets('./data/mnist', one_hot=True)


Extracting ./data/mnist/train-images-idx3-ubyte.gz
Extracting ./data/mnist/train-labels-idx1-ubyte.gz
Extracting ./data/mnist/t10k-images-idx3-ubyte.gz
Extracting ./data/mnist/t10k-labels-idx1-ubyte.gz

In [4]:
# Step 1: Create placeholders to feed your inputs and labels into
# Each MNIST image has 28*28 = 784 pixels, so a batch of images can be represented as a [batch_size, 784] tensor.
# There are 10 possible classes for each image, corresponding to digits 0-9.
# Name the input placeholder "mnist_inputs" and the labels placeholder "mnist_labels"
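# One possible sketch (TF 1.x API); the batch dimension is left as None so any batch size can be fed:
mnist_inputs = tf.placeholder(tf.float32, shape=[None, 784], name="mnist_inputs")
mnist_labels = tf.placeholder(tf.float32, shape=[None, 10], name="mnist_labels")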

In [5]:
# Step 2: Create Variables for the parameters of the model: the weights and the bias.
# Initialize the bias to a 0 tensor. (hint: tf.zeros)
# Initialize the weights with a random uniform distribution, with a max of 1 and a min of -1. (hint: tf.random_uniform)
# Be sure to think carefully about the shapes of these tensors.


# Optional: Define a global_step variable for use in tensorboard.
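# One possible sketch; the weights shape [784, 10] maps 784 pixel values to 10 class scores:
weights = tf.Variable(tf.random_uniform([784, 10], minval=-1, maxval=1), name="weights")
bias = tf.Variable(tf.zeros([10]), name="bias")
global_step = tf.Variable(0, trainable=False, name="global_step")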

In [6]:
# Step 3: Build the model, stringing together your placeholders and variables to create
# two ops: one for the logits (output right before sigmoid), and one for the probability
# distribution generated from the model (output right after sigmoid/softmax operation).
# tf.nn.softmax may come in handy for generating the probabilities.

# Name the logits operation "logits", and the probability operation "predictions".
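# One possible sketch: an affine map for the logits, then a softmax for the class probabilities.
logits = tf.add(tf.matmul(mnist_inputs, weights), bias, name="logits")
predictions = tf.nn.softmax(logits, name="predictions")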

In [7]:
# Step 4: Define your loss function. Use the cross entropy loss, and use TensorFlow's
# built-in "tf.nn.softmax_cross_entropy_with_logits(logits=..., labels=...)" function to get the cross entropy
# of each instance in the batch. Then, take the average loss over the batch.
# Name the loss op "loss"
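# One possible sketch (TF 1.x requires the keyword arguments shown here):
xentropy = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=mnist_labels)
loss = tf.reduce_mean(xentropy, name="loss")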

In [8]:
# Step 5: Define an op that computes the accuracy of your model on this batch. Name it "accuracy"
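# One possible sketch: compare the predicted class (argmax of the probabilities)
# against the true class, then average over the batch.
correct = tf.equal(tf.argmax(predictions, 1), tf.argmax(mnist_labels, 1))
accuracy = tf.reduce_mean(tf.cast(correct, tf.float32), name="accuracy")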

In [9]:
# Step 6: Define an optimizer that you want to use, and create the training operation to 
# use the optimizer to minimize the loss. Name the training operation "train_op"
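# One possible sketch; plain gradient descent with a learning rate of 0.5 is an arbitrary choice here:
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.5)
train_op = optimizer.minimize(loss, global_step=global_step, name="train_op")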

In [10]:
# Define summary ops for TensorBoard (optional). Name the summary op "summary_op".
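# One possible sketch: log the batch loss and accuracy as scalar summaries for TensorBoard.
tf.summary.scalar("loss", loss)
tf.summary.scalar("accuracy", accuracy)
summary_op = tf.summary.merge_all()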

In [11]:
# Step 7: Create a session for the model to run in, and then set up a train loop
# to optimize the weights given the mnist data. Optionally, add tensorboard visualization too.
nb_train_examples = mnist.train.num_examples
batch_size = 128
nb_epochs = 30
batches_per_epoch = int(math.ceil(nb_train_examples / float(batch_size)))  # float division so the last partial batch is counted
log_period = 250

with tf.Session() as sess:
    # Step 7.1 Initialize your Variables
    # Set up tensorboard writer (optional)
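    # A possible sketch for Step 7.1 (the "./graphs/logreg" log directory is an arbitrary choice):
    sess.run(tf.global_variables_initializer())
    writer = tf.summary.FileWriter('./graphs/logreg', sess.graph)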
    for epoch in range(nb_epochs):
        epoch_total_loss = 0
        epoch_total_accuracy = 0
        
        for batch in range(batches_per_epoch):
            loop_global_step = sess.run(global_step) + 1  # uses the global_step Variable from Step 2
            batch_inputs, batch_labels = mnist.train.next_batch(batch_size)
            # Step 7.2 Get the batch loss, batch accuracy, and run the training op.
            # If the log period is up, write summaries to tensorboard.
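            # A possible sketch for Step 7.2: run the graph on this batch and log summaries periodically.
            batch_loss, batch_acc, batch_summary, _ = sess.run(
                [loss, accuracy, summary_op, train_op],
                feed_dict={mnist_inputs: batch_inputs, mnist_labels: batch_labels})
            if loop_global_step % log_period == 0:
                writer.add_summary(batch_summary, global_step=loop_global_step)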

                
            epoch_total_loss += batch_loss
            epoch_total_accuracy += batch_acc
        
        epoch_average_loss = epoch_total_loss / batches_per_epoch
        epoch_average_accuracy = epoch_total_accuracy / batches_per_epoch
        print("Epoch {} done! Average Train Loss: {}, Average Train Accuracy: {}".format(epoch, 
                                                                                 epoch_average_loss, 
                                                                                 epoch_average_accuracy))
    print("Finished {} epochs".format(nb_epochs))


Epoch 0 done! Average Train Loss: 3.5257930881755297, Average Train Accuracy: 0.37672601744186046
Epoch 1 done! Average Train Loss: 1.0151619504357494, Average Train Accuracy: 0.7217841569767441
Epoch 2 done! Average Train Loss: 0.6782514446696569, Average Train Accuracy: 0.8081758720930232
Epoch 3 done! Average Train Loss: 0.5533750484156054, Average Train Accuracy: 0.8457303779069767
Epoch 4 done! Average Train Loss: 0.48077442431172657, Average Train Accuracy: 0.8654251453488372
Epoch 5 done! Average Train Loss: 0.44223232969295145, Average Train Accuracy: 0.8766351744186046
Epoch 6 done! Average Train Loss: 0.41040772937758024, Average Train Accuracy: 0.885828488372093
Epoch 7 done! Average Train Loss: 0.3893870830882427, Average Train Accuracy: 0.892078488372093
Epoch 8 done! Average Train Loss: 0.3689094877520273, Average Train Accuracy: 0.8972565406976745
Epoch 9 done! Average Train Loss: 0.35493913455757986, Average Train Accuracy: 0.9010901162790698
Epoch 10 done! Average Train Loss: 0.34394710278788276, Average Train Accuracy: 0.903452034883721
Epoch 11 done! Average Train Loss: 0.33361118087588354, Average Train Accuracy: 0.9061228197674419
Epoch 12 done! Average Train Loss: 0.3244342766182367, Average Train Accuracy: 0.9091933139534883
Epoch 13 done! Average Train Loss: 0.31804617849546807, Average Train Accuracy: 0.9111918604651162
Epoch 14 done! Average Train Loss: 0.3108990905243297, Average Train Accuracy: 0.9140261627906977
Epoch 15 done! Average Train Loss: 0.3080461789182452, Average Train Accuracy: 0.9136991279069767
Epoch 16 done! Average Train Loss: 0.3008717404028704, Average Train Accuracy: 0.9157158430232558
Epoch 17 done! Average Train Loss: 0.29575165127252423, Average Train Accuracy: 0.9181140988372093
Epoch 18 done! Average Train Loss: 0.2931524786664996, Average Train Accuracy: 0.9181686046511628
Epoch 19 done! Average Train Loss: 0.28728850321021193, Average Train Accuracy: 0.9200944767441861
Epoch 20 done! Average Train Loss: 0.2850607742577098, Average Train Accuracy: 0.9210210755813953
Epoch 21 done! Average Train Loss: 0.28183248285637347, Average Train Accuracy: 0.9212936046511628
Epoch 22 done! Average Train Loss: 0.2804785714766314, Average Train Accuracy: 0.9217114825581395
Epoch 23 done! Average Train Loss: 0.2800791971558748, Average Train Accuracy: 0.9222202034883721
Epoch 24 done! Average Train Loss: 0.27415125137844754, Average Train Accuracy: 0.9240188953488372
Epoch 25 done! Average Train Loss: 0.2722833371266376, Average Train Accuracy: 0.9244549418604651
Epoch 26 done! Average Train Loss: 0.2698032872448134, Average Train Accuracy: 0.9248728197674418
Epoch 27 done! Average Train Loss: 0.26874305933367376, Average Train Accuracy: 0.9254542151162791
Epoch 28 done! Average Train Loss: 0.2671904688484447, Average Train Accuracy: 0.9261991279069768
Epoch 29 done! Average Train Loss: 0.264359643171693, Average Train Accuracy: 0.9265443313953489
Finished 30 epochs

In [ ]: