In this notebook, we build a logistic regression classifier in TensorFlow for MNIST.

For binary classification, logistic regression is defined as $y = sigmoid(Wx + b)$; with ten digit classes we use its multi-class generalization, $y = softmax(Wx + b)$.


In [1]:
from __future__ import print_function
import tensorflow as tf
import math

In [2]:
# Get the MNIST dataset
from tensorflow.examples.tutorials.mnist import input_data

In [3]:
# Step 0: Read the data into the "./data/mnist" folder
mnist = input_data.read_data_sets('./data/mnist', one_hot=True)


Extracting ./data/mnist/train-images-idx3-ubyte.gz
Extracting ./data/mnist/train-labels-idx1-ubyte.gz
Extracting ./data/mnist/t10k-images-idx3-ubyte.gz
Extracting ./data/mnist/t10k-labels-idx1-ubyte.gz

In [4]:
# Step 1: Create placeholders to feed your inputs and labels into
# Each MNIST image has 28*28 = 784 pixels. So you can represent it as a 1x784 Tensor.
# There are 10 possible classes for each image, corresponding to digits 0-9.
# Name the input placeholder mnist_inputs and the labels placeholder mnist_labels
mnist_inputs = tf.placeholder("float", [None, 784], name="mnist_inputs_placeholder") 
mnist_labels = tf.placeholder("float", [None, 10], name='mnist_labels_placeholder')
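As a sanity check on the shapes involved, here is a small NumPy sketch (separate from the graph; the pixel values are made up) of how a 28x28 image flattens into the 1x784 row the placeholder expects:

```python
import numpy as np

# Stand-in for one 28x28 grayscale MNIST image (values are arbitrary).
image = np.arange(28 * 28, dtype=np.float32).reshape(28, 28)

# Row-major flatten into the 1x784 shape mnist_inputs expects.
flat = image.reshape(1, 784)
```

The `None` in the placeholder shape lets the same graph accept any batch size: a batch of k images is fed as a k x 784 tensor.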

In [5]:
# Step 2: Create Variables for the parameters of the model: the weights and the bias.
# Initialize the bias to a 0 tensor. (hint: tf.zeros)
# Initialize the weights with a random uniform distribution, with a max of 1 and a min of -1. (hint: tf.random_uniform)
# Be sure to think carefully about the shapes of these tensors.
W = tf.Variable(tf.random_uniform([784, 10], -1, 1), name="weights")
b = tf.Variable(tf.zeros(shape=[1, 10]), name="bias")

# Optional: Define a global_step variable for use in tensorboard
global_step = tf.Variable(0, name='global_step', trainable=False)

In [6]:
# Step 3: Build the model, stringing together your placeholders and variables to create
# two ops: one for the logits (the raw scores right before the softmax), and one for the
# probability distribution generated from the model (the output of the softmax operation).
# tf.nn.softmax may come in handy for generating the probabilities.

# Name the logits operation "logits", and the probability operation "predictions".
with tf.name_scope("model"):
    logits = tf.matmul(mnist_inputs, W) + b
    predictions = tf.nn.softmax(logits)
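To make concrete what these two ops compute, here is a NumPy sketch of the same logits-to-softmax step on made-up values (not part of the TensorFlow graph):

```python
import numpy as np

# Toy stand-ins for one flattened image, the weights, and the bias.
x = np.random.rand(1, 784)
W = np.random.uniform(-1, 1, (784, 10))
b = np.zeros((1, 10))

logits = x @ W + b  # same computation as tf.matmul(mnist_inputs, W) + b

# Softmax: exponentiate (shifted by the row max for numerical stability),
# then normalize so each row is a probability distribution.
shifted = logits - logits.max(axis=1, keepdims=True)
predictions = np.exp(shifted) / np.exp(shifted).sum(axis=1, keepdims=True)
```

Each row of `predictions` is non-negative and sums to 1, which is what lets us read it as a distribution over the ten digit classes.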

In [7]:
# Step 4: Define your loss function. Use the cross-entropy loss: TensorFlow's built-in
# "tf.nn.softmax_cross_entropy_with_logits(logits=..., labels=...)" gives the cross entropy
# of each instance in the batch. Then take the average loss over the batch.
# Name the loss op "loss"
with tf.name_scope("loss"):
    loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=mnist_labels, name="xentropy"))
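`softmax_cross_entropy_with_logits` fuses the softmax and the log-loss into one numerically stable op. A NumPy sketch of the same quantity, on made-up logits and one-hot labels:

```python
import numpy as np

logits = np.array([[2.0, 1.0, 0.1],
                   [0.5, 2.5, 0.2]])
labels = np.array([[1.0, 0.0, 0.0],   # one-hot: true class 0
                   [0.0, 1.0, 0.0]])  # one-hot: true class 1

# Per-example cross entropy: -sum(labels * log softmax(logits)),
# computed in log space for numerical stability.
shifted = logits - logits.max(axis=1, keepdims=True)
log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
xentropy = -(labels * log_probs).sum(axis=1)
loss = xentropy.mean()  # same reduction as tf.reduce_mean
```

The loss is small when the model puts high probability on the true class and grows without bound as that probability approaches zero.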

In [8]:
# Step 5: Define a function to get the accuracy of your model on this batch. Name it "accuracy"
with tf.name_scope("accuracy"):
    correct_preds = tf.equal(tf.argmax(predictions, 1), tf.argmax(mnist_labels, 1))
    num_correct_preds = tf.reduce_sum(tf.cast(correct_preds, "float"))
    accuracy = num_correct_preds / tf.cast(tf.shape(mnist_inputs)[0], "float")
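The same argmax comparison, sketched in NumPy on made-up predictions: an example counts as correct when the most probable predicted class matches the one-hot label.

```python
import numpy as np

# Made-up predicted distributions and one-hot labels for 4 examples.
predictions = np.array([[0.7, 0.2, 0.1],
                        [0.1, 0.8, 0.1],
                        [0.3, 0.3, 0.4],
                        [0.6, 0.3, 0.1]])
labels = np.array([[1, 0, 0],
                   [0, 1, 0],
                   [1, 0, 0],   # predicted class is 2, so this one is wrong
                   [1, 0, 0]])

correct = np.argmax(predictions, 1) == np.argmax(labels, 1)
accuracy = correct.mean()  # 3 of 4 correct -> 0.75
```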

In [9]:
# Step 6: Define an optimizer that you want to use, and create the training operation to 
# use the optimizer to minimize the loss. Name the training operation "train_op"
with tf.name_scope("train_op"):
    train_op = tf.train.AdamOptimizer().minimize(loss, global_step=global_step)
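`minimize()` computes gradients of the loss with respect to every trainable Variable and applies an update. Adam adapts the step size per parameter; the plain gradient-descent update it builds on can be sketched on a toy 1-D loss:

```python
# One-parameter gradient descent on the quadratic loss (w - 3)^2,
# the kind of repeated update a tf.train optimizer applies to each Variable.
w = 0.0
learning_rate = 0.1
for _ in range(100):
    grad = 2 * (w - 3)         # d/dw of (w - 3)^2
    w -= learning_rate * grad  # SGD step; Adam rescales this per parameter
```

After enough steps `w` converges to the minimizer at 3; in the notebook the same idea runs over `W` and `b` with gradients of the cross-entropy loss.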

In [10]:
# Define summary ops for TensorBoard (optional). Name the summary op "summary_op".
with tf.name_scope("summaries"):
    tf.summary.scalar("loss", loss)
    tf.summary.scalar("accuracy", accuracy)
    summary_op = tf.summary.merge_all()

In [11]:
# Step 7: Create a session for the model to run in, and then set up a train loop
# to optimize the weights given the mnist data. Optionally, add tensorboard visualization too.
nb_train_examples = mnist.train.num_examples
batch_size = 128
nb_epochs = 30
batches_per_epoch = int(math.ceil(nb_train_examples / float(batch_size)))  # float division so the ceiling also works under Python 2
log_period = 250

with tf.Session() as sess:
    # Step 7.1 Initialize your Variables
    sess.run(tf.global_variables_initializer())
    # Set up tensorboard writer (optional)
    train_writer = tf.summary.FileWriter("./logs/train",
                                         sess.graph)
    for epoch in range(nb_epochs):
        epoch_total_loss = 0
        epoch_total_accuracy = 0
        
        for batch in range(batches_per_epoch):
            loop_global_step = sess.run(global_step) + 1
            batch_inputs, batch_labels = mnist.train.next_batch(batch_size)
            # Step 7.2 Get the batch loss, batch accuracy, and run the training op.
            # If the log period is up, write summaries to tensorboard.
            batch_loss, batch_acc, _ = sess.run([loss, accuracy, train_op], 
                                         feed_dict={mnist_inputs: batch_inputs, 
                                                    mnist_labels: batch_labels})
            if loop_global_step % log_period == 0:
                train_summary = sess.run(summary_op, 
                                         feed_dict={mnist_inputs: batch_inputs, 
                                                    mnist_labels: batch_labels})
                train_writer.add_summary(train_summary, loop_global_step)
                
            epoch_total_loss += batch_loss
            epoch_total_accuracy += batch_acc
        
        epoch_average_loss = epoch_total_loss / batches_per_epoch
        epoch_average_accuracy = epoch_total_accuracy / batches_per_epoch
        print("Epoch {} done! Average Train Loss: {}, Average Train Accuracy: {}".format(epoch, 
                                                                                 epoch_average_loss, 
                                                                                 epoch_average_accuracy))
    print("Finished {} epochs".format(nb_epochs))


Epoch 0 done! Average Train Loss: 2.897015596267789, Average Train Accuracy: 0.4299963662790698
Epoch 1 done! Average Train Loss: 0.8752340503903322, Average Train Accuracy: 0.7549236918604652
Epoch 2 done! Average Train Loss: 0.6169485040182291, Average Train Accuracy: 0.8237281976744186
Epoch 3 done! Average Train Loss: 0.5141631118779959, Average Train Accuracy: 0.8528161337209302
Epoch 4 done! Average Train Loss: 0.4546051041677941, Average Train Accuracy: 0.8694585755813954
Epoch 5 done! Average Train Loss: 0.41702714690635373, Average Train Accuracy: 0.8816133720930233
Epoch 6 done! Average Train Loss: 0.39027175761239474, Average Train Accuracy: 0.8897165697674418
Epoch 7 done! Average Train Loss: 0.3706014162925787, Average Train Accuracy: 0.8947674418604651
Epoch 8 done! Average Train Loss: 0.35531047974214996, Average Train Accuracy: 0.9000363372093023
Epoch 9 done! Average Train Loss: 0.3424083832044934, Average Train Accuracy: 0.9030523255813954
Epoch 10 done! Average Train Loss: 0.3321727248298567, Average Train Accuracy: 0.9061773255813953
Epoch 11 done! Average Train Loss: 0.32367964213670686, Average Train Accuracy: 0.9086664244186047
Epoch 12 done! Average Train Loss: 0.3159156525031079, Average Train Accuracy: 0.9105377906976744
Epoch 13 done! Average Train Loss: 0.3097010365232479, Average Train Accuracy: 0.912718023255814
Epoch 14 done! Average Train Loss: 0.30510869821490244, Average Train Accuracy: 0.9147347383720931
Epoch 15 done! Average Train Loss: 0.29900371006061865, Average Train Accuracy: 0.9164970930232558
Epoch 16 done! Average Train Loss: 0.29407696250912757, Average Train Accuracy: 0.9177143895348837
Epoch 17 done! Average Train Loss: 0.2915506835767003, Average Train Accuracy: 0.9183684593023256
Epoch 18 done! Average Train Loss: 0.28732495084404946, Average Train Accuracy: 0.920530523255814
Epoch 19 done! Average Train Loss: 0.2851139917449896, Average Train Accuracy: 0.9208393895348838
Epoch 20 done! Average Train Loss: 0.28074602621585826, Average Train Accuracy: 0.9220021802325581
Epoch 21 done! Average Train Loss: 0.27800248303385666, Average Train Accuracy: 0.922983284883721
Epoch 22 done! Average Train Loss: 0.2772585492147956, Average Train Accuracy: 0.9238190406976744
Epoch 23 done! Average Train Loss: 0.2732218886357407, Average Train Accuracy: 0.9239825581395349
Epoch 24 done! Average Train Loss: 0.27258983077697974, Average Train Accuracy: 0.9244004360465117
Epoch 25 done! Average Train Loss: 0.2701580643307331, Average Train Accuracy: 0.9251090116279069
Epoch 26 done! Average Train Loss: 0.26765816527050595, Average Train Accuracy: 0.9261082848837209
Epoch 27 done! Average Train Loss: 0.2651886691707511, Average Train Accuracy: 0.9265261627906977
Epoch 28 done! Average Train Loss: 0.26430266802740654, Average Train Accuracy: 0.9260174418604651
Epoch 29 done! Average Train Loss: 0.26353419972367065, Average Train Accuracy: 0.9278888081395349
Finished 30 epochs
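A closing note on the batching arithmetic above: with 55,000 training examples and a batch size of 128, the last batch of each epoch is partial, which is why batches_per_epoch takes a ceiling rather than a floor:

```python
import math

nb_examples = 55000   # mnist.train.num_examples
batch_size = 128

full_batches, remainder = divmod(nb_examples, batch_size)
# 429 full batches plus 88 leftover examples -> 430 batches per epoch.
batches_per_epoch = int(math.ceil(nb_examples / float(batch_size)))
```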