Introduction to TensorFlow: fitting a line point by point

In this notebook, we introduce TensorFlow by fitting a line of the form y = m*x + b one point at a time. It is an adaptation of Jared Ostmeyer's Naked Tensor code.

Load dependencies and set seeds for reproducibility


In [ ]:
import numpy as np
np.random.seed(42)
import pandas as pd
import matplotlib.pyplot as plt
%matplotlib inline
import tensorflow as tf
tf.set_random_seed(42)

Create a very small data set


In [ ]:
xs = [0., 1., 2., 3., 4., 5., 6., 7.] 
ys = [-.82, -.94, -.12, .26, .39, .64, 1.02, 1.]

In [ ]:
fig, ax = plt.subplots()
_ = ax.scatter(xs, ys)

Define variables -- the model parameters we'll learn -- and initialize them with "random" values


In [ ]:
m = tf.Variable(-0.5)
b = tf.Variable(1.0)

One point at a time, define the error between the true label y and the model's prediction of that label, and accumulate it into a total cost


In [ ]:
total_error = 0.0
for x, y in zip(xs, ys):
    y_model = m*x + b                  # the model's prediction for this point
    total_error += (y - y_model)**2    # squared error, accumulated into the SSE
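As a sanity check on what this cost means, the same sum of squared errors (SSE) can be evaluated eagerly in plain Python at the initial parameter guesses m = -0.5, b = 1.0 (a quick illustration, separate from the TensorFlow graph):

```python
# Plain-Python SSE at the initial parameter guesses m = -0.5, b = 1.0:
xs = [0., 1., 2., 3., 4., 5., 6., 7.]
ys = [-.82, -.94, -.12, .26, .39, .64, 1.02, 1.]

m_init, b_init = -0.5, 1.0
sse = sum((y - (m_init*x + b_init))**2 for x, y in zip(xs, ys))
print(sse)  # a large initial cost, which gradient descent should shrink
```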

Define optimizer as SSE-minimizing gradient descent


In [ ]:
optimizer_operation = tf.train.GradientDescentOptimizer(learning_rate=0.001).minimize(total_error)
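Under the hood, GradientDescentOptimizer repeatedly nudges m and b in the direction opposite the gradient of the SSE. A minimal NumPy sketch of that update rule (an illustration only, assuming the same data, learning rate, and initial values as above):

```python
import numpy as np

xs = np.array([0., 1., 2., 3., 4., 5., 6., 7.])
ys = np.array([-.82, -.94, -.12, .26, .39, .64, 1.02, 1.])

m, b = -0.5, 1.0   # same initial guesses as the tf.Variables above
lr = 0.001         # same learning rate as the optimizer above

for _ in range(1000):
    residuals = ys - (m*xs + b)
    # Gradients of SSE = sum(residuals**2) with respect to m and b:
    grad_m = -2.0 * np.sum(xs * residuals)
    grad_b = -2.0 * np.sum(residuals)
    m -= lr * grad_m   # step opposite the gradient
    b -= lr * grad_b

print(m, b)  # m and b should now be close to the best-fit line's parameters
```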

Define an operation that initializes all of the graph's global variables


In [ ]:
initializer_operation = tf.global_variables_initializer()

With the computational graph designed, we initialize a session to execute it


In [ ]:
with tf.Session() as session:
    
    session.run(initializer_operation) 
    
    n_epochs = 1000  # a typical choice for this tiny data set at learning rate 0.001
    for iteration in range(n_epochs):
        session.run(optimizer_operation) 
    
    slope, intercept = session.run([m, b])

In [ ]:
slope

In [ ]:
intercept
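The learned slope and intercept can be sanity-checked against the closed-form least-squares fit, for example via np.polyfit (a check added here for illustration; the session's values will differ slightly, since gradient descent only approximates the optimum):

```python
import numpy as np

xs = [0., 1., 2., 3., 4., 5., 6., 7.]
ys = [-.82, -.94, -.12, .26, .39, .64, 1.02, 1.]

# Fit a degree-1 polynomial, i.e., the exact least-squares line:
best_slope, best_intercept = np.polyfit(xs, ys, deg=1)
print(best_slope, best_intercept)  # ≈ 0.297, -0.861
```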

Calculate the predicted model outputs given the inputs xs


In [ ]:
y_hat = intercept + slope*np.array(xs)

In [ ]:
pd.DataFrame(list(zip(ys, y_hat)), columns=['y', 'y_hat'])

In [ ]:
fig, ax = plt.subplots()

ax.scatter(xs, ys)
x_min, x_max = ax.get_xlim()
y_at_xmin = intercept + slope*x_min  # line height at the left edge of the plot
y_at_xmax = intercept + slope*x_max  # line height at the right edge

ax.plot([x_min, x_max], [y_at_xmin, y_at_xmax])
_ = ax.set_xlim([x_min, x_max])
