Introduction to TensorFlow, now leveraging tensors!

In this notebook, we modify our introductory TensorFlow notebook to use tensor operations in place of the for loop. It is adapted from Jared Ostmeyer's Naked Tensor code.

The initial steps are identical to those in the earlier notebook.


In [ ]:
import numpy as np
np.random.seed(42)
import pandas as pd
import matplotlib.pyplot as plt
%matplotlib inline
import tensorflow as tf
tf.set_random_seed(42)

In [ ]:
xs = [0., 1., 2., 3., 4., 5., 6., 7.] 
ys = [-.82, -.94, -.12, .26, .39, .64, 1.02, 1.]

In [ ]:
fig, ax = plt.subplots()
_ = ax.scatter(xs, ys)

In [ ]:
m = tf.Variable(-0.5)
b = tf.Variable(1.0)

Define the cost as a single tensor operation -- this is more elegant than a Python for loop, and it lets TensorFlow parallelize and distribute the computation.


In [ ]:
ys_model = m*xs + b  # predictions for all eight points, computed in one tensor operation
total_error = tf.reduce_sum((ys - ys_model)**2)  # sum of squared errors, also a single tensor op
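
For comparison, here is a minimal sketch of the kind of per-point Python loop that the single tensor expression above replaces (assuming, as in the earlier notebook, a squared-error cost accumulated one point at a time):


In [ ]:
# Sketch only -- a loop-based equivalent of the tensor cost above, for comparison.
# Assumes the same squared-error cost, accumulated point by point.
total_error_loop = 0.0
for x, y in zip(xs, ys):
    y_model = m*x + b                     # prediction for a single point
    total_error_loop += (y - y_model)**2  # add this point's squared error to the graph

Both versions compute the same cost, but the tensor expression is one vectorized operation in the graph, which TensorFlow can execute (and distribute) far more efficiently than a long chain of per-point operations.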

The remaining steps are also identical to the earlier notebook!


In [ ]:
optimizer_operation = tf.train.GradientDescentOptimizer(learning_rate=0.001).minimize(total_error)

In [ ]:
initializer_operation = tf.global_variables_initializer()

In [ ]:
with tf.Session() as session:
    
    session.run(initializer_operation)  # assign initial values to m and b
    
    n_epochs = 1000
    for iteration in range(n_epochs):
        session.run(optimizer_operation)  # one gradient-descent step over all eight points
        
    slope, intercept = session.run([m, b])  # fetch the fitted parameter values

In [ ]:
slope

In [ ]:
intercept

In [ ]:
y_hat = intercept + slope*np.array(xs)

In [ ]:
pd.DataFrame(list(zip(ys, y_hat)), columns=['y', 'y_hat'])

In [ ]:
fig, ax = plt.subplots()

ax.scatter(xs, ys)
x_min, x_max = ax.get_xlim()
y_min, y_max = intercept + slope*x_min, intercept + slope*x_max

ax.plot([x_min, x_max], [y_min, y_max])
_ = ax.set_xlim([x_min, x_max])
