Regression Week 1: Simple Linear Regression

In this notebook we will use data on house sales in King County to predict house prices using simple (one input) linear regression. You will:

  • Use pandas Series and DataFrame functions to compute important summary statistics
  • Write a function to compute the Simple Linear Regression weights using the closed form solution
  • Write a function to make predictions of the output given the input feature
  • Turn the regression around to predict the input given the output
  • Compare two different models for predicting house prices

In this notebook you will be provided with some already-complete code as well as some code that you should complete yourself in order to answer quiz questions. The code we provide to complete is optional; it is there to assist you with solving the problems, but feel free to ignore the helper code and write your own.

Import modules


In [23]:
import pandas as pd
import numpy as np

In [24]:
dtype_dict = {'bathrooms':float, 'waterfront':int, 'sqft_above':int, 'sqft_living15':float, 
              'grade':int, 'yr_renovated':int, 'price':float, 'bedrooms':float, 'zipcode':str, 
              'long':float, 'sqft_lot15':float, 'sqft_living':float, 'floors':str, 'condition':int, 
              'lat':float, 'date':str, 'sqft_basement':int, 'yr_built':int, 'id':str, 'sqft_lot':int, 'view':int}

In [25]:
sales = pd.read_csv('kc_house_data.csv', dtype=dtype_dict)
train_data = pd.read_csv('kc_house_train_data.csv', dtype=dtype_dict)
test_data = pd.read_csv('kc_house_test_data.csv', dtype=dtype_dict)

In [26]:
sales.head()


Out[26]:
id date price bedrooms bathrooms sqft_living sqft_lot floors waterfront view ... grade sqft_above sqft_basement yr_built yr_renovated zipcode lat long sqft_living15 sqft_lot15
0 7129300520 20141013T000000 221900.0 3.0 1.00 1180.0 5650 1 0 0 ... 7 1180 0 1955 0 98178 47.5112 -122.257 1340.0 5650.0
1 6414100192 20141209T000000 538000.0 3.0 2.25 2570.0 7242 2 0 0 ... 7 2170 400 1951 1991 98125 47.7210 -122.319 1690.0 7639.0
2 5631500400 20150225T000000 180000.0 2.0 1.00 770.0 10000 1 0 0 ... 6 770 0 1933 0 98028 47.7379 -122.233 2720.0 8062.0
3 2487200875 20141209T000000 604000.0 4.0 3.00 1960.0 5000 1 0 0 ... 7 1050 910 1965 0 98136 47.5208 -122.393 1360.0 5000.0
4 1954400510 20150218T000000 510000.0 3.0 2.00 1680.0 8080 1 0 0 ... 8 1680 0 1987 0 98074 47.6168 -122.045 1800.0 7503.0

5 rows × 21 columns

Useful pandas summary functions

In order to make use of the closed form solution, as well as take advantage of pandas' built-in functions, we will review some important ones. In particular:

  • Computing the sum of a Series
  • Computing the arithmetic average (mean) of a Series
  • Multiplying a Series by a constant
  • Multiplying two Series elementwise

In [27]:
# Let's compute the mean of the House Prices in King County in 2 different ways.
prices = sales['price'] # extract the price column of the sales DataFrame -- this is now a pandas Series

# recall that the arithmetic average (the mean) is the sum of the prices divided by the total number of houses:
sum_prices = prices.sum()
num_houses = len(prices) # len() gives the number of rows (houses) in the Series
avg_price_1 = sum_prices/num_houses
avg_price_2 = prices.mean() # if you just want the average, use the .mean() method
print "average price via method 1: " + str(avg_price_1)
print "average price via method 2: " + str(avg_price_2)


average price via method 1: 540088.141767
average price via method 2: 540088.141767

As we see, we get the same answer both ways.


In [28]:
# if we want to multiply every price by 0.5 it's as simple as:
half_prices = 0.5*prices
# Let's compute the sum of squares of price. We can multiply two Series of the same length elementwise also with *
prices_squared = prices*prices
sum_prices_squared = prices_squared.sum() # prices_squared is a Series of the squares and we want to add them up.
print "the sum of price squared is: " + str(sum_prices_squared)


the sum of price squared is: 9.21732513847e+15

Aside: The Python notation x.xxe+yy means x.xx * 10^(yy). e.g. 100 = 10^2 = 1*10^2 = 1e2

Build a generic simple linear regression function

Armed with these pandas functions we can use the closed form solution from lecture to compute the slope and intercept for a simple linear regression on observations stored as Series (or NumPy arrays): input_feature, output.
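
For reference, here is the closed form solution written in the same plain notation as the code below (x is the input feature, y is the output, n is the number of observations):

  slope = ( sum(x*y) - sum(x)*sum(y)/n ) / ( sum(x^2) - sum(x)^2/n )
  intercept = mean(y) - slope * mean(x)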

Complete the following function (or write your own) to compute the simple linear regression slope and intercept:


In [29]:
def simple_linear_regression(input_feature, output):
    n = len(input_feature)
    x = input_feature
    y = output
    
    # compute the mean of  input_feature and output
    x_mean = x.mean()
    y_mean = y.mean()
    
    # compute the sum of the product of output and input_feature, and the term sum(x)*sum(y)/n
    sum_xy = (y * x).sum()
    xy_by_n = (y.sum() * x.sum())/n

    # compute the sum of the squared input_feature values, and the term sum(x)^2/n
    x_square = (x**2).sum()
    xx_by_n = (x.sum() * x.sum())/n

    # use the closed form formula for the slope
    slope = (sum_xy - xy_by_n) / (x_square - xx_by_n)
    
    # use the formula for the intercept
    intercept = y_mean - (slope * x_mean)
    return (intercept, slope)

We can test that our function works by passing it something where we know the answer. In particular we can generate a feature and then put the output exactly on a line: output = 1 + 1*input_feature. Then we know both our slope and intercept should be 1.


In [30]:
test_feature = np.array(range(5))
test_output = np.array(1 + 1*test_feature)
(test_intercept, test_slope) =  simple_linear_regression(test_feature, test_output)
print "Intercept: " + str(test_intercept)
print "Slope: " + str(test_slope)


Intercept: 1.0
Slope: 1

Now that we know it works let's build a regression model for predicting price based on sqft_living. Remember that we train on train_data!


In [31]:
sqft_intercept, sqft_slope = simple_linear_regression(train_data['sqft_living'].values, train_data['price'].values)

print "Intercept: " + str(sqft_intercept)
print "Slope: " + str(sqft_slope)


Intercept: -47116.0790729
Slope: 281.95883963

Predicting Values

Now that we have the model parameters (intercept & slope) we can make predictions. With pandas Series (or NumPy arrays) it's easy to multiply by a constant and add a constant value. Complete the following function to return the predicted output given the input_feature, slope and intercept:


In [32]:
def get_regression_predictions(input_feature, intercept, slope):
    # calculate the predicted values:
    predicted_values = intercept + (slope * input_feature)
    return predicted_values

Now that we can calculate a prediction given the slope and intercept, let's make a prediction. Use (or alter) the following to find out the estimated price for a house with 2650 squarefeet according to the squarefeet model we estimated above.

Quiz Question: Using your Slope and Intercept from (4), What is the predicted price for a house with 2650 sqft?


In [33]:
my_house_sqft = 2650
estimated_price = get_regression_predictions(my_house_sqft, sqft_intercept, sqft_slope)
print "The estimated price for a house with %d squarefeet is $%.2f" % (my_house_sqft, estimated_price)


The estimated price for a house with 2650 squarefeet is $700074.85

Residual Sum of Squares

Now that we have a model and can make predictions let's evaluate our model using Residual Sum of Squares (RSS). Recall that RSS is the sum of the squares of the residuals, and a residual is just a fancy word for the difference between the predicted output and the true output.
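
In the same plain notation as before, with predictions computed as intercept + slope*input_feature:

  RSS = sum( (output - (intercept + slope*input_feature))^2 )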

Complete the following (or write your own) function to compute the RSS of a simple linear regression model given the input_feature, output, intercept and slope:


In [34]:
def get_residual_sum_of_squares(input_feature, output, intercept, slope):
    # First get the predictions
    predicted_values = intercept + (slope * input_feature)
    # then compute the residuals (since we are squaring it doesn't matter which order you subtract)
    residuals = output - predicted_values
    # square the residuals and add them up
    RSS = (residuals * residuals).sum()
    return(RSS)

Let's test our get_residual_sum_of_squares function by applying it to the test model where the data lie exactly on a line. Since they lie exactly on a line the residual sum of squares should be zero!


In [35]:
print get_residual_sum_of_squares(test_feature, test_output, test_intercept, test_slope) # should be 0.0


0.0

Now use your function to calculate the RSS on training data from the squarefeet model calculated above.

Quiz Question: According to this function and the slope and intercept from the squarefeet model, what is the RSS for the simple linear regression using squarefeet to predict prices on TRAINING data?


In [36]:
rss_prices_on_sqft = get_residual_sum_of_squares(train_data['sqft_living'], train_data['price'], sqft_intercept, sqft_slope)
print 'The RSS of predicting Prices based on Square Feet is : ' + str(rss_prices_on_sqft)


The RSS of predicting Prices based on Square Feet is : 1.20191835418e+15

Predict the squarefeet given price

What if we want to predict the squarefeet given the price? Since we have an equation y = a + b*x we can solve it for x, so that if we have the intercept (a), the slope (b) and the price (y) we can solve for the estimated squarefeet (x).
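
Rearranging y = a + b*x gives the inverse prediction used below:

  x = (y - a)/b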

Complete the following function to compute the inverse regression estimate, i.e. predict the input_feature given the output!


In [37]:
def inverse_regression_predictions(output, intercept, slope):
    # solve output = intercept + slope*input_feature for input_feature. Use this equation to compute the inverse predictions:
    estimated_feature = (output - intercept)/slope
    return estimated_feature

Now that we have a function to compute the squarefeet given the price from our simple regression model, let's see how big we might expect a house that costs $800,000 to be.

Quiz Question: According to this function and the regression slope and intercept from (3) what is the estimated square-feet for a house costing $800,000?


In [38]:
my_house_price = 800000
estimated_squarefeet = inverse_regression_predictions(my_house_price, sqft_intercept, sqft_slope)
print "The estimated squarefeet for a house worth $%.2f is %d" % (my_house_price, estimated_squarefeet)


The estimated squarefeet for a house worth $800000.00 is 3004

New Model: estimate prices from bedrooms

We have made one model for predicting house prices using squarefeet, but there are many other features in the sales DataFrame. Use your simple linear regression function to estimate the regression parameters for predicting prices based on number of bedrooms. Use the training data!


In [39]:
# Estimate the slope and intercept for predicting 'price' based on 'bedrooms'
bedrooms_intercept, bedrooms_slope = simple_linear_regression(train_data['bedrooms'].values, train_data['price'].values)

print "Intercept: " + str(bedrooms_intercept)
print "Slope: " + str(bedrooms_slope)


Intercept: 109473.177623
Slope: 127588.952934

Test your Linear Regression Algorithm

Now we have two models for predicting the price of a house. How do we know which one is better? Calculate the RSS on the TEST data (remember this data wasn't involved in learning the model). Compute the RSS from predicting prices using bedrooms and from predicting prices using squarefeet.

Quiz Question: Which model (square feet or bedrooms) has lowest RSS on TEST data? Think about why this might be the case.


In [40]:
# Compute RSS when using bedrooms on TEST data:
bedrooms_intercept, bedrooms_slope = simple_linear_regression(train_data['bedrooms'].values, 
                                                              train_data['price'].values)
rss_prices_on_bedrooms = get_residual_sum_of_squares(test_data['bedrooms'].values, 
                                                     test_data['price'].values, 
                                                     bedrooms_intercept, bedrooms_slope)
print 'The RSS of predicting Prices based on Bedrooms is : ' + str(rss_prices_on_bedrooms)


The RSS of predicting Prices based on Bedrooms is : 4.9336458596e+14

In [41]:
# Compute RSS when using squarefeet on TEST data:
sqft_intercept, sqft_slope = simple_linear_regression(train_data['sqft_living'].values, 
                                                      train_data['price'].values)
rss_prices_on_sqft = get_residual_sum_of_squares(test_data['sqft_living'].values, 
                                                 test_data['price'].values, 
                                                 sqft_intercept, sqft_slope)
print 'The RSS of predicting Prices based on Square Feet is : ' + str(rss_prices_on_sqft)


The RSS of predicting Prices based on Square Feet is : 2.75402933618e+14
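
To answer the quiz question we can compare the two test-set RSS values directly. A minimal sketch, assuming the two cells above have been run so that rss_prices_on_bedrooms and rss_prices_on_sqft are still in memory:

# compare test-set RSS of the two simple models; the smaller RSS is the better fit
if rss_prices_on_sqft < rss_prices_on_bedrooms:
    print 'Square feet gives the lower RSS on TEST data'
else:
    print 'Bedrooms gives the lower RSS on TEST data'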