BigQuery Machine Learning (BQML)

Learning Objectives

  • Understand that it is possible to build ML models in BigQuery
  • Understand when this is appropriate
  • Experience building a model using BQML

Introduction

BigQuery is more than just a data warehouse; it also has some ML capabilities baked in.

As of January 2019, BQML is limited to linear models, but what it gives up in complexity, it gains in ease of use.

BQML is a great option when a linear model will suffice, or when you want a quick benchmark to beat. For more complex models, such as neural networks, you will need to pull the data out of BigQuery and into an ML framework like TensorFlow.

In this notebook, we will build a naive model using BQML. This notebook is intended to inspire usage of BQML; we will not focus on model performance.

Set up environment variables and load necessary libraries


In [ ]:
PROJECT = "cloud-training-demos"  # Replace with your PROJECT
REGION = "us-central1"            # Choose an available region for Cloud MLE

In [ ]:
import os
os.environ["PROJECT"] = PROJECT
os.environ["REGION"] = REGION

In [ ]:
!pip freeze | grep google-cloud-bigquery==1.21.0 || pip install google-cloud-bigquery==1.21.0

In [ ]:
%load_ext google.cloud.bigquery

Create BigQuery dataset

Until now we've just been reading an existing BigQuery table; now we're going to create our own, so we need somewhere to put it. In BigQuery parlance, a dataset is a container for tables.

We will take advantage of BigQuery's Python Client to create the dataset.


In [ ]:
from google.cloud import bigquery
from google.cloud.exceptions import Conflict

bq = bigquery.Client(project=PROJECT)

dataset = bigquery.Dataset(bq.dataset("bqml_taxifare"))
try:
    bq.create_dataset(dataset)  # raises Conflict if the dataset already exists
    print("Dataset created")
except Conflict:
    print("Dataset already exists")

Create model

To create a model (see the documentation):

  1. Use CREATE MODEL and provide a destination table for the resulting model. Alternatively, use CREATE OR REPLACE MODEL, which allows overwriting an existing model.
  2. Use OPTIONS to specify the model type (linear_reg or logistic_reg). There are many more options we could specify, such as regularization and learning rate, but we'll accept the defaults.
  3. Provide the query that fetches the training data.
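As a minimal sketch of the steps above (the dataset, model, table, and column names here are placeholders, not the exercise solution below):

```sql
-- Generic shape of a BQML model-creation statement.
-- mydataset.mymodel, mydataset.mytable, and the column names are placeholders.
CREATE OR REPLACE MODEL mydataset.mymodel
OPTIONS (model_type = "linear_reg", input_label_cols = ["label"]) AS
SELECT
  feature_1,
  feature_2,
  label
FROM
  mydataset.mytable
```

The SELECT clause supplies the training data; by default BQML treats the column named in input_label_cols (or a column named label) as the target and all remaining columns as features.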

Exercise 1

Use the query we created in the previous lab, Clean the Data, to now train a linear regression model with BQML called taxifare_model. This should amount to adding a line to create the model and adding OPTIONS to specify the model type. Our label will be the sum of tolls_amount and fare_amount, and our features will be the pickup datetime and the pickup and dropoff latitudes and longitudes.

HINT: Have a look at Step Two of this tutorial if you get stuck or if you want to see another example.

Your query could take about two minutes to complete.


In [ ]:
%%bigquery --project $PROJECT
# TODO: Your code goes here

Get training statistics

Because the query uses a CREATE MODEL statement to create a model, you do not see query results. The output is an empty string.

To get the training results we use the ML.TRAINING_INFO function.
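For reference, the general shape of such a query looks like the following (the model name is a placeholder; substitute the model you create in Exercise 1):

```sql
-- Placeholder model name; ML.TRAINING_INFO returns one row per training iteration,
-- including columns such as loss and eval_loss.
SELECT
  *
FROM
  ML.TRAINING_INFO(MODEL mydataset.mymodel)
```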

Exercise 2

After completing the exercise above, query the training information of the model you created. Have a look at Step Three and Four of this tutorial to see a similar example.


In [ ]:
%%bigquery --project $PROJECT
# TODO: Your code goes here

'eval_loss' is reported as mean squared error, so take its square root to get RMSE. Your RMSE should be about 8.29, though your results may vary.

Predict

To use our model to make predictions, we use ML.PREDICT
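A generic sketch of a prediction query (again with placeholder model, table, and column names, not the exercise solution):

```sql
-- Placeholder names throughout. The inner SELECT must produce the same
-- feature columns the model was trained on; ML.PREDICT appends a
-- predicted_<label> column to each row.
SELECT
  *
FROM
  ML.PREDICT(
    MODEL mydataset.mymodel,
    (SELECT feature_1, feature_2 FROM mydataset.new_rows))
```

For a single ad-hoc prediction, the inner query can also be a SELECT of literal values with the feature names as aliases.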

Exercise 3

Lastly, use the taxifare_model you trained above to infer the cost of a taxi ride at 10:00 AM on January 3rd, 2014, going from the Google office in New York (latitude: 40.7434, longitude: -74.0080) to JFK airport (latitude: 40.6413, longitude: -73.7781).

Hint: Have a look at Step Five of this tutorial if you get stuck or if you want to see another example.


In [ ]:
%%bigquery --project $PROJECT
# TODO: Your code goes here

Recap

The value of BQML is its ease of use:

  • We created a model with just two additional lines of SQL
  • We never had to move our data out of BigQuery
  • We didn't need to use an ML Framework or code, just SQL

There's lots of work going on behind the scenes to make this look easy. For example, BQML automatically creates a training/evaluation split, tunes the learning rate, and one-hot encodes features if necessary. When we move to TensorFlow, these are all things we'll need to do ourselves.

This notebook was just meant to inspire usage of BQML; the current model is actually very poor. We'll prove this in the next lesson by beating it with a simple heuristic.

We could improve our model considerably with some feature engineering, but we'll save that for a future lesson. There are also additional BQML functions, such as ML.WEIGHTS and ML.EVALUATE, that we haven't explored. If you're interested in learning more about BQML, I encourage you to read the official docs.

From here on out we'll focus on pulling data out of BigQuery and building models using TensorFlow, which is more effort but also offers much more flexibility.

Copyright 2019 Google Inc. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.