Deploying and predicting with a model

This notebook illustrates:

  1. Deploying a model
  2. Predicting with the model

In [ ]:
!sudo chown -R jupyter:jupyter /home/jupyter/training-data-analyst

In [ ]:
# Ensure the right version of TensorFlow is installed.
!pip freeze | grep tensorflow==2.1

In [ ]:
# change these to try this notebook out
BUCKET = 'cloud-training-demos-ml'
PROJECT = 'cloud-training-demos'
REGION = 'us-central1'

In [ ]:
import os
os.environ['BUCKET'] = BUCKET
os.environ['PROJECT'] = PROJECT
os.environ['REGION'] = REGION
os.environ['TFVERSION'] = '2.1'

In [ ]:
%%bash
if ! gsutil ls | grep -q gs://${BUCKET}/; then
  gsutil mb -l ${REGION} gs://${BUCKET}
fi

In [ ]:
%%bash
# copy the Lab 5 solution model (skip this step if you still have your own Lab 5 results in your bucket)
gsutil -m cp -R gs://cloud-training-demos/babyweight/trained_model gs://${BUCKET}/babyweight/trained_model

Task 1

What files are present in the trained model directory (gs://${BUCKET}/babyweight/trained_model)?

Hint (highlight to see):

Run gsutil ls in a bash cell. Answer: model checkpoints are in the trained model directory, and several exported models (model architecture + weights) are in the export/exporter subdirectory.
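For reference, a minimal sketch of such a bash cell, assuming the copy step above completed and the bucket/path names are unchanged:

%%bash
# List the top-level training outputs (checkpoints, eval outputs, export/)
gsutil ls gs://${BUCKET}/babyweight/trained_model/
# List the exported SavedModel directories
gsutil ls gs://${BUCKET}/babyweight/trained_model/export/exporter/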

Task 2: Deploy trained, exported model

Uncomment and run the appropriate gcloud lines ONE-BY-ONE to deploy the trained model to act as a REST web service.

Hint (highlight to see):

The very first time, you need only the last two gcloud calls to create the model and the version. To experiment later, you might need to delete any deployed version first, but you should not have to recreate the model.
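Once the version is created, you can check the deployment from a bash cell. The following is a sketch that assumes the model name (babyweight) and version name (ml_on_gcp) used in the cell below:

%%bash
# List deployed models, then the versions of the babyweight model (names assumed from the cell below)
gcloud ai-platform models list
gcloud ai-platform versions list --model babyweight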


In [ ]:
%%bash
gsutil ls gs://${BUCKET}/babyweight/trained_model/export/exporter/

In [ ]:
%%bash
MODEL_NAME="babyweight"
MODEL_VERSION="ml_on_gcp"
MODEL_LOCATION=$(gsutil ls gs://${BUCKET}/babyweight/trained_model/export/exporter/ | tail -1)
echo "Deleting and deploying $MODEL_NAME $MODEL_VERSION from $MODEL_LOCATION ... this will take a few minutes"
#gcloud ai-platform versions delete ${MODEL_VERSION} --model ${MODEL_NAME}
#gcloud ai-platform models delete ${MODEL_NAME}
#gcloud ai-platform models create ${MODEL_NAME} --regions $REGION
#gcloud ai-platform versions create ${MODEL_VERSION} --model ${MODEL_NAME} --origin ${MODEL_LOCATION} --runtime-version $TFVERSION

Task 3: Write Python code to invoke the deployed model (online prediction)

Send a JSON request to the endpoint of the service to make it predict a baby's weight. The order of the responses matches the order of the instances.

The deployed model requires the input instances to be formatted as follows:

{
      'key': 'b1',
      'is_male': 'True',
      'mother_age': 26.0,
      'plurality': 'Single(1)',
      'gestation_weeks': 39
},
The key is an arbitrary string. Allowed values for is_male are True, False, and Unknown. Allowed values for plurality are Single(1), Twins(2), Triplets(3), and Multiple(2+).


In [ ]:
from oauth2client.client import GoogleCredentials
import requests
import json

MODEL_NAME = 'babyweight'
MODEL_VERSION = 'ml_on_gcp'

token = GoogleCredentials.get_application_default().get_access_token().access_token
api = 'https://ml.googleapis.com/v1/projects/{}/models/{}/versions/{}:predict' \
         .format(PROJECT, MODEL_NAME, MODEL_VERSION)
headers = {'Authorization': 'Bearer ' + token }
data = {
  'instances': [
# TODO: complete
    {
      'key': 'b1',
      'is_male': 'True',
      'mother_age': 26.0,
      'plurality': 'Single(1)',
      'gestation_weeks': 39
    },
  ]
}
response = requests.post(api, json=data, headers=headers)
print(response.content)
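An equivalent way to exercise the same endpoint is a bash cell using curl. This is a sketch, assuming gcloud is authenticated in the notebook environment and the same model/version names as above:

%%bash
# Hedged curl equivalent of the Python request above: same endpoint, same payload
curl -s -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d '{"instances": [{"key": "b1", "is_male": "True", "mother_age": 26.0, "plurality": "Single(1)", "gestation_weeks": 39}]}' \
  https://ml.googleapis.com/v1/projects/${PROJECT}/models/babyweight/versions/ml_on_gcp:predict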

Task 4: Try out batch prediction

Batch prediction is commonly used when you need thousands to millions of predictions. Create a file with one instance per line and submit it using gcloud.


In [ ]:
%%writefile inputs.json
{"key": "b1", "is_male": "True", "mother_age": 26.0, "plurality": "Single(1)", "gestation_weeks": 39}
{"key": "g1", "is_male": "False", "mother_age": 26.0, "plurality": "Single(1)", "gestation_weeks": 39}

In [ ]:
%%bash
INPUT=gs://${BUCKET}/babyweight/batchpred/inputs.json
OUTPUT=gs://${BUCKET}/babyweight/batchpred/outputs
gsutil cp inputs.json $INPUT
gsutil -m rm -rf $OUTPUT 
gcloud ai-platform jobs submit prediction babypred_$(date -u +%y%m%d_%H%M%S) \
  --data-format=TEXT --region ${REGION} \
  --input-paths=$INPUT \
  --output-path=$OUTPUT \
  --model=babyweight --version=ml_on_gcp
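The job runs asynchronously; once it finishes, the predictions land in the output path. A sketch for monitoring the job and inspecting the results (the prediction.results* file naming is an assumption about how the service shards its output):

%%bash
# Check the status of the most recent batch prediction job
gcloud ai-platform jobs list --filter="jobId:babypred*" --limit=1
# Inspect the output files once the job state is SUCCEEDED
# (prediction.results* naming assumed)
gsutil ls gs://${BUCKET}/babyweight/batchpred/outputs/
gsutil cat gs://${BUCKET}/babyweight/batchpred/outputs/prediction.results*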

Copyright 2017 Google Inc. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License