# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
This Colab demonstrates how you can use the Firebase Admin Python SDK from a Jupyter notebook to manage your Firebase-hosted ML models.
Install the Firebase Admin SDK and TensorFlow. If you're running this notebook in a Google Colab environment, TensorFlow is already installed, but you still need to install the Admin SDK.
In [0]:
%pip install 'firebase_admin>=4.1.0'
%pip install 'tensorflow>=2.1.0'
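If you want to confirm which versions were installed (for example, after restarting the runtime), a quick check like this works:
In [0]:
# Optional: print the installed package versions to confirm they meet the
# minimums required above.
import firebase_admin
import tensorflow as tf

print('firebase_admin', firebase_admin.__version__)
print('tensorflow', tf.__version__)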
Before you can continue, you need to set up a Firebase project:
If you don't already have a Firebase project, create a new project in the Firebase console. Then, open your project and do the following:
On the Settings page, create a service account and download the service account key file. Keep this file safe, since it grants administrator access to your project.
On the Storage page, enable Cloud Storage. Take note of your bucket name.
You need a Storage bucket to temporarily store model files while adding them to your Firebase project. If you are on the Blaze plan, you can create and use a bucket other than the default for this purpose.
On the ML Kit page, click Get started if you haven't yet enabled ML Kit.
In the Google APIs console, open your Firebase project and enable the Firebase ML API.
Then, upload the service account key file you got in the previous step:
In [0]:
import ipywidgets

uploader = ipywidgets.FileUpload(
    accept='.json',
    multiple=False
)

service_acct_file = {}

def handle_upload(change):
    service_acct_file['name'] = next(iter(change['owner'].value))
    service_acct_file['data'] = change['owner'].value[service_acct_file['name']]['content']
    with open(service_acct_file['name'], 'wb') as f:
        f.write(service_acct_file['data'])
    print('Uploaded {}'.format(service_acct_file['name']))

uploader.observe(handle_upload, names='data')
display(uploader)
Set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the location of the key file:
In [0]:
import os
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = os.path.realpath(service_acct_file['name'])
Import the firebase_admin module and initialize the SDK with the name of your Storage bucket. Be sure the Storage bucket is in the same Firebase project as your service account. Your project's default bucket looks like your-project-id.appspot.com.
In [0]:
storage_bucket = input('Storage bucket (no "gs://"): ')
In [0]:
import firebase_admin
from firebase_admin import ml
firebase_admin.initialize_app(options={'storageBucket': storage_bucket})
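Optionally, you can verify that the bucket name you entered resolves to a real bucket before continuing. This sketch uses the Admin SDK's storage module, which wraps the google-cloud-storage client, and assumes your service account can read bucket metadata:
In [0]:
from firebase_admin import storage

# Optional: check that the configured default bucket exists before uploading
# model files to it. storage.bucket() returns a google-cloud-storage Bucket.
if not storage.bucket().exists():
    print('Bucket {} was not found; check the name and your permissions.'.format(storage_bucket))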
Next, train your model.
In a real notebook, you'd use a model architecture designed for your use case and provide your own training data. For this demo, just train a trivial model:
In [0]:
import tensorflow as tf
# Create a simple Keras model.
x = [-1, 0, 1, 2, 3, 4]
y = [-3, -1, 1, 3, 5, 7]
model_binary = tf.keras.models.Sequential(
    [tf.keras.layers.Dense(units=1, input_shape=[1])])
model_binary.compile(optimizer='sgd', loss='mean_squared_error')
model_binary.fit(x, y, epochs=3)
Now that you have a trained model, you can upload it to Firebase and make it available to your iOS and Android apps.
First, convert the model to TensorFlow Lite and upload it to Cloud Storage. With the Admin SDK, this is a single call:
In [0]:
# This takes the Keras model, converts it to a TFLite model, and uploads it to
# your bucket as my_model.tflite
source = ml.TFLiteGCSModelSource.from_keras_model(model_binary, 'my_model.tflite')
print(source.gcs_tflite_uri)
If you get a toco_from_protos: command not found error, the directory containing the Python binaries is probably missing from your PATH. Run the following cell to add it, then try the conversion again:
In [0]:
import os
import sys
py_bin_dir = os.path.dirname(sys.executable)
os.environ['PATH'] = '{}:{}'.format(os.environ['PATH'], py_bin_dir)
Next, create a Model object, specifying the model's Cloud Storage source and the name of your model. (You will use the name you specify here to download the model in your iOS and Android apps.)
In [0]:
model_format = ml.TFLiteFormat(model_source=source)
sdk_model_1 = ml.Model(display_name="my_model_1", model_format=model_format)
Add the model to your Firebase project by calling create_model(). When you do so, the model file is copied from your Cloud Storage bucket into your Firebase project.
Note that this step will fail if your project already has a model named my_model_1. If that happens, delete the existing model in the Firebase console and try again, or use the optional cleanup cell below.
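If you prefer to stay in the notebook, the following sketch deletes any existing model named my_model_1 using the same list_models() and delete_model() calls demonstrated later in this notebook. It's optional and does nothing if the name is unused:
In [0]:
# Optional: remove any existing model with the display name my_model_1 so the
# create_model() call below doesn't fail.
for existing in ml.list_models(list_filter='display_name=my_model_1').iterate_all():
    ml.delete_model(existing.model_id)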
In [0]:
firebase_model_1 = ml.create_model(sdk_model_1)
if firebase_model_1.validation_error:
    raise Exception(firebase_model_1.validation_error)
print(firebase_model_1.as_dict())
Finally, publish your model:
In [0]:
model_id = firebase_model_1.model_id
firebase_model_1 = ml.publish_model(model_id)
Now that you've published the model, you can use it in your apps.
You can update a published model with a new model file. When you do so, client apps automatically download and use the new model.
For demonstration purposes, first save one of Keras's prepackaged models as a TensorFlow SavedModel:
In [0]:
tf.saved_model.save(tf.keras.applications.MobileNet(), '/tmp/saved_model/1')
Now, convert the saved model to TensorFlow Lite and upload it to Cloud Storage. This time, you're converting a TensorFlow SavedModel to TensorFlow Lite, but you could also convert a Keras model as you did earlier, or convert a Keras model saved as an HDF5 (.h5) file.
In [0]:
# This takes the saved model directory, converts it to TFLite and writes it to your bucket as my_model_2.tflite
source2 = ml.TFLiteGCSModelSource.from_saved_model('/tmp/saved_model/1', 'my_model_2.tflite')
Change the original Model object's model source and (optionally) metadata, then call update_model():
In [0]:
model_format2 = ml.TFLiteFormat(model_source=source2)
firebase_model_1.model_format = model_format2
firebase_model_1.tags = ['tag1', 'tag2'] # replaces any existing tags with these tags.
firebase_model_2 = ml.update_model(firebase_model_1)
if firebase_model_2.validation_error:
    raise Exception(firebase_model_2.validation_error)
print(firebase_model_2.as_dict())
After you update the model, re-publish it:
In [0]:
firebase_model_2 = ml.publish_model(model_id)
print(firebase_model_2.as_dict())
If you need to get a Model object from one of your project's models, use get_model():
In [0]:
firebase_model_get = ml.get_model(model_id)
print(firebase_model_get.as_dict())
To list your project's models, iterate over the result of list_models():
In [0]:
firebase_models_list = ml.list_models()
iterator = firebase_models_list.iterate_all()
for m in iterator:
    print(m.as_dict())
The Admin SDK can help you manage projects with many models.
To demonstrate this, create some more models:
In [0]:
list_model_1 = ml.create_model(ml.Model(display_name='my_model_2', tags=['tag2', 'tag3'], model_format=model_format))
list_model_2 = ml.create_model(ml.Model(display_name='my_model_3', tags=['tag3'], model_format=model_format))
list_model_3 = ml.create_model(ml.Model(display_name='cat_model_1', tags=['cat'], model_format=model_format))
list_model_4 = ml.create_model(ml.Model(display_name='cat_model_2', tags=['cat'], model_format=model_format))
list_model_5 = ml.create_model(ml.Model(display_name='new_cat_model_007', tags=['cat'], model_format=model_format))
And publish some of them:
In [0]:
list_model_2 = ml.publish_model(list_model_2.model_id)
list_model_4 = ml.publish_model(list_model_4.model_id)
You can specify how many results to return at a time:
In [0]:
firebase_models_list_2 = ml.list_models(page_size=3)
for m in firebase_models_list_2.models:
    print(m.as_dict())
Get the next page of results:
In [0]:
firebase_models_list_3 = firebase_models_list_2.get_next_page()
for m in firebase_models_list_3.models:
    print(m.as_dict())
When you retrieve the final page, get_next_page() returns None.
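Putting those pieces together, here is a small sketch that walks every page until get_next_page() returns None, printing each model's display name. It uses only the list_models(), models, and get_next_page() calls shown above:
In [0]:
# Walk all pages of results, three models at a time, until there are no more.
page = ml.list_models(page_size=3)
while page is not None:
    for m in page.models:
        print(m.display_name)
    page = page.get_next_page()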
You can also filter the results.
Filter by display name:
In [0]:
firebase_models_list = ml.list_models(list_filter='display_name=cat_model_1')
for m in firebase_models_list.models:
    print(m.as_dict())
Filter by display name prefix (note that only prefix matching is supported; you can't do general wildcard matching):
In [0]:
firebase_models_list = ml.list_models(list_filter='display_name:cat_*')
for m in firebase_models_list.models:
    print(m.as_dict())
Filter by tag:
In [0]:
firebase_models_list = ml.list_models(list_filter='tags: cat')
for m in firebase_models_list.models:
    print(m.as_dict())
Filter by publish state:
In [0]:
firebase_models_list = ml.list_models(list_filter='state.published = true')
for m in firebase_models_list.models:
    print(m.as_dict())
Combine filters:
In [0]:
firebase_models_list = ml.list_models(list_filter='(display_name: cat_* OR tags: tag3) AND NOT state.published = true')
for m in firebase_models_list.models:
    print(m.as_dict())
That's it!
Clean up by deleting the example models:
In [0]:
ml.delete_model(model_id)
ml.delete_model(list_model_1.model_id)
ml.delete_model(list_model_2.model_id)
ml.delete_model(list_model_3.model_id)
ml.delete_model(list_model_4.model_id)
ml.delete_model(list_model_5.model_id)