Run many Batch Normalization experiments using Cloud ML Engine


In [1]:
# change these to try this notebook out
BUCKET = 'crawles-sandbox'  # change this to your GCP bucket
PROJECT = 'crawles-sandbox'  # change this to your GCP project
REGION = 'us-central1'

# Import os environment variables
import os
os.environ['BUCKET'] = BUCKET
os.environ['PROJECT'] = PROJECT
os.environ['REGION'] = REGION

Let’s test how Batch Normalization impacts models of varying depths. We can launch many experiments in parallel using Google Cloud ML Engine. We will fire off 14 jobs with varying hyperparameters:

  • With and without Batch Normalization
  • Varying model depths from 1 hidden layer to 7 hidden layers
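The grid of jobs can be enumerated up front. A minimal sketch (the layer widths match those used in the submission loop below): each depth takes a prefix of the width list, crossed with the two batch-normalization settings.

```python
from itertools import product

# Depths 1..7 take prefixes of the layer widths used in the bash loop below:
# the first job trains [500], the second [500, 400], and so on.
widths = [500, 400, 300, 200, 100, 50, 25]
configs = [
    (widths[: depth + 1], use_bn)
    for use_bn, depth in product([False, True], range(len(widths)))
]
print(len(configs))  # 2 batch-norm settings x 7 depths = 14 jobs
```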

We use the tf.estimator API to build the model and train it on Cloud ML Engine.
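The trainer package itself lives in `mnist_classifier/trainer/` and is not reproduced here, but a hypothetical sketch of how `task.py` might parse the flags passed after the `--` separator (the flag names match the submission command below; the parsing code shown is an assumption, not the actual package source):

```python
import argparse

# Hypothetical flag parsing for trainer/task.py; the comma-separated
# --hidden_units string becomes a list of layer widths that could feed
# e.g. tf.estimator.DNNClassifier(hidden_units=args.hidden_units, ...).
parser = argparse.ArgumentParser()
parser.add_argument('--outdir', required=True)
parser.add_argument('--hidden_units',
                    type=lambda s: [int(x) for x in s.split(',')])
parser.add_argument('--num_steps', type=int, default=10)
parser.add_argument('--use_batch_normalization', action='store_true')

args = parser.parse_args(
    ['--outdir', 'gs://my-bucket/out',
     '--hidden_units', '500,400,300',
     '--num_steps', '10',
     '--use_batch_normalization'])
print(args.hidden_units, args.use_batch_normalization)
```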


In [2]:
!ls mnist_classifier/
!ls mnist_classifier/trainer/


PKG-INFO  setup.cfg setup.py  trainer
__init__.py __pycache__ model.py    task.py

In [ ]:
%%bash
submitMLEngineJob() {
    gcloud ml-engine jobs submit training $JOBNAME \
        --package-path=$(pwd)/mnist_classifier/trainer \
        --module-name trainer.task \
        --region $REGION \
        --staging-bucket=gs://$BUCKET \
        --scale-tier=BASIC \
        --runtime-version=1.4 \
        -- \
        --outdir $OUTDIR \
        --hidden_units $net \
        --num_steps 10 \
        $batchNorm
}

# submit for different layer sizes
export PYTHONPATH=${PYTHONPATH}:${PWD}/mnist_classifier
for batchNorm in '' '--use_batch_normalization'
do
    net=''
    for layer in 500 400 300 200 100 50 25;
    do
        net=$net$layer
        netname=${net//,/_}${batchNorm/--use_batch_normalization/_bn}
        echo $netname
        JOBNAME=mnist${netname}_$(date -u +%y%m%d_%H%M%S)
        OUTDIR=gs://${BUCKET}/mnist_models/mnist_model$netname/trained_model
        echo $OUTDIR $REGION $JOBNAME
        gsutil -m rm -rf $OUTDIR
        submitMLEngineJob
        net=$net,
    done
done
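The bash loop builds each job's name by joining the accumulated layer widths with underscores and appending `_bn` for the batch-normalized runs. A pure-Python sketch of the same naming logic, handy for previewing the 14 names the loop will generate:

```python
# Replicates the netname construction from the bash loop above:
# widths joined by '_', with a '_bn' suffix for batch-normalized runs.
widths = [500, 400, 300, 200, 100, 50, 25]
names = []
for use_bn in (False, True):
    net = []
    for layer in widths:
        net.append(str(layer))
        names.append('_'.join(net) + ('_bn' if use_bn else ''))

for name in names:
    print(name)  # '500', '500_400', ..., '500_400_300_200_100_50_25_bn'
```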

Copyright 2018 Google Inc. All Rights Reserved. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.