Predict with Model

Initialize Model (CLI)


In [ ]:
%%bash 

pio init-model \
  --model-server-url http://prediction-python3.community.pipeline.io \
  --model-type python3 \
  --model-namespace default \
  --model-name python3_zscore \
  --model-version v1 \
  --model-path .
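
Equivalently, the same initialization can be driven from Python instead of a %%bash cell. A minimal sketch using subprocess and only the flags shown above (it assumes the pio CLI is installed and on the PATH):


In [ ]:
import subprocess

# Run the same `pio init-model` command as the %%bash cell above.
# Assumes the `pio` binary is available on the PATH.
result = subprocess.run(
    ['pio', 'init-model',
     '--model-server-url', 'http://prediction-python3.community.pipeline.io',
     '--model-type', 'python3',
     '--model-namespace', 'default',
     '--model-name', 'python3_zscore',
     '--model-version', 'v1',
     '--model-path', '.'],
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    universal_newlines=True)

print(result.stdout)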

Predict with Model (CLI)


In [ ]:
%%bash

pio predict \
  --model-test-request-path ./data/test_request.json
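
Before sending predictions, the test payload itself can be inspected locally. A minimal sketch, assuming ./data/test_request.json contains valid JSON:


In [ ]:
import json

# Load and pretty-print the test request used by the CLI and REST examples.
# Assumes ./data/test_request.json contains valid JSON.
with open('./data/test_request.json', 'r') as fh:
    test_request = json.load(fh)

print(json.dumps(test_request, indent=2))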

Predict with Model under Mini-Load (CLI)

This is a mini load test that provides instant feedback on relative performance; a rough Python-level equivalent is sketched after the cell below.


In [ ]:
%%bash

pio predict_many \
  --model-test-request-path ./data/test_request.json \
  --num-iterations 5
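
A rough Python-level equivalent can be sketched with requests and time, timing repeated POSTs against the REST endpoint constructed in the next section. This is an illustrative sketch only, not how pio predict_many is implemented:


In [ ]:
import time
import requests

# Illustrative sketch: send the same test request several times and report latency.
# The endpoint below matches the deploy_url built in the REST section that follows.
deploy_url = ('http://prediction-python3.community.pipeline.io'
              '/api/v1/model/predict/python3/default/python3_zscore/v1')

with open('./data/test_request.json', 'rb') as fh:
    model_input_binary = fh.read()

latencies = []
for _ in range(5):
    start = time.time()
    requests.post(url=deploy_url, data=model_input_binary, timeout=30)
    latencies.append(time.time() - start)

print("min/avg/max latency (s): %.3f / %.3f / %.3f" % (
    min(latencies), sum(latencies) / len(latencies), max(latencies)))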

Predict with Model (REST)

Setup Prediction Inputs


In [ ]:
import requests

model_type = 'python3'
model_namespace = 'default'
model_name = 'python3_zscore'
model_version = 'v1'

# Build the REST endpoint URL for the deployed model
deploy_url = 'http://prediction-%s.community.pipeline.io/api/v1/model/predict/%s/%s/%s/%s' % (
    model_type, model_type, model_namespace, model_name, model_version)
print(deploy_url)

# Read the same test request used by the CLI examples above
with open('./data/test_request.json', 'rb') as fh:
    model_input_binary = fh.read()

response = requests.post(url=deploy_url,
                         data=model_input_binary,
                         timeout=30)
response.raise_for_status()

print("Success!\n\n%s" % response.text)

In [ ]: