Peltarion prediction server

The Peltarion prediction server (the Model with Docker container definition download option) lets you host your downloaded model as a Docker container.

Docker helps millions of developers build, share, and run applications anywhere, on-premises or in the cloud.
You can use it to wrap the models you’ve trained on the platform and deploy them easily where you want.

Prerequisites

  • Build and train a model.

  • Download and install Docker to create and deploy the image containing your model.


Run model as Docker container

To run your model as a container:

  1. Download your model.
    Make sure to select the Model with Docker container definition option to get all the necessary files.

  2. Extract the downloaded zip.
    The zip contains the SavedModel file and all extra files that define how your model can be deployed.

  3. In a terminal, navigate to the extracted zip folder.
    Run this command to generate the Docker image containing your trained model:

docker build -t model-export -f Dockerfile .

Now your model is ready to run locally or within your registry.

Run locally

In a new terminal, navigate to the extracted zip folder.
Run this command to start the image you just built as a local container:

docker run -p 8000:8000 model-export

You are now running your deep learning model on your computer.

Run with your Docker registry

You can run your model from your own Docker registry in many ways. See the Docker documentation for guidance.


Get predictions from the deployed model

The deployed model exposes three endpoints: regress, classify, and predict.

Note that the API used by the containerized model differs slightly from the Deployment API used on the Peltarion Platform, which is the API shown in the Deployment view.

The response is forwarded directly from the TensorFlow Serving (TF Serving) REST API without post-processing.


Regress endpoint

Use the regress endpoint if:

  • Your deployed model is solving a regression problem.

  • The target feature encoding is Numeric.

Input

The input data format should be a list of records, where each record is a mapping from feature labels to the feature data. You can send one or several instances in one batch.

Example:
Generic example with two example instances (feature_1_Label, instance1_feature_1_Data, etc. are placeholder names; replace them with your model's features and data).

curl -X POST http://localhost:8000/regress -H "Content-Type: application/json" \
  -d '{"instances": [
    {"feature_1_Label": instance1_feature_1_Data, "feature_2_Label": instance1_feature_2_Data},
    {"feature_1_Label": instance2_feature_1_Data, "feature_2_Label": instance2_feature_2_Data}]}'

Example:
Regress endpoint with one example based on the tutorial Sales forecasting with spreadsheet integration (this model uses only the Store Name and Location features):

curl -X POST http://localhost:8000/regress -H "Content-Type: application/json" -d \
  '{"instances":[{"Store Name": "Gazelle","Location": "Göteborg"},{"Store Name": "Quail","Location": "Stockholm"}]}'

Output response

The response from the regress endpoint is:

{
  // One regression value for each example in the request in the same order.
  "result": [ <value1>, <value2>, <value3>, ...]
}
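For scripted clients, the same request and response handling can be sketched in Python. This is a minimal sketch using only the standard library; the feature names are taken from the tutorial example above, and the response values are made up for illustration.

```python
import json

def build_payload(records):
    """Wrap a list of feature records in the {"instances": [...]} envelope."""
    return json.dumps({"instances": records})

def parse_regress_response(body):
    """Extract the list of regression values from a /regress response body."""
    return json.loads(body)["result"]

payload = build_payload([
    {"Store Name": "Gazelle", "Location": "Göteborg"},
    {"Store Name": "Quail", "Location": "Stockholm"},
])
# Parsing a response shaped like the one documented above:
values = parse_regress_response('{"result": [123.4, 567.8]}')
```

The payload string can then be sent with any HTTP client in place of the curl `-d` argument.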

Classify endpoint

Use the classify endpoint if:

  • Your model is solving a classification problem.

  • The target feature encoding is Binary or Categorical.

Input

The input data format should be a list of records, where each record is a mapping from feature labels to the feature data.

Example:
A batch prediction with two example instances for a tabular classify endpoint (feature_1_Label, instance1_feature_1_Data, etc. are placeholder names; replace them with your model's features and data).

curl -X POST http://localhost:8000/classify -H "Content-Type: application/json" \
  -d '{"instances": [
    {"feature_1_Label": instance1_feature_1_Data, "feature_2_Label": instance1_feature_2_Data},
    {"feature_1_Label": instance2_feature_1_Data, "feature_2_Label": instance2_feature_2_Data}]}'

Example:
Text classification endpoint with one example based on the tutorial Movie review feelings:

curl -X POST http://localhost:8000/classify -H "Content-Type: application/json" -d  \
'{"instances":[{"review": "The movie was great!"}]}'

Output response

The response from the classify endpoint is:

{
  "result": [
    [ [<label1>, <score1>], [<label2>, <score2>], ... ],
    [ [<label1>, <score1>], [<label2>, <score2>], ... ]
    ...
    ]
}
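Since each instance comes back as a list of [label, score] pairs, picking the most likely class for each instance takes a small helper. This is a sketch; the label names here are hypothetical.

```python
import json

def top_labels(body):
    """Return the highest-scoring label for each instance in a /classify response."""
    result = json.loads(body)["result"]
    return [max(pairs, key=lambda pair: pair[1])[0] for pairs in result]

# A hypothetical response with one instance and two candidate labels:
labels = top_labels('{"result": [[["negative", 0.2], ["positive", 0.8]]]}')
# → ["positive"]
```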

Predict endpoint

Use the predict endpoint if your model doesn't fit the regress or classify endpoints, for example when it uses image input features or combines several input and output features.

Input

The input data format should be a list of records, where each record is a mapping from feature labels to the feature data.

Example:
A prediction with one example instance for the predict endpoint (feature_1_Label, instance1_feature_1_Data, etc. are placeholder names; replace them with your model's features and data).

curl -X POST http://localhost:8000/predict -H "Content-Type: application/json" \
  -d '{"instances": [
    {"<feature_1_Label>": <instance1_feature_1_Data>, "<feature_2_Label>": <instance1_feature_2_Data>}]}'

Example:
Predict endpoint with one example based on the tutorial Predict real estate prices:

image=$(base64 Challenge.png)
curl -X POST http://localhost:8000/predict -H "Content-Type: application/json" -d \
'{"instances": [{"housingMedianAge": 52.0, "totalRooms": 3104.0, "totalBedrooms": 280.0, "population": 648.0, "households": 331.0, "medianIncome": 2.125, "image_path": "'$image'"}]}'

Example:
Predict endpoint with one example based on the tutorial Build your own music critic:

image=$(base64 digit-8.jpg)
curl -X POST http://localhost:8000/predict -H "Content-Type: application/json" \
'{"instances":[{"Image": "'$image'"}]}'

Output response

The response from the predict endpoint is:

{
  "predictions": <value>|<(nested)list>|<list-of-objects>
}
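Because the "predictions" payload can take several shapes, a client should treat it as opaque JSON and interpret it per model. A minimal sketch, with a made-up single-output response for illustration:

```python
import json

def parse_predict_response(body):
    """Return the "predictions" payload from a /predict response.
    Its shape depends on the model: a single value, a (nested) list,
    or a list of objects."""
    return json.loads(body)["predictions"]

# For a single-output model the payload can be a flat list of values:
preds = parse_predict_response('{"predictions": [321200.5]}')
```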

Sending images

Images need to be base64 encoded.

Example:
Save the image you want to use in the extracted zip folder (we’ve named it digit-8.jpg in this example).
In a terminal, navigate to the extracted zip folder and run this command to base64 encode the image:

image=$(base64 digit-8.jpg)

Now call your model to get predictions with this curl command (Input_feature_name is a dummy name. Change to the name of your input feature):

curl -X POST http://localhost:8000/classify -H "Content-Type: application/json" -d  \
'{"instances":[{"Input_feature_name": "'$image'"}]}'

Example:
Python example that builds the same payload:

import base64

with open("<path to image>", "rb") as f:
    image_bytes = f.read()

data = {
    "instances": [
        {
            "Image": base64.encodebytes(image_bytes).decode("utf-8")
        }
    ]
}
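The snippet above builds the payload but stops short of sending it. A minimal sketch of the actual request using only the standard library; the URL and the Image feature name are assumptions based on the local container from earlier.

```python
import base64
import json
import urllib.request

def make_request(url, instances):
    """Build a POST request carrying the {"instances": [...]} JSON payload."""
    body = json.dumps({"instances": instances}).encode("utf-8")
    return urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )

# Encode some bytes the same way as above (real code would read an image file):
image_b64 = base64.encodebytes(b"\x00\x01\x02").decode("utf-8")
req = make_request("http://localhost:8000/classify", [{"Image": image_b64}])
# urllib.request.urlopen(req) would then return the prediction response.
```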