Batch Prediction Jobs
Pre-requisites
MinIO should already be installed with Seldon Enterprise Platform. The MinIO browser should be exposed on /minio/ (note the trailing forward slash).
For trials, the credentials will by default be the same as the Enterprise Platform login, with MinIO using the email as its Access Key and the password as its Secret Key.
On a production cluster, the namespace needs to have been set up with a service account; the setup steps can be found in the Argo install documentation.
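If you want to check connectivity and credentials before starting, here is a minimal sketch using the MinIO Python client; the endpoint, access key, and secret key are placeholders you need to replace with your cluster's values.

```python
from minio import Minio  # pip install minio

# Placeholders: substitute your MinIO endpoint (host:port, no scheme) and the
# credentials described above (for trials, the Enterprise Platform email/password).
client = Minio(
    "minio.example.com",
    access_key="user@example.com",
    secret_key="your-password",
    secure=True,
)

# Listing buckets is a cheap way to confirm the credentials are accepted.
for bucket in client.list_buckets():
    print(bucket.name)
```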
This demo helps you learn about:
Creating a deployment with a pre-trained scikit-learn iris model
Running a batch job to get predictions
Checking the output
Create a Deployment
Click on the Create new deployment button. Enter the deployment details as follows:
Name: batch-demo
Namespace: seldon
Type: Seldon Deployment

Configure the default predictor as follows:
Runtime: Scikit Learn
Model URI:
gs://seldon-models/scv2/samples/mlserver_1.6.0/iris-sklearn
Model Project: default
Storage Secret: (leave blank/none)
Model Name: iris
The Model Name is linked to the name described in the model-settings.json file, located in the Google Cloud Storage location. Changing the name in the JSON file would also require changing the Model Name, and vice versa.
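If you want to confirm which Model Name to use, the sketch below fetches the model-settings.json over HTTPS and prints its name field. It assumes the gs:// Model URI above maps to a publicly readable object under storage.googleapis.com.

```python
import json
import urllib.request

# Assumption: the public GCS bucket is readable over HTTPS at the mirrored URL.
url = ("https://storage.googleapis.com/seldon-models/"
       "scv2/samples/mlserver_1.6.0/iris-sklearn/model-settings.json")

with urllib.request.urlopen(url) as resp:
    settings = json.load(resp)

# The "name" field is what the wizard's Model Name field must match.
print(settings["name"])  # expected: iris
```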
Skip the remaining steps by clicking Next, then click Launch. If your deployment is launched successfully, it will have Available status on the Overview page.
Setup Input Data
Download the input data file iris-input.txt.
The first few lines of the input file `iris-input.txt` should show the following format:
{"inputs":[{"name":"predict","data":[0.38606369295833043,0.006894049558299753,0.6104082981607108,0.3958954239450676],"datatype":"FP64","shape":[1,4]}]}
{"inputs":[{"name":"predict","data":[0.7223678219956075,0.608521741883582,0.8596266157372878,0.20041864827775757],"datatype":"FP64","shape":[1,4]}]}
{"inputs":[{"name":"predict","data":[0.8659159480026418,0.2383384971368594,0.7743518759043038,0.8748919374334038],"datatype":"FP64","shape":[1,4]}]}Go to the MinIO browser and use the button in the bottom-right to create a bucket. Call it
data.Again from the bottom-right choose to upload the
iris-input.txtfile to thedatabucket.
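If you prefer to prepare and upload the input programmatically, here is a minimal sketch that writes a file in the same one-request-per-line format and pushes it to the data bucket with the MinIO Python client. The endpoint and credentials are placeholders, as in the prerequisites check above.

```python
import json
import random

from minio import Minio  # pip install minio

# Write Open Inference Protocol requests, one JSON document per line,
# matching the format of iris-input.txt shown above.
with open("iris-input.txt", "w") as f:
    for _ in range(100):
        request = {
            "inputs": [{
                "name": "predict",
                "data": [random.random() for _ in range(4)],
                "datatype": "FP64",
                "shape": [1, 4],
            }]
        }
        f.write(json.dumps(request) + "\n")

# Placeholders: use your MinIO endpoint and credentials.
client = Minio("minio.example.com", access_key="user@example.com",
               secret_key="your-password", secure=True)
if not client.bucket_exists("data"):
    client.make_bucket("data")
client.fput_object("data", "iris-input.txt", "iris-input.txt")
```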
Run a Batch Job
Click the new deployment batch-demo in the Overview page. Click Batch Jobs in the left pane.
Click Create Your First Job, enter the following details, and click Submit:
Input Data Location: minio://data/iris-input.txt
Output Data Location: minio://data/iris-output-{{workflow.name}}.txt
Number of Workers: 5
Number of Retries: 3
Batch Size: 10
Minimum Batch Wait Interval (sec): 0
Method: Predict
Transport Protocol: REST
Input Data Type: Open Inference Protocol (OIP)
Object Store Secret Name: minio-bucket-envvars
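To see what a single prediction in the batch looks like, the sketch below sends one line of iris-input.txt to the deployment as an Open Inference Protocol request over REST. The host and path are assumptions; adjust them (and add any authentication headers your ingress requires) to match how batch-demo is exposed in your cluster.

```python
import json

import requests  # pip install requests

# Assumed endpoint layout: https://<host>/seldon/<namespace>/<deployment>/v2/models/<model>/infer
url = "https://<your-host>/seldon/seldon/batch-demo/v2/models/iris/infer"

# Take one request line from the input file.
with open("iris-input.txt") as f:
    payload = json.loads(f.readline())

resp = requests.post(url, json=payload)
print(resp.status_code, resp.json())
```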
Give the job a couple of minutes to complete, then refresh the page to see the status.
Inspect the output file in MinIO.

If you open that file you should see contents such as:
{"model_name":"","outputs":[{"data":[0],"name":"predict","shape":[1],"datatype":"INT64"}],"parameters":{"batch_index":0}}
{"model_name":"","outputs":[{"data":[0],"name":"predict","shape":[1],"datatype":"INT64"}],"parameters":{"batch_index":2}}
{"model_name":"","outputs":[{"data":[1],"name":"predict","shape":[1],"datatype":"INT64"}],"parameters":{"batch_index":4}}
{"model_name":"","outputs":[{"data":[0],"name":"predict","shape":[1],"datatype":"INT64"}],"parameters":{"batch_index":1}}If not, see the argo section for troubleshooting.