Triton Examples

Prerequisites

  • For the test data you will need to install torch, torchvision, and tensorflow (see the install sketch after this list)

  • For visualization, matplotlib

  • For calling the service, curl
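
A minimal install sketch for these prerequisites (exact versions are not pinned here; adjust to your environment):

!pip install torch torchvision tensorflow matplotlib

curl is usually already present, or can be installed from your system package manager (for example apt-get install curl on Debian/Ubuntu).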

Setup Seldon Core

Follow the instructions to Setup Cluster with Ambassador Ingress and Install Seldon Core.

Then, in a separate terminal, port-forward to that ingress on localhost:8003 with either:

  • Ambassador: kubectl port-forward $(kubectl get pods -n seldon -l app.kubernetes.io/name=ambassador -o jsonpath='{.items[0].metadata.name}') -n seldon 8003:8080

  • Istio: kubectl port-forward $(kubectl get pods -l istio=ingressgateway -n istio-system -o jsonpath='{.items[0].metadata.name}') -n istio-system 8003:8080

Create Namespace for experimentation

We will first set up the seldon namespace, where we will deploy all our models:

!kubectl create namespace seldon
namespace/seldon created

Then we will set the current context to use the seldon namespace, so all our commands run there by default (instead of in the default namespace):

!kubectl config set-context $(kubectl config current-context) --namespace=seldon

Triton Model Naming

You need to give the model in the inference graph the same name as the Triton model being loaded, as this name is used in the request path forwarded to Triton.
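
For example (a sketch; the model name cifar10 and the deployment name are hypothetical):

graph:
  name: cifar10              # must match the model name in the Triton model repository
  implementation: TRITON_SERVER
  type: MODEL

With the V2 inference protocol this name also appears in the request path routed to Triton, e.g. /seldon/seldon/<deployment-name>/v2/models/cifar10/infer.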

TensorFlow CIFAR10 Model

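A sketch of what such a deployment could look like (the modelUri is a placeholder for a bucket containing a Triton model repository with a TensorFlow SavedModel named cifar10; protocol kfserving selects the V2 inference protocol):

kubectl apply -f - << END
apiVersion: machinelearning.seldon.io/v1alpha2
kind: SeldonDeployment
metadata:
  name: tfcifar10
  namespace: seldon
spec:
  protocol: kfserving
  predictors:
  - name: default
    replicas: 1
    graph:
      name: cifar10                               # same name as the model in the Triton repository
      implementation: TRITON_SERVER
      type: MODEL
      modelUri: gs://<your-bucket>/tf_cifar10     # placeholder: Triton model repository location
END

Once the deployment is available, the model metadata and predictions can be requested through the port-forwarded ingress, for example:

curl -s http://localhost:8003/seldon/seldon/tfcifar10/v2/models/cifar10
curl -s -H "Content-Type: application/json" -d @input.json \
    http://localhost:8003/seldon/seldon/tfcifar10/v2/models/cifar10/infer

Here input.json is a hypothetical V2-protocol request built from a CIFAR10 image (shape [1, 32, 32, 3], datatype FP32).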

ONNX CIFAR10 Model

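The deployment follows the same pattern as the TensorFlow model above; only the graph stanza changes. A sketch, assuming a placeholder bucket whose Triton repository stores the model as model.onnx (Triton's expected file name for ONNX models):

graph:
  name: cifar10                                  # matches <repository>/cifar10
  implementation: TRITON_SERVER
  type: MODEL
  modelUri: gs://<your-bucket>/onnx_cifar10      # placeholder: repository containing cifar10/1/model.onnx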

TorchScript CIFAR10 Model

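Again only the modelUri changes in this sketch. Note that Triton expects a TorchScript model to be stored as model.pt in the version directory and, unlike the TensorFlow and ONNX backends, it generally needs an explicit config.pbtxt describing the input and output tensors:

graph:
  name: cifar10                                        # matches <repository>/cifar10
  implementation: TRITON_SERVER
  type: MODEL
  modelUri: gs://<your-bucket>/torchscript_cifar10     # placeholder: repository containing cifar10/1/model.pt and cifar10/config.pbtxt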


