TensorFlow Serving MNIST

This example shows how you can combine Seldon with TensorFlow Serving. We will use a Seldon TensorFlow Serving proxy model image that forwards Seldon's internal microservice prediction calls out to a TensorFlow Serving server.

The example will use the MNIST digit classification task with the example MNIST model.

Setup

!pip install seldon-core
%matplotlib inline
import json
import sys
from random import randint, random

import numpy as np
import requests
from matplotlib import pyplot as plt
from tensorflow.examples.tutorials.mnist import input_data

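# Make the shared example-notebook helpers (e.g. the graph visualizer) importable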
sys.path.append("../../../notebooks")
import grpc
import tensorflow as tf
from tensorflow.core.framework.tensor_pb2 import TensorProto
from visualizer import get_graph

from seldon_core.proto import prediction_pb2, prediction_pb2_grpc
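
As a rough illustration of how the imports above are used, the sketch below loads the MNIST test data and defines a small helper that renders a flattened 784-float image; the helper name gen_image is an assumption for this walkthrough rather than part of any Seldon API.

# Load MNIST (downloaded to ./MNIST_data on first run) with one-hot labels.
mnist = input_data.read_data_sets("MNIST_data", one_hot=True)


def gen_image(arr):
    # Reshape a flattened 28x28 image (784 floats) and plot it in greyscale.
    two_d = (np.reshape(arr, (28, 28)) * 255).astype(np.uint8)
    plt.imshow(two_d, interpolation="nearest", cmap="gray")
    return plt


# Show a random test digit.
gen_image(mnist.test.images[randint(0, len(mnist.test.images) - 1)]).show()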

Create MNIST Model Repository

You will need TensorFlow installed to run these steps.
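
For example (a hedged suggestion: the tensorflow.examples.tutorials module used above ships only with TensorFlow 1.x, so a 1.x release is pinned here):

!pip install "tensorflow<2.0"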

Train the TensorFlow MNIST example model
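
As a hedged stand-in for the standard TensorFlow MNIST example, the minimal TF 1.x sketch below trains a softmax classifier and exports it as a SavedModel under a numbered version directory, which is the layout TensorFlow Serving loads from. The export path mnist-model/1 and the images/scores signature names are assumptions carried through the rest of this walkthrough.

# Minimal softmax-regression model on MNIST (TF 1.x style);
# `mnist` was loaded with input_data.read_data_sets in the Setup section.
x = tf.placeholder(tf.float32, [None, 784])
y_ = tf.placeholder(tf.float32, [None, 10])
W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
logits = tf.matmul(x, W) + b
scores = tf.nn.softmax(logits)

loss = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits_v2(labels=y_, logits=logits)
)
train_step = tf.train.GradientDescentOptimizer(0.5).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(1000):
        batch_xs, batch_ys = mnist.train.next_batch(100)
        sess.run(train_step, feed_dict={x: batch_xs, y_: batch_ys})

    # TensorFlow Serving expects <model_base_path>/<version>/, so export to mnist-model/1.
    tf.saved_model.simple_save(
        sess, "mnist-model/1", inputs={"images": x}, outputs={"scores": scores}
    )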

Copy the Model to a Google Cloud Storage Bucket
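
A hedged example of the copy step using the gsutil CLI from the notebook; the bucket name gs://my-seldon-tfserving-bucket is hypothetical, so substitute a bucket you own.

!gsutil mb gs://my-seldon-tfserving-bucket
!gsutil cp -r mnist-model gs://my-seldon-tfserving-bucket/mnist-model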

Test From GCP Cluster
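
A hedged sketch of calling the TensorFlow Serving instance directly over gRPC from inside the cluster (it needs the tensorflow-serving-api package; the tfserving-service:8500 address, the mnist-model name and the serving_default signature are assumptions that must match your deployment).

from tensorflow_serving.apis import predict_pb2, prediction_service_pb2_grpc

# Connect to the TensorFlow Serving gRPC port.
channel = grpc.insecure_channel("tfserving-service:8500")
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = "mnist-model"
request.model_spec.signature_name = "serving_default"

# Send one flattened test image as the "images" input defined at export time.
image = mnist.test.images[0]
request.inputs["images"].CopyFrom(tf.make_tensor_proto(image, shape=[1, 784]))

response = stub.Predict(request, timeout=10.0)
print(response.outputs["scores"])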

Set Up Seldon Core

The setup instructions are also available online.

Run MNIST Inference Graph
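
A hedged sketch of the SeldonDeployment for this example, written as a Python dict so it can be dumped to JSON and applied with kubectl; the image tags, the proxy parameter names, the bucket path and the seldon namespace are all assumptions that should be checked against the tfserving-proxy documentation.

tfserving_mnist = {
    "apiVersion": "machinelearning.seldon.io/v1alpha2",
    "kind": "SeldonDeployment",
    "metadata": {"name": "tfserving-mnist"},
    "spec": {
        "name": "tfserving-mnist",
        "predictors": [
            {
                "name": "default",
                "replicas": 1,
                "componentSpecs": [
                    {
                        "spec": {
                            "containers": [
                                {
                                    # Seldon proxy that forwards prediction calls to TF Serving.
                                    "name": "tfserving-proxy",
                                    "image": "seldonio/tfserving-proxy:0.1",
                                },
                                {
                                    # Stock TF Serving container loading the model from the bucket.
                                    "name": "tfserving",
                                    "image": "tensorflow/serving:latest",
                                    "args": [
                                        "--port=8500",
                                        "--model_name=mnist-model",
                                        "--model_base_path=gs://my-seldon-tfserving-bucket/mnist-model",
                                    ],
                                },
                            ]
                        }
                    }
                ],
                "graph": {
                    "name": "tfserving-proxy",
                    "type": "MODEL",
                    "endpoint": {"type": "GRPC"},
                    # Parameters telling the proxy where and how to reach TF Serving.
                    "parameters": [
                        {"name": "grpc_endpoint", "type": "STRING", "value": "localhost:8500"},
                        {"name": "model_name", "type": "STRING", "value": "mnist-model"},
                        {"name": "model_input", "type": "STRING", "value": "images"},
                        {"name": "model_output", "type": "STRING", "value": "scores"},
                    ],
                },
            }
        ],
    },
}

with open("tfserving_mnist.json", "w") as f:
    json.dump(tfserving_mnist, f, indent=2)

!kubectl apply -f tfserving_mnist.json -n seldon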

Figure: visualization of the Seldon inference graph for the TensorFlow Serving MNIST deployment.

Port-forward Ambassador
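
Assuming Ambassador has been port-forwarded to localhost:8003 in a separate terminal, and that the deployment above is named tfserving-mnist in the seldon namespace (both assumptions), a prediction can be sent through the Seldon REST API as follows.

# Pick a random test image, show it, and send it through Ambassador.
idx = randint(0, len(mnist.test.images) - 1)
image = mnist.test.images[idx]
gen_image(image).show()

payload = {"data": {"ndarray": [image.tolist()]}}
response = requests.post(
    "http://localhost:8003/seldon/seldon/tfserving-mnist/api/v1.0/predictions",
    json=payload,
)
print("True label:", np.argmax(mnist.test.labels[idx]))
print(json.dumps(response.json(), indent=2))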

Figure: a sample MNIST digit and the prediction returned by the deployment.

Analytics and Load Test

You should port-forward the Grafana dashboard.
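
For example (a hedged command that assumes the analytics installation labels the Grafana pod app=grafana-prom-server; it blocks, so run it in a separate terminal if you prefer):

!kubectl port-forward $(kubectl get pods -l app=grafana-prom-server -o jsonpath='{.items[0].metadata.name}') 3000:3000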

You can then view an analytics dashboard inside the cluster at http://localhost:3000/dashboard/db/prediction-analytics?refresh=5s&orgId=1.
