Outlier Detection

In a production environment, monitoring the data used for your machine learning model's inferences is essential, as data changes can significantly impact the performance of the model.

Using Alibi Detect's variational auto-encoder (VAE) outlier detection method, this demo helps you identify outliers in your inference data by:

  • Launching an image classifier model trained on the CIFAR-10 dataset. The data instances are 32x32-pixel RGB images (32x32x3 values) classified into 10 classes such as truck, frog, and cat.

  • Setting up a VAE outlier detector for this model.

  • Sending a request to get an image classification.

  • Sending a perturbed request to identify an outlier instance.
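
Under the hood, the deployed detector is a saved Alibi Detect OutlierVAE model. The following sketch shows roughly what it does when it scores an instance, assuming you have installed alibi-detect and downloaded the saved detector locally (for example, the contents of the Storage URI used later in this demo); the local path is an example, the exact import path depends on your alibi-detect version, and in this demo the platform runs this step for you automatically.

```python
import numpy as np
from alibi_detect.saved import load_detector  # older versions expose this under alibi_detect.utils.saving

# Local copy of the saved VAE detector, e.g. downloaded from
# gs://seldon-models/scv2/examples/cifar10/outlier-detector (path below is an example).
od = load_detector("./cifar10-outlier-detector")

# A batch of CIFAR-10 images with shape (N, 32, 32, 3), scaled to [0, 1].
# A random array stands in for a real image here.
X = np.random.rand(1, 32, 32, 3).astype(np.float32)

preds = od.predict(X, outlier_type="instance", return_instance_score=True)
print(preds["data"]["is_outlier"])       # 1 = outlier, 0 = inlier
print(preds["data"]["instance_score"])   # VAE reconstruction-based outlier score
```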

Create a Seldon ML Pipeline

  1. In the Overview page, click Create new deployment.

  2. Enter the deployment details as follows:

    • Name: cifar10-classifier

    • Namespace: seldon

    • Type: Seldon ML Pipeline

  3. Configure the default predictor as follows:

    • Runtime: Tensorflow

    • Model Project: default

    • Model URI: gs://seldon-models/triton/tf_cifar10

    • Storage Secret: (leave blank/none)

  4. Click Next for the remaining steps, then click Launch.
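
After the deployment is launched, you can optionally verify that it answers Open Inference Protocol (OIP) requests. The sketch below uses the standard OIP readiness and metadata endpoints; the base URL is a placeholder for your installation's endpoint, and the model name in the path is assumed to match the deployment name.

```python
import requests

BASE_URL = "http://<your-seldon-endpoint>"   # placeholder: use your installation's endpoint
MODEL = "cifar10-classifier"                 # assumed to match the deployment name

# Standard OIP readiness and metadata endpoints.
ready = requests.get(f"{BASE_URL}/v2/models/{MODEL}/ready")
metadata = requests.get(f"{BASE_URL}/v2/models/{MODEL}")

print(ready.status_code)   # 200 once the model is available
print(metadata.json())     # input/output tensor names, shapes, and datatypes
```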

Add an Outlier Detector

  1. In the Overview page, select the pipeline that you created.

  2. In the Deployment Dashboard, click Add in the OUTLIER DETECTION card.

  3. Configure the detector with these parameters:

    • Detector Name: cifar10-outlier

    • Storage URI: gs://seldon-models/scv2/examples/cifar10/outlier-detector

    • Reply URL: Leave as the default value.

Note: The default value is http://seldon-request-logger.seldon-logs. If you are using a custom installation, change this parameter to match your installation.

  4. Click Create Detector. After some time, the status of the detector changes to Available.


Make Predictions

Now that the outlier detector is available, you can use it to identify outliers in the inference data. Send two requests to the model: one with a normal image and one with a perturbed image that the detector should flag as an outlier.

The demo uses two request files:

  • cifar10-frog-oip.json: a frog image from the CIFAR-10 dataset in the Open Inference Protocol (OIP) format.

  • cifar10-frog-perturbed-oip.json: a perturbed image of the same frog in OIP format.
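
If you want to build request files like these yourself, the following sketch shows one way to do it with TensorFlow and NumPy. The input tensor name input_1 and the Gaussian-noise perturbation are assumptions for illustration; check the model's metadata for the actual tensor name.

```python
import json
import numpy as np
import tensorflow as tf

# Load the CIFAR-10 test set and scale pixel values to [0, 1].
(_, _), (x_test, y_test) = tf.keras.datasets.cifar10.load_data()
x_test = x_test.astype("float32") / 255.0

frog = x_test[np.where(y_test.flatten() == 6)[0][0]]   # CIFAR-10 class 6 = frog

# Perturb the image with Gaussian noise so the VAE reconstruction error rises.
rng = np.random.default_rng(0)
perturbed = np.clip(frog + rng.normal(0.0, 0.4, frog.shape), 0.0, 1.0).astype("float32")

def oip_request(image):
    """Wrap a single 32x32x3 image in an OIP (V2) inference request."""
    return {
        "inputs": [{
            "name": "input_1",              # assumed input tensor name
            "shape": [1, 32, 32, 3],
            "datatype": "FP32",
            "data": image[None].flatten().tolist(),
        }]
    }

with open("cifar10-frog-oip.json", "w") as f:
    json.dump(oip_request(frog), f)
with open("cifar10-frog-perturbed-oip.json", "w") as f:
    json.dump(oip_request(perturbed), f)
```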

  1. In the Deployment Dashboard, click Predict in the left pane.

  2. Click Browse to upload the cifar10-frog-oip.json file.

  3. Click Predict. The prediction request is processed and the response is displayed.

  4. Click Remove to remove the uploaded file.

  5. Click Browse again and upload the cifar10-frog-perturbed-oip.json file.

  6. Click Predict to make a prediction with the perturbed image of the frog.
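
The same two predictions can also be sent outside the UI by posting the JSON files to the deployment's OIP inference endpoint. In this sketch the base URL is a placeholder, and the response is assumed to contain a single output tensor holding the 10 class probabilities as a flat list.

```python
import json
import requests

BASE_URL = "http://<your-seldon-endpoint>"   # placeholder: use your installation's endpoint
URL = f"{BASE_URL}/v2/models/cifar10-classifier/infer"

for path in ["cifar10-frog-oip.json", "cifar10-frog-perturbed-oip.json"]:
    with open(path) as f:
        payload = json.load(f)
    response = requests.post(URL, json=payload)
    probs = response.json()["outputs"][0]["data"]   # assumed: 10 class probabilities
    predicted = max(range(len(probs)), key=probs.__getitem__)
    print(f"{path}: predicted class index {predicted}")
```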

View Outliers From Request Logs

Navigate to the Requests page in the left pane to view the requests made to the model and their prediction responses. Outlier scores are displayed to the right of each instance.

Previously made prediction requests with their prediction responses and outlier scores

You can also highlight and filter outlier requests by enabling Highlight Outliers.

Highlighted outlier prediction requests with their prediction responses and outlier scores
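
The highlighting follows a simple rule: a request is flagged as an outlier when its instance-level score exceeds the threshold stored with the saved detector. The values below are hypothetical and only illustrate the comparison.

```python
# Hypothetical numbers, not real log output.
instance_score = 0.012   # example score shown next to a request
threshold = 0.002        # example threshold stored with the saved detector
print("outlier" if instance_score > threshold else "inlier")
```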

Real-Time Outlier Monitoring

Monitoring outlier detection requests in real time is important to ensure that the model is performing as expected and to take corrective action when necessary.

  1. Click Monitor in the left pane.

  2. Select the Outlier Detection tab to view a timeline graph of outlier/inlier requests.

A timeline graph showing the first request classified as an inlier and the second as an outlier

Troubleshooting

If you experience issues with this demo, see the troubleshooting docs, as well as the Knative and Elasticsearch sections.
