# Outlier Detection

In a production environment, it is critical to monitor the data your machine learning model runs inference on, as changes in data can adversely affect the performance of ML models.

In this demo, you learn how to identify outliers in your inference data using Alibi Detect's [VAE outlier detection](https://docs.seldon.io/projects/alibi-detect/en/stable/examples/od_vae_cifar10.html) method on image data.

In this demo, we will:

* Launch an image classification deployment with a model trained on the [CIFAR-10 dataset](https://www.cs.toronto.edu/~kriz/cifar.html).
* Set up a VAE outlier detector for this model.
* Send a request to get an image classification.
* Send a perturbed request to identify an outlier instance.

{% hint style="warning" %}
This demo requires Knative to be installed on the cluster, because the outlier detector is deployed as a Knative Service (kservice). See the [Knative installation instructions](https://docs.seldon.ai/seldon-enterprise-platform/production-environment/knative) for the required setup.
{% endhint %}

## Create a Seldon Deployment

1. In the **Overview** page, click **Create new deployment**.
2. Enter the deployment details as follows:
   * Name: *cifar10-classifier*
   * Namespace: *seldon*
   * Type: *Seldon Deployment*

![Deployment details](https://1921172648-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FvLlcdnP8YnaIFsv8EiMA%2Fuploads%2Fgit-blob-8ca7a41d838e34c8883cf873af36589897c06bfa%2Fdeployment-details-core1.png?alt=media)

3. Configure the default predictor as follows:
   * Runtime: *Triton (ONNX, PyTorch, Tensorflow, TensorRT)*
   * Model Project: *default*
   * Model URI: *gs\://seldon-models/triton/tf\_cifar10*
   * Storage Secret: *(leave blank/none)*
   * Model Name: *cifar10*

{% hint style="warning" %}
The `Model Name` is linked to the name described in the `model-settings.json` file, located in the Google Cloud Storage location. Changing the name in the JSON file would also require changing the `Model Name`, and vice versa.
{% endhint %}
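For illustration, a minimal `model-settings.json` might look like the sketch below. Only the `name` field is confirmed by this demo; the file stored alongside the model in the bucket is authoritative and may contain additional fields:

```json
{
  "name": "cifar10"
}
```

If the `name` value in this file changes, the `Model Name` field in the wizard must be updated to match.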

![Default predictor](https://1921172648-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FvLlcdnP8YnaIFsv8EiMA%2Fuploads%2Fgit-blob-2c36f745bd6bbf5509b2d0e0661e3496c7b764ab%2Fdefault-predictor-core1.png?alt=media)

4. Click `Next` through the remaining steps, then click `Launch`.
5. If your deployment is launched successfully, it will have `Available` status.

## Add an Outlier Detector

1. From the `cifar10-classifier` deployment dashboard, click `Add` inside the `Outlier Detection` card.
2. Configure the detector with the following values:
   * Detector Name: *outlier-detect*
   * Storage URI: *gs\://seldon-models/scv2/examples/cifar10/outlier-detector*
   * Project: *default*
   * Storage Secret: *(leave blank)*
   * Reply URL: *<http://seldon-request-logger.seldon-logs>*

![Outlier Detector](https://1921172648-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FvLlcdnP8YnaIFsv8EiMA%2Fuploads%2Fgit-blob-143a367149aefb94dd4e1a81ec8e21f4991b765e%2Foutlier-detector.png?alt=media)

3. Click `Create Detector` to complete the setup.
4. After a short while, the detector should become available.

## Make Predictions

Now that the outlier detector is available, we can use it to identify outliers in the inference data. We will send two requests to the model: one with a normal image, and one with a perturbed image that should be flagged as an outlier.
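As a sketch of how a perturbed instance can be produced (this is an illustrative assumption, not necessarily the exact perturbation used to generate the sample files): Gaussian noise is added to the normalized image and the result is clipped back to the valid pixel range.

```python
import numpy as np

def perturb(image: np.ndarray, noise_scale: float = 0.5, seed: int = 0) -> np.ndarray:
    """Add Gaussian noise to an image normalized to [0, 1] and clip back to range."""
    rng = np.random.default_rng(seed)
    noisy = image + rng.normal(0.0, noise_scale, size=image.shape)
    return np.clip(noisy, 0.0, 1.0)

# A CIFAR-10 image is 32x32 pixels with 3 colour channels.
frog = np.full((32, 32, 3), 0.5, dtype=np.float32)  # stand-in image for illustration
perturbed = perturb(frog)
print(perturbed.shape)  # (32, 32, 3)
```

A perturbation of this kind leaves the image recognizable to a human but pushes it away from the training distribution, which is what the VAE detector scores.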

A frog image from the CIFAR-10 dataset in the Open Inference Protocol (OIP) format:

{% file src="<https://1921172648-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FvLlcdnP8YnaIFsv8EiMA%2Fuploads%2Fgit-blob-d9e459a2c892a934617b4d08a1fca77440401f66%2Fcifar10-frog-oip.json?alt=media>" %}

A perturbed image of the same frog in the Open Inference Protocol (OIP) format:

{% file src="<https://1921172648-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FvLlcdnP8YnaIFsv8EiMA%2Fuploads%2Fgit-blob-a6008951d6647dfcc8228fcf92cbc404b2991df4%2Fcifar10-frog-perturbed-oip.json?alt=media>" %}
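The JSON files above follow the Open Inference Protocol request format. A request of this shape can be built programmatically as sketched below; the input name `input_1` is an assumption and must match the model's actual input signature:

```python
import json
import numpy as np

def build_oip_request(image: np.ndarray, input_name: str = "input_1") -> dict:
    """Wrap a single image in an Open Inference Protocol (V2) inference request."""
    batch = image[np.newaxis, ...]  # add a batch dimension: (1, 32, 32, 3)
    return {
        "inputs": [
            {
                "name": input_name,
                "shape": list(batch.shape),
                "datatype": "FP32",
                "data": batch.flatten().tolist(),
            }
        ]
    }

image = np.zeros((32, 32, 3), dtype=np.float32)
payload = build_oip_request(image)
print(payload["inputs"][0]["shape"])  # [1, 32, 32, 3]

# The payload can be saved and uploaded via the Predict page:
with open("request.json", "w") as f:
    json.dump(payload, f)
```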

1. In the deployment dashboard, click **Predict** in the left pane.
2. Click **Browse** to upload the `cifar10-frog-oip.json` file.
3. Click **Predict**. The prediction request is processed and the response is displayed.
4. Click **Remove** to remove the uploaded file.
5. Click **Browse** again to make a prediction with the perturbed image of the frog using the `cifar10-frog-perturbed-oip.json` file. The prediction request is processed and the response is displayed.
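The same requests can also be sent outside the UI with a plain HTTP client. The sketch below is a minimal example assuming a V2 inference endpoint; the ingress host and deployment path are placeholders that depend on your cluster's configuration:

```python
import json
import urllib.request

def infer_url(base_url: str, model_name: str) -> str:
    """Build the Open Inference Protocol (V2) inference endpoint for a model."""
    return f"{base_url}/v2/models/{model_name}/infer"

def infer(base_url: str, model_name: str, payload: dict) -> dict:
    """POST an OIP request and return the decoded JSON response."""
    req = urllib.request.Request(
        infer_url(base_url, model_name),
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Hypothetical usage -- replace the host with your cluster's ingress address:
# with open("cifar10-frog-oip.json") as f:
#     payload = json.load(f)
# print(infer("http://<ingress-host>/seldon/seldon/cifar10-classifier", "cifar10", payload))
```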

## View and highlight outlier detection results

Navigate to the `Requests` page using the left navigation drawer to view the requests made to the model and their prediction responses. Outlier scores will be available on the right side of each instance.

![Previously made prediction requests with their prediction responses and outlier scores](https://1921172648-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FvLlcdnP8YnaIFsv8EiMA%2Fuploads%2Fgit-blob-f2f91ef01911c3b491027672819db2f9425e2d21%2Frequests-outliers.png?alt=media)

You can also highlight outlier prediction responses using the `Highlight Outliers` toggle on the top-right of the screen.

![Highlighted outlier prediction requests with their prediction responses and outlier scores](https://1921172648-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FvLlcdnP8YnaIFsv8EiMA%2Fuploads%2Fgit-blob-48198957a31a9a95c1d0e3b32c9406625aae4437%2Frequests-highlighted-outliers.png?alt=media)

Furthermore, you can filter the requests to show only outliers by clicking the `Filter` icon in the top right (next to the page number) and selecting the `Outliers` option.

![A highlighted outlier prediction after applying a filter to show only outliers](https://1921172648-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FvLlcdnP8YnaIFsv8EiMA%2Fuploads%2Fgit-blob-a5d6223654edd5862a0b4e1afeeed1f81709e9ed%2Frequests-filtered-outliers.png?alt=media)

### Real-Time Outlier Monitoring

It is important to be able to monitor the outlier detection requests in real-time to ensure that the model is performing as expected and to take corrective actions when necessary.

1. Click **Monitor** in the left pane.
2. Select the `Outlier Detection` tab to view a timeline graph of outlier/inlier requests.

![A timeline graph showing the first request classified as an inlier and the second as an outlier](https://1921172648-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FvLlcdnP8YnaIFsv8EiMA%2Fuploads%2Fgit-blob-d67856670fd0175ac6e39c4c304439e415bd98dc%2Fmonitor-outliers.png?alt=media)

Congratulations, you've successfully demonstrated how to identify outliers in your inference data using Alibi Detect's VAE outlier detection method! 🥳

## Next Steps

Why not try our other [demos](https://docs.seldon.ai/seldon-enterprise-platform/demos)? Ready to dive in? Read our [operations guide](https://docs.seldon.ai/seldon-enterprise-platform/operations) to learn more about how to use Enterprise Platform.

## Troubleshooting

If you experience issues with this demo, see the [troubleshooting docs](https://docs.seldon.ai/seldon-enterprise-platform/help-and-support) and also the [Knative](https://docs.seldon.ai/seldon-enterprise-platform/production-environment/request-logging) or [Elasticsearch](https://docs.seldon.ai/seldon-enterprise-platform/production-environment/elasticsearch) sections.
