Image Explanations

Understanding how complex models make predictions is essential for promoting transparency and trust, and for identifying potential biases. Model explainers offer insight into how features influence outcomes, supporting debugging and model refinement.

We will demonstrate model explanations using Alibi Explain's Anchor Images method, highlighting the segments of an input image that influenced the prediction the most and observing the Anchor's Precision and Coverage metrics.
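
To give a feel for what the method computes, here is a minimal local sketch using the open-source Alibi Explain library; the model file, preprocessing, and parameter values are illustrative assumptions, since Enterprise Platform runs the equivalent explainer for you server-side.

```python
# Minimal sketch of Alibi Explain's AnchorImage on a CIFAR10-style classifier.
# The model file and segmentation parameters are illustrative assumptions.
import numpy as np
from alibi.explainers import AnchorImage
from tensorflow.keras.models import load_model

model = load_model("cifar10_model.h5")  # hypothetical locally trained classifier

def predict_fn(x: np.ndarray) -> np.ndarray:
    # Batch of images in, class probabilities out.
    return model.predict(x)

explainer = AnchorImage(
    predict_fn,
    image_shape=(32, 32, 3),
    segmentation_fn="slic",  # superpixel segmentation defines candidate segments
    segmentation_kwargs={"n_segments": 10, "compactness": 20},
)

image = np.random.rand(32, 32, 3).astype(np.float32)  # stand-in for a real image
explanation = explainer.explain(image, threshold=0.95)

print(explanation.precision)  # fraction of perturbed samples that keep the prediction
print(explanation.coverage)   # how often the anchor applies across perturbations
# explanation.anchor holds the image masked to the most influential segments
```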

In this demo we will:

  • Launch an image classification deployment

  • Send prediction requests to the deployment

  • Create an explainer for the deployment

  • Generate explanations for previously sent prediction requests

The model used in this demo was trained to classify images based on the CIFAR10 dataset.

Create a Seldon Deployment

  1. Click on the Create new deployment button.

  2. Enter the deployment details as follows:

    • Name: cifar10-classifier

    • Namespace: seldon

    • Type: Seldon Deployment

  3. Configure the default predictor as follows:

    • Runtime: Triton (ONNX, PyTorch, Tensorflow, TensorRT)

    • Model Project: default

    • Model URI: gs://seldon-models/triton/tf_cifar10

    • Storage Secret: (leave blank/none)

    • Model Name: cifar10

Note: The Model Name must match the name declared in the model-settings.json file at the Google Cloud Storage location. Changing the name in the JSON file also requires changing the Model Name, and vice versa.
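
As a quick illustration of this constraint, the following sketch checks a local copy of the file against the name entered in the wizard; the file path is an assumption for demonstration purposes.

```python
# Sketch: verify that the deployment's Model Name matches the "name" field in
# model-settings.json. The local file path is an illustrative assumption.
import json

MODEL_NAME = "cifar10"  # the Model Name entered in the wizard

with open("model-settings.json") as f:  # copy of the file from the GCS location
    settings = json.load(f)

assert settings["name"] == MODEL_NAME, (
    f'model-settings.json declares "{settings["name"]}", expected "{MODEL_NAME}"'
)
```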

  4. Click the Next button until the end and, finally, click Launch.

  5. If your deployment is launched successfully, it will have the Available status.

Get Predictions

We will make a prediction request using an image of a frog from the CIFAR10 dataset. The image is packaged as a JSON file, cifar10-frog-oip.json (85 KB), in the REST format of the Open Inference Protocol (OIP); download it before proceeding.
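
For reference, such an OIP REST payload has roughly the following shape, and it could also be sent directly to the deployment's inference endpoint; the input tensor name and URL below are assumptions that depend on the model's signature and your cluster's ingress.

```python
# Sketch of an Open Inference Protocol (V2) REST request for one CIFAR10 image.
# The input name ("input_1") and the endpoint URL are assumptions; check the
# deployed model's metadata for the actual values.
import numpy as np
import requests

image = np.random.rand(32, 32, 3).astype(np.float32)  # stand-in for the frog image

payload = {
    "inputs": [
        {
            "name": "input_1",
            "shape": [1, 32, 32, 3],
            "datatype": "FP32",
            "data": image.flatten().tolist(),
        }
    ]
}

url = "https://<cluster>/seldon/seldon/cifar10-classifier/v2/models/cifar10/infer"
response = requests.post(url, json=payload)
print(response.json())  # the "outputs" entry carries the class scores
```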

  1. Click on the cifar10-classifier deployment created in the previous section to enter the deployment dashboard.

  2. Inside the deployment dashboard, click on the Predict button.

  3. On the Predict page, click on Browse to select and upload the previously downloaded prediction file.

  4. Click the Predict button; the prediction request should be successful and the response should be shown.

Add an Anchor Image Explainer

  1. From the cifar10-classifier deployment dashboard, click Add inside the Model Explanation card.

  2. For step 1 of the Explainer Configuration Wizard, select the Image model data type and click Next.

  3. For step 2, make sure Anchor is selected, then click Next.

  4. For step 3, enter the following values in the Explainer URI tab:

    • Explainer URI: gs://seldon-models/tfserving/cifar10/cifar10_anchor_image_py3.7_alibi-0.7.0

    • Model Project: default

    • Storage Secret: (leave blank/none)

  5. Click the Next button until the end and, finally, click Launch.

After a short while, the explainer should become available.
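
The Explainer URI points at a pre-fitted Alibi explainer artifact stored in object storage. As a rough sketch of how such an artifact could be produced with the open-source library (the exact layout Enterprise Platform expects may differ), a configured AnchorImage explainer can be saved to a directory and then uploaded:

```python
# Sketch: build an AnchorImage explainer and persist it so the resulting
# directory can be uploaded to object storage and used as an Explainer URI.
# The predictor below is a stand-in; in practice it wraps the deployed model.
import numpy as np
from alibi.explainers import AnchorImage

def predict_fn(x: np.ndarray) -> np.ndarray:
    return np.random.rand(x.shape[0], 10)  # placeholder class probabilities

explainer = AnchorImage(predict_fn, image_shape=(32, 32, 3), segmentation_fn="slic")
explainer.save("cifar10_anchor_image")  # then upload the directory to gs://<bucket>/
```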

Get an Explanation for the Request

Now that the explainer is available, we can make use of it to generate an explanation for the prediction request we made earlier.

  1. Click on the cifar10-classifier deployment created in the previous sections to enter the deployment dashboard.

  2. Navigate to the Requests page using the left navigation drawer, and you will see the request you made earlier along with its predicted response.

  3. Click on the View explanation button to generate an explanation for the request.

  4. After a few seconds, the explanation should be generated and displayed on the page, showing the frog image segments that influenced the prediction the most, the Anchor's Precision and Coverage metrics, and perturbed samples of the request that comply with the prediction; a sketch of reading these values programmatically follows below.
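
If you prefer to consume an explanation programmatically, the serialized Alibi explanation exposes the anchor, precision, and coverage under its data field; the snippet below is a sketch that assumes the explanation JSON has been saved locally.

```python
# Sketch: inspect a saved Alibi Anchor explanation payload. The file name is
# an illustrative assumption; the meta/data structure follows Alibi's
# Explanation serialization format.
import json

with open("explanation.json") as f:  # hypothetical saved explanation response
    explanation = json.load(f)

data = explanation["data"]
print("precision:", data["precision"])  # fraction of perturbed samples keeping the prediction
print("coverage:", data["coverage"])    # how broadly the anchor applies
anchor = data["anchor"]                 # the image masked to the influential segments
```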

Congratulations, you've created an explanation for the request! 🥳

Next Steps

Why not try our other demos? Ready to dive in? Read our operations guide to learn more about how to use Enterprise Platform.
