Image Explanations
Understanding how complex models make predictions is essential for promoting transparency, building trust, and identifying potential biases. Model explainers offer insight into how features influence outcomes, supporting debugging and model refinement.
We are going to demonstrate model explanations using Alibi Explain's Anchor Images method, noting the segments of an input image that influenced the prediction the most, as well as observing the anchor's precision and coverage metrics.
In this demo we will:
Launch an image classification deployment
Send prediction requests to the deployment
Create an explainer for the deployment
Generate explanations for previously sent prediction requests
The model used in this demo was trained to classify images based on the CIFAR10 dataset.
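To give some context on what the anchor, precision, and coverage outputs mentioned above represent, the sketch below uses Alibi Explain's AnchorImage explainer directly in Python against a CIFAR10 classifier. It is a minimal, illustrative sketch only: the local model file name, the [0, 1] input scaling, and the segmentation parameters are assumptions, and the deployment in this demo manages the explainer for you.

```python
import numpy as np
from tensorflow import keras
from alibi.explainers import AnchorImage

# Illustrative only: assumes a locally available CIFAR10 Keras classifier with a
# softmax output over the 10 CIFAR10 classes. The file name is hypothetical.
model = keras.models.load_model("cifar10_classifier.h5")

def predict_fn(x: np.ndarray) -> np.ndarray:
    # AnchorImage calls this on batches of perturbed images.
    return model.predict(x)

# AnchorImage segments the image (here with SLIC) and searches for the set of
# segments that, when kept, preserves the original prediction with high precision.
explainer = AnchorImage(
    predict_fn,
    image_shape=(32, 32, 3),
    segmentation_fn="slic",
    segmentation_kwargs={"n_segments": 10, "compactness": 20, "sigma": 0.5},
)

(_, _), (x_test, _) = keras.datasets.cifar10.load_data()
image = x_test[0].astype("float32") / 255.0  # assumes the model expects [0, 1] inputs

explanation = explainer.explain(image, threshold=0.95)
print(explanation.anchor.shape)  # image showing only the anchoring segments
print(explanation.precision)     # fraction of perturbations that keep the same prediction
print(explanation.coverage)      # fraction of the perturbation space the anchor applies to
```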
Create a Seldon Deployment
Click on the Create new deployment button. Enter the deployment details as follows:
Name: cifar10-classifier
Namespace: seldon
Type: Seldon Deployment

Configure the default predictor as follows:
Runtime: Triton (ONNX, PyTorch, Tensorflow, TensorRT)
Model Project: default
Model URI: gs://seldon-models/triton/tf_cifar10
Storage Secret: (leave blank/none)
Model Name: cifar10
The Model Name corresponds to the name defined in the model-settings.json file stored at the Model URI in Google Cloud Storage. If you change the name in the JSON file, you must also change the Model Name, and vice versa.

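Once the deployment is Available, you can optionally confirm that the model is being served under the expected name by querying the Open Inference Protocol metadata and readiness endpoints. This is a minimal sketch; the ingress host and path below are assumptions and depend on how your cluster, namespace, and deployment are exposed.

```python
import requests

# Assumed ingress base path for the deployment; adjust the host, namespace and
# deployment name to match your environment.
BASE = "http://<INGRESS_HOST>/seldon/seldon/cifar10-classifier"

# OIP/V2 readiness and metadata endpoints for the "cifar10" model.
ready = requests.get(f"{BASE}/v2/models/cifar10/ready")
print("ready:", ready.status_code == 200)

meta = requests.get(f"{BASE}/v2/models/cifar10")
print(meta.json())  # should report the model name defined in model-settings.json
```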
Click the Next button until the end and, finally, click Launch. If your deployment is launched successfully, it will have Available status.
Get Predictions
We will make a prediction request using the image of a frog from the CIFAR10 dataset. The image is provided as a JSON file in the REST format of the Open Inference Protocol (OIP).
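If you would rather build such a prediction file yourself than download it, the sketch below constructs an OIP REST payload from a CIFAR10 frog image. The input tensor name ("input_1"), shape, and [0, 1] scaling are assumptions based on common CIFAR10 TensorFlow models; check the model's metadata endpoint for the exact values your model expects.

```python
import json
import numpy as np
from tensorflow import keras

# Load CIFAR10 and pick a frog image (class index 6 in CIFAR10).
(_, _), (x_test, y_test) = keras.datasets.cifar10.load_data()
frog = x_test[y_test.flatten() == 6][0].astype("float32") / 255.0  # assumes [0, 1] inputs

# Open Inference Protocol (V2) REST payload. The tensor name "input_1" is an
# assumption; use the name reported by the model metadata if it differs.
payload = {
    "inputs": [
        {
            "name": "input_1",
            "shape": [1, 32, 32, 3],
            "datatype": "FP32",
            "data": frog.reshape(1, 32, 32, 3).tolist(),
        }
    ]
}

with open("cifar10_frog.json", "w") as f:
    json.dump(payload, f)
```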
Click on the cifar10-classifier deployment created in the previous section to enter the deployment dashboard.
Inside the deployment dashboard, click on the Predict button.
On the Predict page, click on Browse to select and upload the previously downloaded prediction file.
Click the Predict button. The prediction request should succeed and the response should be shown.

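As an alternative to the UI, the same request file can be sent directly to the model's OIP inference endpoint. This sketch reuses the assumed ingress base path from the readiness example above; adjust it to your environment.

```python
import json
import requests

# Assumed ingress base path for the deployment (see the readiness sketch above).
BASE = "http://<INGRESS_HOST>/seldon/seldon/cifar10-classifier"

with open("cifar10_frog.json") as f:
    payload = json.load(f)

# OIP/V2 REST inference endpoint for the "cifar10" model.
resp = requests.post(f"{BASE}/v2/models/cifar10/infer", json=payload)
print(resp.json()["outputs"][0]["data"][:10])  # class scores for the frog image
```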
Add an Anchor Image Explainer
From the cifar10-classifier deployment dashboard, click Add inside the Model Explanation card.
For step 1 of the Explainer Configuration Wizard, select the Image model data type and click Next.

For step 2, make sure Anchor is selected, then click Next.
For step 3, enter the following values in the Explainer URI tab:
Explainer URI: gs://seldon-models/tfserving/cifar10/cifar10_anchor_image_py3.7_alibi-0.7.0
Model Project: default
Storage Secret: (leave blank/none)

Click the Next button until the end and, finally, click Launch.
After a short while, the explainer should become available.
Get an explanation for the request
Now that the explainer is available, we can make use of it to generate an explanation for the prediction request we made earlier.
Click on the cifar10-classifier deployment created in the previous sections to enter the deployment dashboard.
Navigate to the Requests page using the left navigation drawer, and you will see the request you made earlier.
Click on the View explanation button to generate an explanation for the request.

After a few seconds, the explanation should be generated and displayed on the page.
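Explanations can also be requested programmatically by sending the same OIP payload to the deployment's explainer endpoint. The sketch below is illustrative only: the explainer URL is an assumption (it depends on your ingress and on how Enterprise Platform routes explainer traffic), and the response fields shown follow Alibi's explanation JSON, where the anchor's precision and coverage live under "data".

```python
import json
import requests

# Hypothetical explainer endpoint; the exact path depends on your ingress and on
# how the platform exposes the explainer component for this deployment.
EXPLAIN_URL = (
    "http://<INGRESS_HOST>/seldon/seldon/cifar10-classifier-explainer"
    "/v2/models/cifar10/explain"
)

with open("cifar10_frog.json") as f:
    payload = json.load(f)

resp = requests.post(EXPLAIN_URL, json=payload)
explanation = resp.json()

# Alibi explanations expose the anchor's quality metrics under "data".
print("precision:", explanation["data"]["precision"])
print("coverage:", explanation["data"]["coverage"])
```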



Congratulations, you've created an explanation for the request! 🥳
Next Steps
Why not try our other demos? Ready to dive in? Read our operations guide to learn more about how to use Enterprise Platform.