  • Overview
    • Introduction
    • Getting Started
    • Algorithm Overview
    • White-box and black-box models
    • Saving and loading
    • Frequently Asked Questions
  • Explanations
    • Methods
    • Examples
      • Alibi Overview Examples
      • Accumulated Local Effects
      • Anchors
      • Contrastive Explanation Method
      • Counterfactual Instances on MNIST
      • Counterfactuals Guided by Prototypes
      • Counterfactuals with Reinforcement Learning
      • Integrated Gradients
      • Kernel SHAP
      • Partial Dependence
      • Partial Dependence Variance
      • Permutation Importance
        • Permutation Feature Importance on “Who’s Going to Leave Next?”
      • Similarity explanations
      • Tree SHAP
  • Model Confidence
    • Methods
    • Examples
  • Prototypes
    • Methods
    • Examples
  • API Reference
    • alibi.api
    • alibi.confidence
    • alibi.datasets
    • alibi.exceptions
    • alibi.explainers
    • alibi.models
    • alibi.prototypes
    • alibi.saving
    • alibi.utils
    • alibi.version

Permutation Importance

Permutation Feature Importance on “Who’s Going to Leave Next?”
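For context before the worked example: permutation importance scores a feature by how much the model's score drops when that feature's column is randomly shuffled, breaking its relationship with the target. The sketch below is a minimal, library-free illustration of the idea, not Alibi's `PermutationImportance` API; the function and variable names are our own.

```python
import numpy as np

def permutation_importance(predict, X, y, score, n_repeats=5, seed=0):
    """Score each feature by the mean drop in `score` after randomly
    permuting that feature's column (higher drop = more important)."""
    rng = np.random.default_rng(seed)
    baseline = score(y, predict(X))
    importances = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        drops = []
        for _ in range(n_repeats):
            Xp = X.copy()
            Xp[:, j] = rng.permutation(Xp[:, j])  # break feature-target link
            drops.append(baseline - score(y, predict(Xp)))
        importances[j] = np.mean(drops)
    return importances

# Toy setup: the target depends only on feature 0, so permuting it
# should destroy the score while features 1 and 2 contribute nothing.
X = np.random.default_rng(1).normal(size=(200, 3))
y = 2.0 * X[:, 0]
predict = lambda X: 2.0 * X[:, 0]
r2 = lambda y, p: 1.0 - np.sum((y - p) ** 2) / np.sum((y - np.mean(y)) ** 2)
imp = permutation_importance(predict, X, y, r2)
```

Here `imp[0]` is large because permuting feature 0 collapses the R² score, while `imp[1]` and `imp[2]` are zero since the toy model ignores them.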

Last updated 5 months ago
