
01_Compute_LIMEs

Compute Local Model-agnostic Explanations (LIMEs)

This is an example of computing explanations using LIME.
An XGBoost model was picked, but any model and its corresponding Learner and Predictor nodes can be used.
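
For a rough, non-KNIME point of reference, the data preparation and model training could be sketched in Python as below. This is an illustrative sketch only: it assumes the standard UCI download path and the pandas / scikit-learn / xgboost packages, and it binarizes the quality score purely for illustration, which is not necessarily how the workflow configures the Learner node.

    import pandas as pd
    from sklearn.model_selection import train_test_split
    from xgboost import XGBClassifier

    # Red wine quality data; "quality" is the target column.
    # (URL is the usual UCI download path; adjust if it has moved.)
    wine = pd.read_csv(
        "http://archive.ics.uci.edu/ml/machine-learning-databases/wine-quality/winequality-red.csv",
        sep=";",
    )
    X = wine.drop(columns="quality")
    # Binarize the quality score (good vs. not good) for illustration;
    # the workflow may instead keep it as a multi-class target.
    y = (wine["quality"] >= 6).astype(int)

    # 90% train / 10% test, mirroring the Partitioning node.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.1, random_state=42
    )

    # Stand-in for the XGBoost Tree Ensemble Learner node.
    model = XGBClassifier(n_estimators=100)
    model.fit(X_train, y_train)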

- Read the wine quality dataset
- Partition the data into train and test sets
- Pick a few test set rows (instances) to explain
- Create local samples for each instance in the input table (LIME Loop Start)
- Score the samples using the Predictor node and the trained model
- Compute LIMEs, i.e. local model-agnostic explanations, by training a local GLM on the samples and extracting its weights (a Python sketch follows this list)
- Visualize them in the Composite View (Right Click > Open View)
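
Below is a minimal, hand-rolled Python sketch of what these LIME steps amount to for a single row: perturb the instance, score the perturbed samples with the black-box model, fit a proximity-weighted linear model (GLM), and read off its coefficients as the explanation. The sampling scheme, kernel, and variable names are illustrative assumptions, not the exact implementation of the KNIME LIME nodes; the sketch reuses the `model`, `X_train`, and `X_test` objects from the previous snippet.

    import numpy as np
    from sklearn.linear_model import Ridge

    def explain_instance(model, X_train, instance, n_samples=1000, kernel_width=0.75):
        """Return one coefficient per feature as a local explanation for `instance`."""
        rng = np.random.default_rng(0)
        sigma = X_train.std(axis=0).values

        # 1) Create local samples around the instance (LIME Loop Start):
        #    Gaussian perturbations scaled to the training-set spread.
        samples = instance.values + rng.normal(0.0, 1.0, (n_samples, len(instance))) * sigma

        # 2) Score the samples with the black-box model (XGBoost Predictor).
        preds = model.predict_proba(samples)[:, 1]

        # 3) Weight each sample by its proximity to the instance
        #    (exponential kernel; the exact kernel is an illustrative choice).
        dist = np.linalg.norm((samples - instance.values) / sigma, axis=1)
        weights = np.exp(-(dist ** 2) / (kernel_width * len(instance)))

        # 4) Fit a local, interpretable GLM on the weighted samples and read
        #    its coefficients as the explanation (Compute LIME / Loop End).
        glm = Ridge(alpha=1.0).fit(samples, preds, sample_weight=weights)
        return dict(zip(X_train.columns, glm.coef_))

    # Explain the first selected test row using the model trained above.
    explanation = explain_instance(model, X_train, X_test.iloc[0])
    print(explanation)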

URL: LIME - Local Interpretable Model-Agnostic Explanations, Marco Tulio Ribeiro, Blog Post https://homes.cs.washington.edu/~marcotcr/blog/lime/
URL: Data Source: UCI - Wine Quality Data Set http://archive.ics.uci.edu/ml/datasets/wine+quality

Workflow overview (canvas annotations and nodes):

- File Reader: red wine data
- Data Preparation / Partitioning: top output is the 90% train set, bottom output the 10% test set
- XGBoost Tree Ensemble Learner: train the model
- Select Instance Rows: a few wines with high sulphates + a few wines with low sulphates
- LIME Loop Start: top input is the test set instance rows to be explained, bottom input the test set distribution
- XGBoost Predictor: score the samples
- Compute LIME: train a local GLM for each input instance to generate a local interpretable model-agnostic explanation
- Loop End: collect the explanations
- Explanations Post-processing / Visualize Explanations: visualize the results
