
01_Compute_LIMEs

Workflow

Compute Local Model-agnostic Explanations (LIMEs)
Tags: LIME, machine learning interpretability, mli, explanation, instance-level explanation, reason code, explain, interpret, model, machine learning, XGB, XGBoost
Data Source

This KNIME workflow uses data from P. Cortez, University of Minho, Guimarães, Portugal (www3.dsi.uminho.pt/pcortez) and A. Cerdeira, F. Almeida, T. Matos and J. Reis, Viticulture Commission of the Vinho Verde Region (CVRVV), Porto, Portugal, 2009. Available at archive.ics.uci.edu/ml/datasets/wine+quality

Local Interpretable Model-agnostic Explanation (LIME)

This workflow is an example of computing explanations with LIME. More info at: homes.cs.washington.edu/~marcotcr/blog/lime/. An XGBoost model is used here, but any model together with its Learner and Predictor nodes can be substituted. The workflow is organized into three parts: Data Preparation, Compute LIME, and Explanations Post-processing.

- Read the dataset about red wines (File Reader)
- Partition the data into a train set (top output, 90%) and a test set (bottom output, 10%) (Partitioning)
- Train the model (XGBoost Tree Ensemble Learner)
- Pick a few test set instance rows to explain, e.g. a few wines with high sulphates and a few with low sulphates (Select Instance Rows)
- Create local samples for each instance in the input table (LIME Loop Start); the top input takes the test set instance rows to be explained, the bottom input the test set distribution
- Score the samples using the Predictor node and the trained model (XGBoost Predictor)
- Compute the LIMEs, i.e. local model-agnostic explanations, by training a local GLM on the samples for each input instance and extracting its weights; the Loop End node collects the explanations (see the sketch after this list)
- Visualize the explanations in the Composite View (Right Click > Open View) (Visualize Explanations)
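The sketch below illustrates, outside KNIME, the same steps the Compute LIME part performs: sample around a test instance, score the samples with the trained black-box model, and fit a distance-weighted local GLM whose coefficients are the explanation. It assumes the UCI red-wine CSV layout (semicolon-separated, "quality" target) and uses scikit-learn and xgboost in place of the KNIME nodes; the file name, `explain_instance` helper, and kernel width are illustrative assumptions, not part of the workflow or any LIME API.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor

# Read the red-wine data (assumed UCI layout: semicolon-separated, "quality" target).
df = pd.read_csv("winequality-red.csv", sep=";")
X = df.drop(columns="quality").values
y = df["quality"].values

# Partition 90% train / 10% test, as in the workflow.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.1, random_state=0)

# Train the black-box model (XGBoost here, but any model works).
model = XGBRegressor(n_estimators=200).fit(X_train, y_train)

def explain_instance(x, model, X_ref, n_samples=1000, kernel_width=0.75):
    """Sample around x, score with the black-box model, and fit a
    distance-weighted local GLM; its coefficients are the explanation."""
    scale = X_ref.std(axis=0)
    # Local samples: Gaussian perturbations scaled to the feature spread.
    samples = x + np.random.normal(size=(n_samples, x.size)) * scale
    preds = model.predict(samples)                      # score the samples
    # Exponential kernel: nearby samples get larger weights.
    dist = np.sqrt((((samples - x) / scale) ** 2).sum(axis=1))
    weights = np.exp(-(dist ** 2) / (kernel_width ** 2))
    surrogate = Ridge(alpha=1.0).fit(samples, preds, sample_weight=weights)
    return surrogate.coef_                              # local feature weights

# Explain one test instance and print its feature weights.
for name, w in zip(df.columns[:-1], explain_instance(X_test[0], model, X_train)):
    print(f"{name:>22s}: {w:+.3f}")
```

In the workflow, the LIME Loop Start and Loop End nodes take care of the sampling and the collection of explanations per instance, and the composite view replaces the final print loop.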

Download

Get this workflow from the following link: Download

Nodes

01_Compute_LIMEs consists of the following 107 nodes:

Plugins

01_Compute_LIMEs contains nodes provided by the following 12 plugins: