
01_Global_Feature_Importance_Example

Workflow

Compute and Visualize Global Feature Importance Metrics

This application is a simple example of inspecting global feature importance for binary and multiclass classification with KNIME Software. The core of this example is the Global Feature Importance component, developed and verified by the KNIME team. In this example, the Wine quality data set is partitioned into training and test samples. A black-box model (a neural network) is then trained on the training data, after standard pre-processing, using the AutoML component. The Workflow Object capturing the pre-processing and the model is provided, together with the test data, as input to the Global Feature Importance component. The component computes global feature importance using four techniques: three interpretable Global Surrogate Models (GLM, Decision Tree, and Random Forest) and Permutation Feature Importance.
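The KNIME component encapsulates these steps, but the underlying idea of permutation feature importance can be sketched in plain Python. The snippet below is a minimal, illustrative analogue, not the component's implementation: it assumes scikit-learn's built-in wine recognition data as a stand-in for the Wine quality data set and a standard-scaler plus neural-network pipeline as a stand-in for the Workflow Object produced by AutoML.

```python
# Illustrative sketch of permutation feature importance (not the KNIME component's code).
# Assumptions: scikit-learn's built-in wine data stands in for the Wine quality data set,
# and an MLP pipeline stands in for the AutoML-produced black-box model.
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.inspection import permutation_importance

# 1. Load data and partition into training and test samples
X, y = load_wine(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)

# 2. "Black box": standard pre-processing + neural network captured as one pipeline
#    (analogous to the Workflow Object passed to the component)
black_box = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0),
)
black_box.fit(X_train, y_train)

# 3. Permutation feature importance on the held-out test data:
#    shuffle one column at a time and measure the drop in accuracy
result = permutation_importance(
    black_box, X_test, y_test, n_repeats=20, random_state=0, scoring="accuracy"
)
for name, mean, std in sorted(
    zip(X.columns, result.importances_mean, result.importances_std),
    key=lambda t: -t[1],
):
    print(f"{name:30s} {mean:+.4f} +/- {std:.4f}")
```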

autoML, automated machine learning, guided analytics, integrated deployment, interpretability, global surrogate models, permutation feature importance, surrogate random forest, surrogate GLM, surrogate decision tree, global feature importance, xAI
Workflow overview: CSV Reader (Wine quality historical data) → Partitioning (top: train set, bottom: test set) → AutoML (1. standard pre-processing, 2. training and optimization of a neural network) → Global Feature Importance (inspect global feature importance for the wine quality classification).
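The surrogate-model side of the Global Feature Importance step can be sketched in the same spirit: each interpretable model is trained to mimic the black-box predictions, and its own feature importances (or coefficients) are then read off. The snippet below is a hedged sketch under the same assumptions as above (built-in wine data, an MLP pipeline as the black box); the GLM surrogate is approximated here with a multinomial logistic regression.

```python
# Global surrogate models fitted to the black box's predictions (illustrative sketch,
# not the KNIME component's implementation; same stand-in data and model as above).
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

X, y = load_wine(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)

black_box = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0),
).fit(X_train, y_train)

# The surrogates learn to reproduce the black-box predictions on the test data
y_bb = black_box.predict(X_test)

surrogates = {
    "GLM (multinomial logistic regression)": make_pipeline(
        StandardScaler(), LogisticRegression(max_iter=1000)
    ),
    "Decision Tree": DecisionTreeClassifier(max_depth=4, random_state=0),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=0),
}

for name, surrogate in surrogates.items():
    surrogate.fit(X_test, y_bb)
    # Fidelity: how well the surrogate mimics the black box on the same data
    fidelity = accuracy_score(y_bb, surrogate.predict(X_test))
    print(f"\n{name}  (fidelity to black box: {fidelity:.3f})")
    if hasattr(surrogate, "feature_importances_"):
        importances = surrogate.feature_importances_
    else:
        # For the GLM surrogate, use the mean absolute coefficient per feature
        importances = np.mean(np.abs(surrogate[-1].coef_), axis=0)
    for feat, imp in sorted(zip(X.columns, importances), key=lambda t: -t[1])[:5]:
        print(f"  {feat:30s} {imp:.4f}")
```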

Download

Get this workflow from the following link: Download

Resources

Nodes

01_Global_Feature_Importance_Example consists of the following 1781 node(s):

Plugins

01_Global_Feature_Importance_Example contains nodes provided by the following 16 plugin(s):