
03_Global_Feature_Importance

Global Feature Importance Component with AutoML

In this example, the credit scoring data set is partitioned into training and test samples. A black box model (a neural network) is then trained on the pre-processed training data using the AutoML component. The Workflow Object capturing both the pre-processing and the model is provided as one of the inputs to the Global Feature Importance component.
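The KNIME workflow itself is node-based, but the same steps can be sketched in Python for illustration. The following is a hypothetical analogue, assuming scikit-learn; the synthetic data stands in for the credit scoring set, and the pipeline mirrors the Workflow Object that bundles pre-processing with the model.

```python
# Illustrative sketch of the workflow's training steps (the actual workflow
# uses KNIME nodes and the AutoML component; everything here is a stand-in).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for the imbalanced credit data
# (the frequent class plays the role of "creditworthy").
X, y = make_classification(n_samples=2000, n_features=10,
                           weights=[0.9, 0.1], random_state=0)

# Undersample the frequent class to equal class sizes
# (the role of the Equal Size Sampling node).
rng = np.random.default_rng(0)
idx_min = np.where(y == 1)[0]
idx_maj = rng.choice(np.where(y == 0)[0], size=len(idx_min), replace=False)
idx = np.concatenate([idx_min, idx_maj])
X_bal, y_bal = X[idx], y[idx]

# Partition into training and test samples (the Partitioning node).
X_tr, X_te, y_tr, y_te = train_test_split(
    X_bal, y_bal, test_size=0.3, random_state=0, stratify=y_bal)

# Standard pre-processing + black box neural network in one pipeline,
# mirroring the Workflow Object that captures both steps together.
black_box = make_pipeline(StandardScaler(),
                          MLPClassifier(max_iter=500, random_state=0))
black_box.fit(X_tr, y_tr)
print(round(black_box.score(X_te, y_te), 2))
```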

The Global Feature Importance component is then used to inspect the global model behavior using three global surrogate models (Generalized Linear Model, Decision Tree, and Random Forest) and the Permutation Feature Importance explainability technique.
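The two explainability techniques can be sketched outside KNIME as well. The snippet below is a minimal Python illustration (not the component itself), assuming scikit-learn: a decision tree is fit as a global surrogate on the black box's predictions, and permutation importance measures how much the test score drops when each feature is shuffled.

```python
# Illustrative sketch of a global surrogate model and permutation feature
# importance (hypothetical stand-in for the KNIME component's internals).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.inspection import permutation_importance

X, y = make_classification(n_samples=1000, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# The black box whose global behavior we want to interpret.
black_box = MLPClassifier(max_iter=500, random_state=0).fit(X_tr, y_tr)

# Global surrogate: train an interpretable model on the black box's
# predictions; its "fidelity" is how often it agrees with the black box
# on unseen data.
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
surrogate.fit(X_tr, black_box.predict(X_tr))
fidelity = np.mean(surrogate.predict(X_te) == black_box.predict(X_te))

# Permutation feature importance: shuffle one feature at a time and
# record the resulting drop in the black box's test score.
pfi = permutation_importance(black_box, X_te, y_te,
                             n_repeats=10, random_state=0)
ranking = np.argsort(pfi.importances_mean)[::-1]
print(f"surrogate fidelity: {fidelity:.2f}")
print("feature ranking:", ranking.tolist())
```

The same idea extends to the other surrogate families mentioned above (a GLM or a random forest in place of the decision tree); the fidelity score indicates how far the surrogate's explanation can be trusted as a proxy for the black box.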

URL: KNIME Integrated Deployment. https://www.knime.com/integrated-deployment
URL: Molnar, Christoph. "Interpretable Machine Learning: A Guide for Making Black Box Models Explainable", 2019. https://christophm.github.io/interpretable-ml-book/
URL: "Give Me Some Credit" Kaggle data set. https://www.kaggle.com/c/GiveMeSomeCredit/

This workflow demonstrates the usage of the verified Global Feature Importance component, developed to interpret the global behavior of a black box machine learning model, on the example of credit scoring.

Workflow annotations: Credit scoring data (CSV Reader); undersampling of the frequent "creditworthy" class (Equal Size Sampling); Partitioning (top: train set, bottom: test set); standard pre-processing & black-box model training with a neural network (AutoML); Global Feature Importance.
