In the example, the Credit Scoring data set is partitioned into training and test samples. Then the black box model (a Neural Network) is trained on the pre-processed training data using the AutoML component. The Workflow Object, which captures both the pre-processing and the model, is provided as one of the inputs to the Global Feature Importance component.
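The first stage can be sketched in scikit-learn terms (an assumption for illustration; the original workflow uses KNIME's AutoML component and Integrated Deployment, not Python). A pipeline plays the role of the Workflow Object, bundling pre-processing and the model into one reusable object:

```python
# Sketch of the first stage: partition the data, then train a neural
# network on pre-processed data. scikit-learn stands in for KNIME here.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for the Credit Scoring data set.
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)

# Partition into training and test samples.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# Pre-processing and the black box model are captured together,
# analogous to the Workflow Object produced by Integrated Deployment.
model = make_pipeline(
    SimpleImputer(),          # handle missing values
    StandardScaler(),         # normalize features
    MLPClassifier(max_iter=500, random_state=0),
)
model.fit(X_train, y_train)
print(model.score(X_test, y_test))  # accuracy on the held-out test sample
```

Because the pipeline object holds both the pre-processing and the model, it can be passed as a single unit to any downstream explainability step, mirroring how the Workflow Object is passed to the Global Feature Importance component.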
The Global Feature Importance component is then used to inspect the model's global behavior using three Global Surrogate models (Generalized Linear Model, Decision Tree, and Random Forest) and the Permutation Feature Importance explainability technique.
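The two techniques named above can be sketched as follows (again assuming a scikit-learn black box for illustration; the KNIME component automates both). A global surrogate is an interpretable model fit to the black box's *predictions*, while permutation importance measures how much the score drops when each feature is shuffled:

```python
# Sketch of a Global Surrogate model and Permutation Feature Importance,
# applied to a neural-network black box (illustrative assumption).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.inspection import permutation_importance

X, y = make_classification(n_samples=1000, n_features=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
black_box = MLPClassifier(max_iter=500, random_state=0).fit(X_train, y_train)

# Global surrogate: an interpretable Decision Tree is trained to mimic
# the black box by fitting it to the black box's predictions.
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
surrogate.fit(X_test, black_box.predict(X_test))

# Permutation feature importance: shuffle one feature at a time and
# record the resulting drop in the black box's test score.
result = permutation_importance(
    black_box, X_test, y_test, n_repeats=5, random_state=0
)
print(result.importances_mean)  # one mean importance value per feature
```

The same pattern extends to the other surrogates mentioned above: a Generalized Linear Model or Random Forest would simply replace the Decision Tree as the interpretable model fit to the black box's predictions.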
URL: KNIME Integrated Deployment - KNIME.com https://www.knime.com/integrated-deployment
URL: Molnar, Christoph. "Interpretable machine learning. A Guide for Making Black Box Models Explainable", 2019. https://christophm.github.io/interpretable-ml-book/
URL: Give Me Some Credit - Kaggle data set https://www.kaggle.com/c/GiveMeSomeCredit/
To use this workflow, download it from the URL below and open it in KNIME:
Download Workflow