02_Snowflake_Model_Learning

Training and Testing a Customer Churn Predictor with KNIME and Snowflake
This workflow is an example of how to train a machine learning model in KNIME with data sampled from Snowflake. The model is then applied at scale to the rest of the data in Snowflake to evaluate its reliability. Example data provided from Kaggle: https://www.kaggle.com/becksddf/churn-in-telecoms-dataset

1. Data Preparation. Sample and filter data within Snowflake before loading it into a KNIME table. If you haven't executed the 01b_Snowflake_ETL workflow before, you can use the upper branch to create the "calls_contracts" table in Snowflake with example data.

2. Model Learning. Learns a Random Forest model and converts it into a transportable MOJO model.

3. Model Evaluation and Storing. Sends the MOJO model to Snowflake and predicts the remaining data in the Snowflake data table. The predicted and filtered result is read into KNIME to evaluate the model accuracy.

Workflow annotations:
- Change connection information to your Snowflake account
- Select existing table
- Read data into KNIME table
- Stratified sampling on churn
- Remove unwanted columns, e.g. phone
- Apply H2O model at scale in Snowflake
- Read predicted data into KNIME table
- Keep only churn and predicted columns
- View score
- Save model for later usage
- calls_contracts: create DB table with data

Nodes used: Snowflake Connector, DB Table Selector, DB Reader, DB Row Sampling, H2O Local Context, Table to H2O, H2O Random Forest Learner, DB Column Filter, H2O Model to MOJO, Snowflake H2O MOJO Predictor (Classification), DB Reader, DB Column Filter, Scorer (JavaScript), H2O MOJO Writer, Table Reader, DB Writer
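Two steps of the workflow are easy to express outside of KNIME: the stratified sampling on the churn column (DB Row Sampling) and the accuracy computation over the churn and predicted columns (Scorer). The sketch below illustrates both patterns in plain Python; the column names `churn` and `predicted` match the workflow, while the toy data and function names are assumptions for illustration only.

```python
import random
from collections import defaultdict

def stratified_sample(rows, label_key, fraction, seed=42):
    """Draw the same fraction of rows from each class of `label_key`,
    mirroring stratified sampling on the 'churn' column."""
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for row in rows:
        by_class[row[label_key]].append(row)
    sample = []
    for members in by_class.values():
        k = max(1, round(len(members) * fraction))
        sample.extend(rng.sample(members, k))
    return sample

def accuracy(rows, actual_key, predicted_key):
    """Fraction of rows where the prediction matches the actual label,
    i.e. the accuracy reported by the Scorer node."""
    hits = sum(1 for r in rows if r[actual_key] == r[predicted_key])
    return hits / len(rows)

# Hypothetical example data: 10% churners, predictions copied from the labels
data = [{"churn": c, "predicted": c} for c in [True] * 10 + [False] * 90]
train = stratified_sample(data, "churn", fraction=0.3)
print(len(train))                             # 30 rows, same churn ratio as the full table
print(accuracy(data, "churn", "predicted"))   # 1.0 on this toy data
```

In the actual workflow the sampling and scoring both run inside Snowflake via the DB nodes, so only the small training sample and the final score table ever reach KNIME.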
