
Feature Selection
Exercise: Feature Selection

Let's take a look at which features are the most important in predicting the rank of a house. Below you see the workflow that we built earlier in the course.

1) Use a Concatenate node to unify the training and test sets previously split during preprocessing.
2) Apply a forward feature selection algorithm to the trained model (Forward Feature Selection metanode):
- Replace the Learner and Predictor nodes with the Logistic Regression Learner and Logistic Regression Predictor nodes in the workflow.
- Exclude the "rank" column in the Feature Selection Loop Start node.
- Select stratified sampling on the "rank" column in the Partitioning node.
- Select Cohen's kappa as the metric to maximize in the Feature Selection Loop End node.
- Take a look at the Feature Selection Filter node. Which columns can you leave out and still obtain the maximum Cohen's kappa?

(Workflow annotations: "Start feature selection loop here" · "Read AmesHousing.csv" · Preprocessing · CSV Reader)
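The loop above can be sketched outside of KNIME as well. The following is a minimal, hypothetical scikit-learn equivalent: a logistic regression learner wrapped in forward feature selection, with Cohen's kappa as the metric to maximize and stratified cross-validation standing in for the stratified Partitioning node. The synthetic data is a stand-in for the Ames housing table, not the actual dataset.

```python
# Sketch only: mirrors the KNIME forward-feature-selection loop with
# scikit-learn. Synthetic data replaces AmesHousing.csv.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import cohen_kappa_score, make_scorer
from sklearn.model_selection import StratifiedKFold

# Stand-in for the preprocessed Ames data: 200 houses, 8 numeric
# features, 3 "rank" classes.
X, y = make_classification(n_samples=200, n_features=8, n_informative=4,
                           n_classes=3, random_state=0)

kappa = make_scorer(cohen_kappa_score)   # metric to maximize (Loop End node)
cv = StratifiedKFold(n_splits=5)         # stratified sampling on "rank"

selector = SequentialFeatureSelector(
    LogisticRegression(max_iter=1000),   # Logistic Regression Learner/Predictor
    n_features_to_select=4,              # illustrative stopping point
    direction="forward",                 # forward feature selection
    scoring=kappa,
    cv=cv,
)
selector.fit(X, y)

# Analogue of the Feature Selection Filter node: keep only selected columns.
print("Selected feature indices:", np.flatnonzero(selector.get_support()))
```

Inspecting which indices survive the forward search answers the same question as the Feature Selection Filter node: which columns can be dropped without lowering the maximum Cohen's kappa.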
