Node Connectivity

There are 2898 nodes that can be used as successors for a node with an output port of type Table.

NBTree (3.6) (legacy) 

Class for generating a decision tree with naive Bayes classifiers at the leaves. For more information, see Ron Kohavi: Scaling Up the Accuracy of […]

REPTree (3.6) (legacy) 

Fast decision tree learner. Builds a decision/regression tree using information gain/variance and prunes it using reduced-error pruning (with backfitting). […]
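
As a rough illustration of the learner this legacy node wraps, the sketch below builds a REPTree directly through the Weka Java API and estimates its accuracy by cross-validation. This is a minimal sketch assuming the Weka 3.6 API; the file name "training.arff" and all parameter values are placeholders.

    import java.util.Random;
    import weka.classifiers.Evaluation;
    import weka.classifiers.trees.REPTree;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class REPTreeSketch {
        public static void main(String[] args) throws Exception {
            // Load a data set; "training.arff" is a placeholder file name.
            Instances data = new DataSource("training.arff").getDataSet();
            data.setClassIndex(data.numAttributes() - 1); // last attribute is the class

            // Fast decision/regression tree with reduced-error pruning.
            REPTree tree = new REPTree();
            tree.setNumFolds(3);  // one of these folds is held out as the pruning set
            tree.setMinNum(2.0);  // minimum number of instances per leaf

            // Estimate accuracy with 10-fold cross-validation.
            Evaluation eval = new Evaluation(data);
            eval.crossValidateModel(tree, data, 10, new Random(1));
            System.out.println(eval.toSummaryString());
        }
    }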

RandomForest (3.6) (legacy) 

Class for constructing a forest of random trees. For more information see: Leo Breiman (2001). Random Forests. Machine Learning. 45(1):5-32.
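
A minimal sketch of training the corresponding Weka learner directly, assuming the Weka 3.6 API (weka.classifiers.trees.RandomForest with setNumTrees/setNumFeatures); the file name and settings are illustrative only.

    import weka.classifiers.trees.RandomForest;
    import weka.core.Instance;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class RandomForestSketch {
        public static void main(String[] args) throws Exception {
            Instances data = new DataSource("training.arff").getDataSet(); // placeholder file
            data.setClassIndex(data.numAttributes() - 1);

            RandomForest forest = new RandomForest();
            forest.setNumTrees(100);  // number of random trees in the ensemble (3.6 API)
            forest.setNumFeatures(0); // 0 = default of roughly log2(#attributes) + 1 per split
            forest.setSeed(1);
            forest.buildClassifier(data);

            // Predict the class of the first instance as a quick smoke test.
            Instance first = data.instance(0);
            double label = forest.classifyInstance(first);
            System.out.println("Predicted: " + data.classAttribute().value((int) label));
        }
    }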

RandomTree (3.6) (legacy) 

Class for constructing a tree that considers K randomly chosen attributes at each node. Performs no pruning. Also has an option to allow estimation of […]
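
The K mentioned above corresponds to the learner's KValue option. A brief sketch, again assuming the Weka 3.6 API and placeholder data:

    import weka.classifiers.trees.RandomTree;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class RandomTreeSketch {
        public static void main(String[] args) throws Exception {
            Instances data = new DataSource("training.arff").getDataSet(); // placeholder file
            data.setClassIndex(data.numAttributes() - 1);

            RandomTree tree = new RandomTree();
            tree.setKValue(3);          // K randomly chosen attributes considered at each node
            tree.setSeed(42);           // seed for the random attribute selection
            tree.buildClassifier(data); // no pruning is performed

            System.out.println(tree);   // prints the grown tree
        }
    }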

SimpleCart (3.6) (legacy) 

Class implementing minimal cost-complexity pruning. Note that when dealing with missing values, the "fractional instances" method is used instead of the surrogate split […]

UserClassifier (3.6) (legacy) 

Interactively classify through visual means. You are presented with a scatter graph of the data against two user-selectable attributes, as well as a view of […]

ConjunctiveRule (3.6) (legacy) 

This class implements a single conjunctive rule learner that can predict for numeric and nominal class labels. A rule consists of antecedents "AND"ed […]

DTNB (3.6) (legacy) 

Class for building and using a decision table/naive Bayes hybrid classifier. At each point in the search, the algorithm evaluates the merit of dividing the […]

DecisionTable (3.6) (legacy) 

Class for building and using a simple decision table majority classifier. For more information see: Ron Kohavi: The Power of Decision Tables. In: 8th […]
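
A minimal sketch of building a decision table majority classifier via the assumed Weka 3.6 API (weka.classifiers.rules.DecisionTable); the cross-validation setting and file name are illustrative.

    import weka.classifiers.rules.DecisionTable;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class DecisionTableSketch {
        public static void main(String[] args) throws Exception {
            Instances data = new DataSource("training.arff").getDataSet(); // placeholder file
            data.setClassIndex(data.numAttributes() - 1);

            DecisionTable table = new DecisionTable();
            table.setCrossVal(1);   // 1 = leave-one-out evaluation of candidate feature subsets
            table.setUseIBk(false); // fall back to the global majority class for unmatched instances
            table.buildClassifier(data);

            System.out.println(table); // prints the selected attributes and table size
        }
    }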

JRip (3.6) (legacy) 

This class implements a propositional rule learner, Repeated Incremental Pruning to Produce Error Reduction (RIPPER), which was proposed by William W. Cohen […]
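
Likewise, a hedged sketch of running RIPPER through the assumed Weka 3.6 API (weka.classifiers.rules.JRip) and inspecting the learned rule set; the data file and option values are placeholders.

    import java.util.Random;
    import weka.classifiers.Evaluation;
    import weka.classifiers.rules.JRip;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class JRipSketch {
        public static void main(String[] args) throws Exception {
            Instances data = new DataSource("training.arff").getDataSet(); // placeholder file
            data.setClassIndex(data.numAttributes() - 1);

            JRip ripper = new JRip();
            ripper.setFolds(3);         // one of these folds is held out for rule pruning
            ripper.setOptimizations(2); // number of optimization runs over the rule set
            ripper.buildClassifier(data);
            System.out.println(ripper); // prints the learned rules

            Evaluation eval = new Evaluation(data);
            eval.crossValidateModel(ripper, data, 10, new Random(1));
            System.out.println(eval.toSummaryString());
        }
    }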